Jan 20 18:30:06 crc systemd[1]: Starting Kubernetes Kubelet...
Jan 20 18:30:06 crc restorecon[4746]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Jan 20 18:30:06 crc restorecon[4746]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Jan 20 18:30:06 crc restorecon[4746]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc 
restorecon[4746]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 20 18:30:06 crc restorecon[4746]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 20 18:30:06 crc restorecon[4746]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 20 18:30:06 crc restorecon[4746]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 20 18:30:06 crc 
restorecon[4746]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 20 
18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 20 18:30:06 crc restorecon[4746]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 20 18:30:06 crc 
restorecon[4746]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Jan 20 18:30:06 crc restorecon[4746]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 20 18:30:06 crc restorecon[4746]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 20 18:30:06 crc restorecon[4746]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 20 18:30:06 crc 
restorecon[4746]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 20 18:30:06 crc restorecon[4746]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 
crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Jan 20 18:30:06 crc restorecon[4746]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 20 18:30:06 crc restorecon[4746]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by
admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc 
restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 20 18:30:06 crc restorecon[4746]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Jan 20 18:30:06 crc restorecon[4746]:
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 
20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 
crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc 
restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 20 18:30:06 crc restorecon[4746]:
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:06 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:07 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:07 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:07 crc restorecon[4746]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 20 18:30:07 crc restorecon[4746]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 20 18:30:07 crc restorecon[4746]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 20 18:30:07 crc restorecon[4746]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Jan 20 18:30:07 crc restorecon[4746]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 20 18:30:07 crc restorecon[4746]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 20 18:30:07 crc restorecon[4746]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Jan 20 18:30:07 crc restorecon[4746]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 20 18:30:07 crc restorecon[4746]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 20 18:30:07 crc restorecon[4746]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Jan 20 18:30:07 crc restorecon[4746]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Jan 20 18:30:07 crc restorecon[4746]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 20 18:30:07 crc restorecon[4746]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Jan 20 18:30:07 crc restorecon[4746]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 20 18:30:07 crc restorecon[4746]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 20 18:30:07 crc restorecon[4746]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 20 18:30:07 crc restorecon[4746]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 20 18:30:07 crc restorecon[4746]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 20 18:30:07 crc restorecon[4746]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 20 18:30:07 crc restorecon[4746]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 20 18:30:07 crc restorecon[4746]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 20 18:30:07 crc restorecon[4746]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 20 18:30:07 crc restorecon[4746]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 20 18:30:07 crc restorecon[4746]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 20 18:30:07 crc restorecon[4746]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 20 18:30:07 crc restorecon[4746]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 20 18:30:07 crc 
restorecon[4746]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 20 18:30:07 crc restorecon[4746]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 20 18:30:07 crc restorecon[4746]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 20 18:30:07 crc restorecon[4746]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 20 18:30:07 crc restorecon[4746]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 20 18:30:07 crc restorecon[4746]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 20 18:30:07 crc restorecon[4746]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 20 18:30:07 crc restorecon[4746]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 20 18:30:07 crc restorecon[4746]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 20 18:30:07 crc restorecon[4746]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Jan 20 18:30:07 crc restorecon[4746]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 20 18:30:07 crc restorecon[4746]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 20 18:30:07 crc restorecon[4746]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 20 18:30:07 crc restorecon[4746]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 20 18:30:07 crc restorecon[4746]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 20 18:30:07 crc restorecon[4746]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 20 18:30:07 crc restorecon[4746]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 20 18:30:07 crc restorecon[4746]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Jan 20 18:30:07 crc restorecon[4746]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 20 18:30:07 crc restorecon[4746]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 20 18:30:07 crc restorecon[4746]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 20 18:30:07 crc restorecon[4746]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Jan 20 18:30:07 crc kubenswrapper[4773]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 20 18:30:07 crc kubenswrapper[4773]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Jan 20 18:30:07 crc kubenswrapper[4773]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 20 18:30:07 crc kubenswrapper[4773]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Jan 20 18:30:07 crc kubenswrapper[4773]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Jan 20 18:30:07 crc kubenswrapper[4773]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.226299 4773 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.230138 4773 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.230407 4773 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.230412 4773 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.230416 4773 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.230420 4773 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.230423 4773 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.230428 4773 feature_gate.go:330] unrecognized feature gate: PinnedImages Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.230433 4773 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.230437 4773 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 
18:30:07.230441 4773 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.230444 4773 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.230449 4773 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.230454 4773 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.230458 4773 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.230462 4773 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.230465 4773 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.230469 4773 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.230534 4773 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.230679 4773 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.230685 4773 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.230689 4773 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.230692 4773 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.230696 4773 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.230700 4773 feature_gate.go:330] unrecognized feature gate: SignatureStores
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.230704 4773 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.230709 4773 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.231419 4773 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.231436 4773 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.231448 4773 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.231461 4773 feature_gate.go:330] unrecognized feature gate: NewOLM
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.231474 4773 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.231502 4773 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.231512 4773 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.231525 4773 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.231536 4773 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.231548 4773 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.231559 4773 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.231569 4773 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.231593 4773 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.231603 4773 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.231614 4773 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.231624 4773 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.231644 4773 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.231654 4773 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.231666 4773 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.231701 4773 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.231713 4773 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.231718 4773 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.231724 4773 feature_gate.go:330] unrecognized feature gate: OVNObservability
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.231734 4773 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.231740 4773 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.231745 4773 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.231750 4773 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.231754 4773 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.231759 4773 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.231763 4773 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.231766 4773 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.231771 4773 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.231777 4773 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.231782 4773 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.231786 4773 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.231794 4773 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.231800 4773 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.231805 4773 feature_gate.go:330] unrecognized feature gate: Example
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.231809 4773 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.231813 4773 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.231817 4773 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.231821 4773 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.231826 4773 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.231830 4773 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.231834 4773 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.232688 4773 flags.go:64] FLAG: --address="0.0.0.0"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.232786 4773 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.232814 4773 flags.go:64] FLAG: --anonymous-auth="true"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.232845 4773 flags.go:64] FLAG: --application-metrics-count-limit="100"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.232890 4773 flags.go:64] FLAG: --authentication-token-webhook="false"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.232907 4773 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.232962 4773 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.232981 4773 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.232994 4773 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.233009 4773 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.233024 4773 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.233079 4773 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.233093 4773 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.233106 4773 flags.go:64] FLAG: --cgroup-root=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.233120 4773 flags.go:64] FLAG: --cgroups-per-qos="true"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.233134 4773 flags.go:64] FLAG: --client-ca-file=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.233147 4773 flags.go:64] FLAG: --cloud-config=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.233159 4773 flags.go:64] FLAG: --cloud-provider=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.233171 4773 flags.go:64] FLAG: --cluster-dns="[]"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.233189 4773 flags.go:64] FLAG: --cluster-domain=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.233202 4773 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.233216 4773 flags.go:64] FLAG: --config-dir=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.233229 4773 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.233242 4773 flags.go:64] FLAG: --container-log-max-files="5"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.233258 4773 flags.go:64] FLAG: --container-log-max-size="10Mi"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.233272 4773 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.233285 4773 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.233298 4773 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.233310 4773 flags.go:64] FLAG: --contention-profiling="false"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.233376 4773 flags.go:64] FLAG: --cpu-cfs-quota="true"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.233393 4773 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.233408 4773 flags.go:64] FLAG: --cpu-manager-policy="none"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.233421 4773 flags.go:64] FLAG: --cpu-manager-policy-options=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.233440 4773 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.233453 4773 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.233467 4773 flags.go:64] FLAG: --enable-debugging-handlers="true"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.233479 4773 flags.go:64] FLAG: --enable-load-reader="false"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.233492 4773 flags.go:64] FLAG: --enable-server="true"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.233503 4773 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.233523 4773 flags.go:64] FLAG: --event-burst="100"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.233536 4773 flags.go:64] FLAG: --event-qps="50"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.233552 4773 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.233565 4773 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.233578 4773 flags.go:64] FLAG: --eviction-hard=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.233594 4773 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.233607 4773 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.233619 4773 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.233632 4773 flags.go:64] FLAG: --eviction-soft=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.233644 4773 flags.go:64] FLAG: --eviction-soft-grace-period=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.233656 4773 flags.go:64] FLAG: --exit-on-lock-contention="false"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.233669 4773 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.233683 4773 flags.go:64] FLAG: --experimental-mounter-path=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.233695 4773 flags.go:64] FLAG: --fail-cgroupv1="false"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.233707 4773 flags.go:64] FLAG: --fail-swap-on="true"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.233719 4773 flags.go:64] FLAG: --feature-gates=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.233735 4773 flags.go:64] FLAG: --file-check-frequency="20s"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.233749 4773 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.233763 4773 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.233776 4773 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.233789 4773 flags.go:64] FLAG: --healthz-port="10248"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.233806 4773 flags.go:64] FLAG: --help="false"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.233820 4773 flags.go:64] FLAG: --hostname-override=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.233834 4773 flags.go:64] FLAG: --housekeeping-interval="10s"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.233847 4773 flags.go:64] FLAG: --http-check-frequency="20s"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.233860 4773 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.233872 4773 flags.go:64] FLAG: --image-credential-provider-config=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.233882 4773 flags.go:64] FLAG: --image-gc-high-threshold="85"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.233892 4773 flags.go:64] FLAG: --image-gc-low-threshold="80"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.233902 4773 flags.go:64] FLAG: --image-service-endpoint=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.233912 4773 flags.go:64] FLAG: --kernel-memcg-notification="false"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.233921 4773 flags.go:64] FLAG: --kube-api-burst="100"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.233974 4773 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.233989 4773 flags.go:64] FLAG: --kube-api-qps="50"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.234001 4773 flags.go:64] FLAG: --kube-reserved=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.234015 4773 flags.go:64] FLAG: --kube-reserved-cgroup=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.234027 4773 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.234040 4773 flags.go:64] FLAG: --kubelet-cgroups=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.234053 4773 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.234066 4773 flags.go:64] FLAG: --lock-file=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.234078 4773 flags.go:64] FLAG: --log-cadvisor-usage="false"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.234092 4773 flags.go:64] FLAG: --log-flush-frequency="5s"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.234102 4773 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.234134 4773 flags.go:64] FLAG: --log-json-split-stream="false"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.234144 4773 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.234154 4773 flags.go:64] FLAG: --log-text-split-stream="false"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.234164 4773 flags.go:64] FLAG: --logging-format="text"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.234174 4773 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.234184 4773 flags.go:64] FLAG: --make-iptables-util-chains="true"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.234197 4773 flags.go:64] FLAG: --manifest-url=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.234210 4773 flags.go:64] FLAG: --manifest-url-header=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.234228 4773 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.234241 4773 flags.go:64] FLAG: --max-open-files="1000000"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.234257 4773 flags.go:64] FLAG: --max-pods="110"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.234273 4773 flags.go:64] FLAG: --maximum-dead-containers="-1"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.234286 4773 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.234299 4773 flags.go:64] FLAG: --memory-manager-policy="None"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.234312 4773 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.234325 4773 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.234339 4773 flags.go:64] FLAG: --node-ip="192.168.126.11"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.234353 4773 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.234394 4773 flags.go:64] FLAG: --node-status-max-images="50"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.234408 4773 flags.go:64] FLAG: --node-status-update-frequency="10s"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.234422 4773 flags.go:64] FLAG: --oom-score-adj="-999"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.234435 4773 flags.go:64] FLAG: --pod-cidr=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.234448 4773 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.234467 4773 flags.go:64] FLAG: --pod-manifest-path=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.234480 4773 flags.go:64] FLAG: --pod-max-pids="-1"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.234493 4773 flags.go:64] FLAG: --pods-per-core="0"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.234506 4773 flags.go:64] FLAG: --port="10250"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.234519 4773 flags.go:64] FLAG: --protect-kernel-defaults="false"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.234532 4773 flags.go:64] FLAG: --provider-id=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.234545 4773 flags.go:64] FLAG: --qos-reserved=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.234558 4773 flags.go:64] FLAG: --read-only-port="10255"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.234572 4773 flags.go:64] FLAG: --register-node="true"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.234585 4773 flags.go:64] FLAG: --register-schedulable="true"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.234598 4773 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.234621 4773 flags.go:64] FLAG: --registry-burst="10"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.234634 4773 flags.go:64] FLAG: --registry-qps="5"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.234647 4773 flags.go:64] FLAG: --reserved-cpus=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.234662 4773 flags.go:64] FLAG: --reserved-memory=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.234679 4773 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.234691 4773 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.234704 4773 flags.go:64] FLAG: --rotate-certificates="false"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.234716 4773 flags.go:64] FLAG: --rotate-server-certificates="false"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.234728 4773 flags.go:64] FLAG: --runonce="false"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.234741 4773 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.234754 4773 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.234767 4773 flags.go:64] FLAG: --seccomp-default="false"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.234780 4773 flags.go:64] FLAG: --serialize-image-pulls="true"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.234792 4773 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.234805 4773 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.234819 4773 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.234832 4773 flags.go:64] FLAG: --storage-driver-password="root"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.234845 4773 flags.go:64] FLAG: --storage-driver-secure="false"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.234858 4773 flags.go:64] FLAG: --storage-driver-table="stats"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.234870 4773 flags.go:64] FLAG: --storage-driver-user="root"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.234887 4773 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.234901 4773 flags.go:64] FLAG: --sync-frequency="1m0s"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.234914 4773 flags.go:64] FLAG: --system-cgroups=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.234969 4773 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.234995 4773 flags.go:64] FLAG: --system-reserved-cgroup=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.235008 4773 flags.go:64] FLAG: --tls-cert-file=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.235021 4773 flags.go:64] FLAG: --tls-cipher-suites="[]"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.235037 4773 flags.go:64] FLAG: --tls-min-version=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.235051 4773 flags.go:64] FLAG: --tls-private-key-file=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.235063 4773 flags.go:64] FLAG: --topology-manager-policy="none"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.235076 4773 flags.go:64] FLAG: --topology-manager-policy-options=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.235088 4773 flags.go:64] FLAG: --topology-manager-scope="container"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.235102 4773 flags.go:64] FLAG: --v="2"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.235120 4773 flags.go:64] FLAG: --version="false"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.235150 4773 flags.go:64] FLAG: --vmodule=""
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.235167 4773 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.235181 4773 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.235460 4773 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.235476 4773 feature_gate.go:330] unrecognized feature gate: SignatureStores
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.235487 4773 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.235497 4773 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.235507 4773 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.235519 4773 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.235530 4773 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.235541 4773 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.235552 4773 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.235563 4773 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.235574 4773 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.235602 4773 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.235615 4773 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.235625 4773 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.235635 4773 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.235645 4773 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.235656 4773 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.235666 4773 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.235677 4773 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.235690 4773 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.235718 4773 feature_gate.go:330] unrecognized feature gate: NewOLM
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.235729 4773 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.235740 4773 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.235752 4773 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.235763 4773 feature_gate.go:330] unrecognized feature gate: Example
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.235774 4773 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.235784 4773 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.235795 4773 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.235806 4773 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.235816 4773 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.235828 4773 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.235841 4773 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.235851 4773 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.235863 4773 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.235873 4773 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.235884 4773 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.235894 4773 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.235904 4773 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.235951 4773 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.235961 4773 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.235969 4773 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.235978 4773 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.235986 4773 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.235994 4773 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.236003 4773 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.236016 4773 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.236027 4773 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.236037 4773 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.236045 4773 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.236057 4773 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.236068 4773 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.236077 4773 feature_gate.go:330] unrecognized feature gate: OVNObservability
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.236088 4773 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.236100 4773 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.236109 4773 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.236120 4773 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.236129 4773 feature_gate.go:330] unrecognized feature gate: PinnedImages
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.236140 4773 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.236151 4773 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.236162 4773 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.236173 4773 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.236182 4773 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.236192 4773 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.236202 4773 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.236211 4773 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.236219 4773 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.236228 4773 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.236236 4773 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.236244 4773 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.236253 4773 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.236261 4773 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.236294 4773 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.247702 4773 server.go:491] "Kubelet version" kubeletVersion="v1.31.5"
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.247767 4773 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.247909 4773 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.247925 4773 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.247978 4773 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.247991 4773 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.248002 4773 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.248013 4773 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.248022 4773 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.248030 4773 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.248041 4773 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.248057 4773 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.248070 4773 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.248082 4773 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.248093 4773 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.248104 4773 feature_gate.go:330] unrecognized feature gate: NewOLM
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.248118 4773 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.248129 4773 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.248139 4773 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.248147 4773 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.248155 4773 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.248166 4773 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.248176 4773 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.248186 4773 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.248196 4773 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.248206 4773 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.248216 4773 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.248226 4773 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.248236 4773 feature_gate.go:330] unrecognized feature gate: SignatureStores Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.248245 4773 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.248255 4773 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.248265 4773 feature_gate.go:330] unrecognized 
feature gate: VSphereControlPlaneMachineSet Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.248275 4773 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.248285 4773 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.248295 4773 feature_gate.go:330] unrecognized feature gate: PinnedImages Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.248305 4773 feature_gate.go:330] unrecognized feature gate: Example Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.248320 4773 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.248334 4773 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.248347 4773 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.248357 4773 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.248367 4773 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.248376 4773 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.248387 4773 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.248396 4773 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.248406 4773 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.248416 4773 feature_gate.go:330] unrecognized feature gate: 
AdminNetworkPolicy Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.248427 4773 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.248437 4773 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.248447 4773 feature_gate.go:330] unrecognized feature gate: OVNObservability Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.248457 4773 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.248468 4773 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.248478 4773 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.248488 4773 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.248498 4773 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.248508 4773 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.248519 4773 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.248529 4773 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.248539 4773 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.248550 4773 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.248560 4773 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Jan 20 18:30:07 crc kubenswrapper[4773]: 
W0120 18:30:07.248570 4773 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.248580 4773 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.248590 4773 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.248600 4773 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.248609 4773 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.248620 4773 feature_gate.go:330] unrecognized feature gate: GatewayAPI Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.248630 4773 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.248639 4773 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.248650 4773 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.248659 4773 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.248669 4773 feature_gate.go:330] unrecognized feature gate: PlatformOperators Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.248679 4773 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.248690 4773 feature_gate.go:330] unrecognized feature gate: InsightsConfig Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.248704 4773 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true 
MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.249038 4773 feature_gate.go:330] unrecognized feature gate: GatewayAPI Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.249056 4773 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.249066 4773 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.249075 4773 feature_gate.go:330] unrecognized feature gate: SignatureStores Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.249085 4773 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.249093 4773 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.249101 4773 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.249109 4773 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.249117 4773 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.249128 4773 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.249138 4773 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.249149 4773 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. 
It will be removed in a future release. Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.249160 4773 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.249170 4773 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.249179 4773 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.249189 4773 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.249199 4773 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.249207 4773 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.249216 4773 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.249224 4773 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.249233 4773 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.249241 4773 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.249249 4773 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.249257 4773 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.249265 4773 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.249273 4773 feature_gate.go:330] unrecognized 
feature gate: MachineAPIProviderOpenStack Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.249281 4773 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.249288 4773 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.249296 4773 feature_gate.go:330] unrecognized feature gate: Example Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.249304 4773 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.249314 4773 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.249324 4773 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.249333 4773 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.249341 4773 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.249350 4773 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.249359 4773 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.249367 4773 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.249375 4773 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.249383 4773 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.249391 4773 feature_gate.go:330] unrecognized feature gate: 
VSphereMultiVCenters Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.249399 4773 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.249407 4773 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.249420 4773 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.249427 4773 feature_gate.go:330] unrecognized feature gate: OVNObservability Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.249435 4773 feature_gate.go:330] unrecognized feature gate: InsightsConfig Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.249443 4773 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.249451 4773 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.249459 4773 feature_gate.go:330] unrecognized feature gate: PinnedImages Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.249468 4773 feature_gate.go:330] unrecognized feature gate: NewOLM Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.249478 4773 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.249497 4773 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.249511 4773 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.249521 4773 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.249531 4773 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.249540 4773 feature_gate.go:330] 
unrecognized feature gate: VSphereControlPlaneMachineSet Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.249551 4773 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.249560 4773 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.249569 4773 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.249578 4773 feature_gate.go:330] unrecognized feature gate: PlatformOperators Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.249588 4773 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.249597 4773 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.249607 4773 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.249617 4773 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.249625 4773 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.249633 4773 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.249640 4773 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.249648 4773 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.249656 4773 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.249664 4773 feature_gate.go:330] unrecognized feature gate: 
NutanixMultiSubnets Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.249672 4773 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.249682 4773 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.249697 4773 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.250296 4773 server.go:940] "Client rotation is on, will bootstrap in background" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.254649 4773 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.254787 4773 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.255633 4773 server.go:997] "Starting client certificate rotation" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.255669 4773 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.256030 4773 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-11-15 14:15:46.233543269 +0000 UTC Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.256110 4773 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.264036 4773 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Jan 20 18:30:07 crc kubenswrapper[4773]: E0120 18:30:07.265343 4773 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.39:6443: connect: connection refused" logger="UnhandledError" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.266859 4773 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.279736 4773 log.go:25] "Validated CRI v1 runtime API" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.304594 4773 log.go:25] "Validated CRI v1 image API" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.306667 4773 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.309725 4773 
fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-01-20-18-25-41-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.309787 4773 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.339800 4773 manager.go:217] Machine: {Timestamp:2026-01-20 18:30:07.337432553 +0000 UTC m=+0.259245647 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:3435f284-a40d-4f32-a1fa-55cd3339f30e BootID:0ac96020-64a6-43b4-8bf4-975de5898510 Filesystems:[{Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 
Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:54:e4:e8 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:54:e4:e8 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:8a:15:d7 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:18:37:9f Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:c3:22:ee Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:a4:7d:ff Speed:-1 Mtu:1496} {Name:ens7.23 MacAddress:52:54:00:f5:c2:70 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:be:92:14:47:cd:df Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:8a:c3:bc:46:41:f4 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 
Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 
18:30:07.340251 4773 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.340478 4773 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.341285 4773 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.341622 4773 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.341691 4773 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}
,{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.342193 4773 topology_manager.go:138] "Creating topology manager with none policy" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.342214 4773 container_manager_linux.go:303] "Creating device plugin manager" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.342495 4773 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.342547 4773 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.342991 4773 state_mem.go:36] "Initialized new in-memory state store" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.343170 4773 server.go:1245] "Using root directory" path="/var/lib/kubelet" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.344620 4773 kubelet.go:418] "Attempting to sync node with API server" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.344662 4773 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.344711 4773 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.344739 4773 kubelet.go:324] "Adding apiserver pod source" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.344819 
4773 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.347239 4773 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.347835 4773 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.348805 4773 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.349405 4773 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.349435 4773 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.349445 4773 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.349455 4773 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.349471 4773 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.349480 4773 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.349490 4773 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.349505 4773 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.349517 4773 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.349527 4773 plugins.go:603] 
"Loaded volume plugin" pluginName="kubernetes.io/configmap" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.349540 4773 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.349549 4773 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.349774 4773 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.350319 4773 server.go:1280] "Started kubelet" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.352181 4773 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.39:6443: connect: connection refused Jan 20 18:30:07 crc systemd[1]: Started Kubernetes Kubelet. Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.352518 4773 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.352535 4773 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.353402 4773 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.354371 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.354448 4773 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.354493 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-14 07:56:53.176200161 +0000 UTC Jan 20 18:30:07 crc 
kubenswrapper[4773]: I0120 18:30:07.354622 4773 volume_manager.go:287] "The desired_state_of_world populator starts" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.354631 4773 volume_manager.go:289] "Starting Kubelet Volume Manager" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.354721 4773 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Jan 20 18:30:07 crc kubenswrapper[4773]: E0120 18:30:07.354826 4773 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.375098 4773 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.39:6443: connect: connection refused Jan 20 18:30:07 crc kubenswrapper[4773]: E0120 18:30:07.375178 4773 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.39:6443: connect: connection refused" logger="UnhandledError" Jan 20 18:30:07 crc kubenswrapper[4773]: E0120 18:30:07.375091 4773 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.39:6443: connect: connection refused" interval="200ms" Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.375283 4773 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.39:6443: connect: connection refused Jan 20 18:30:07 crc kubenswrapper[4773]: E0120 
18:30:07.375349 4773 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.39:6443: connect: connection refused" logger="UnhandledError" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.377489 4773 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.377560 4773 factory.go:55] Registering systemd factory Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.377582 4773 factory.go:221] Registration of the systemd container factory successfully Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.378025 4773 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.39:6443: connect: connection refused Jan 20 18:30:07 crc kubenswrapper[4773]: E0120 18:30:07.378221 4773 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.39:6443: connect: connection refused" logger="UnhandledError" Jan 20 18:30:07 crc kubenswrapper[4773]: E0120 18:30:07.376673 4773 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.39:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.188c83ecf24addc4 default 0 0001-01-01 
00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-20 18:30:07.350275524 +0000 UTC m=+0.272088558,LastTimestamp:2026-01-20 18:30:07.350275524 +0000 UTC m=+0.272088558,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.380093 4773 server.go:460] "Adding debug handlers to kubelet server" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.380852 4773 factory.go:153] Registering CRI-O factory Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.380958 4773 factory.go:221] Registration of the crio container factory successfully Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.381040 4773 factory.go:103] Registering Raw factory Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.381106 4773 manager.go:1196] Started watching for new ooms in manager Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.382795 4773 manager.go:319] Starting recovery of all containers Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.390674 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.390768 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.390799 4773 reconstruct.go:130] "Volume 
is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.390825 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.390853 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.390877 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.390905 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.390962 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.390994 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.391022 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.391048 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.391078 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.391107 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.391138 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.391166 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" 
volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.391193 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.391318 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.391349 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.391376 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.391424 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.391452 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" 
volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.391480 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.391507 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.391534 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.391560 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.391587 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.391621 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.391659 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.391694 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.391720 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.391768 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.391796 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.391822 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.391850 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.391875 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.391898 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.391924 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.391982 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.392007 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" 
volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.392031 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.392056 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.392082 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.392107 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.392133 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.392160 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" 
seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.392186 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.392213 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.392242 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.392266 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.392291 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.392317 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Jan 20 18:30:07 crc 
kubenswrapper[4773]: I0120 18:30:07.392344 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.392379 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.392406 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.392434 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.392465 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.392492 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.392519 4773 reconstruct.go:130] "Volume is 
marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.392546 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.392572 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.392597 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.392621 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.392646 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.392672 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.392697 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.392727 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.392752 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.392809 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.392836 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.392865 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.392890 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.392915 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.393090 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.393129 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.393157 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.393187 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" 
volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.393215 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.393240 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.393266 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.393294 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.393319 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.393343 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" 
volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.393435 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.393463 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.393489 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.393516 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.393542 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.393569 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.393596 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.393624 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.393649 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.393675 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.393705 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.393730 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" 
volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.393756 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.393784 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.393809 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.393835 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.393859 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.393884 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" 
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.394831 4773 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.394897 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.394961 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.394992 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.395019 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.395059 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" 
volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.395090 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.395184 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.395228 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.395258 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.395286 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.395316 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" 
volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.395343 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.395369 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.395397 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.395424 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.395448 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.395474 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" 
seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.395498 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.395523 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.395549 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.395576 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.395602 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.395627 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" 
Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.395653 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.395680 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.395706 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.395735 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.395761 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.395786 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.395811 4773 reconstruct.go:130] 
"Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.395836 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.395863 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.395887 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.395912 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.395975 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.396003 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.396028 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.396058 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.396083 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.396113 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.396140 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.396164 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" 
volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.396192 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.396218 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.396243 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.396268 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.396290 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.396315 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" 
volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.396340 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.396365 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.396390 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.396418 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.396445 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.396471 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" 
seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.396500 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.396524 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.396549 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.396573 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.396599 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.396626 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: 
I0120 18:30:07.396651 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.396677 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.396702 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.396728 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.396753 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.396777 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.396802 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.396830 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.396857 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.396882 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.396910 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.396971 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.396998 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" 
volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.397024 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.397051 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.397076 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.397100 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.397124 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.397155 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" 
volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.397183 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.397210 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.397239 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.397266 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.397292 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.397316 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.397341 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.397365 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.397391 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.397414 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.397441 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.397469 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" 
volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.397495 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.397520 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.397547 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.397574 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.397597 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.397620 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" 
volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.397649 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.397674 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.397699 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.397725 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.397751 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.397774 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" 
volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.397797 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.397822 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.397847 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.397875 4773 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.397899 4773 reconstruct.go:97] "Volume reconstruction finished" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.397917 4773 reconciler.go:26] "Reconciler: start to sync state" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.415196 4773 manager.go:324] Recovery completed Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.425136 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.427746 4773 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.427812 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.427847 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.428762 4773 cpu_manager.go:225] "Starting CPU manager" policy="none" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.428792 4773 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.428825 4773 state_mem.go:36] "Initialized new in-memory state store" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.442348 4773 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.445684 4773 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.445747 4773 status_manager.go:217] "Starting to sync pod status with apiserver" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.445785 4773 kubelet.go:2335] "Starting kubelet main sync loop" Jan 20 18:30:07 crc kubenswrapper[4773]: E0120 18:30:07.445857 4773 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jan 20 18:30:07 crc kubenswrapper[4773]: W0120 18:30:07.447019 4773 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.39:6443: connect: connection refused Jan 20 18:30:07 crc kubenswrapper[4773]: E0120 18:30:07.447117 4773 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.39:6443: connect: connection refused" logger="UnhandledError" Jan 20 18:30:07 crc kubenswrapper[4773]: E0120 18:30:07.455047 4773 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.543117 4773 policy_none.go:49] "None policy: Start" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.544801 4773 memory_manager.go:170] "Starting memorymanager" policy="None" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.544830 4773 state_mem.go:35] "Initializing new in-memory state store" Jan 20 18:30:07 crc kubenswrapper[4773]: E0120 18:30:07.546425 4773 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Jan 20 18:30:07 crc kubenswrapper[4773]: E0120 
18:30:07.555186 4773 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Jan 20 18:30:07 crc kubenswrapper[4773]: E0120 18:30:07.576401 4773 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.39:6443: connect: connection refused" interval="400ms" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.617023 4773 manager.go:334] "Starting Device Plugin manager" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.617251 4773 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.617281 4773 server.go:79] "Starting device plugin registration server" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.617849 4773 eviction_manager.go:189] "Eviction manager: starting control loop" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.617877 4773 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.618166 4773 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.618294 4773 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.618308 4773 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jan 20 18:30:07 crc kubenswrapper[4773]: E0120 18:30:07.636438 4773 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.718335 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 
18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.719787 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.719836 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.719847 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.719877 4773 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 20 18:30:07 crc kubenswrapper[4773]: E0120 18:30:07.720430 4773 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.39:6443: connect: connection refused" node="crc" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.746711 4773 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc","openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc"] Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.746846 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.748094 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.748119 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.748128 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:07 crc 
kubenswrapper[4773]: I0120 18:30:07.748226 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.748567 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.748594 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.749137 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.749155 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.749163 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.749242 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.749630 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.749658 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.750325 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.750346 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.750356 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.750437 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.750747 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.750769 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.751048 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.751064 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.751071 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.751309 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.751322 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.751329 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.751732 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.751781 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.751793 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.751939 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 18:30:07 crc 
kubenswrapper[4773]: I0120 18:30:07.752162 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.752237 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.752859 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.752889 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.752900 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.753204 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.753263 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.753272 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.753295 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.753307 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.753278 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.753554 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.753585 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.754401 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.754431 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.754443 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.802904 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.803000 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.803034 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 20 18:30:07 crc kubenswrapper[4773]: 
I0120 18:30:07.803069 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.803138 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.803182 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.803213 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.803244 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.803265 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.803338 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.803401 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.803426 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.803449 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.803514 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.803545 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.904684 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.904828 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.904870 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.904904 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.904997 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.905031 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.905064 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.905098 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.905101 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.905186 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.905219 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.905134 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.905136 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.905191 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.905126 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: 
\"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.905048 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.905319 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.905385 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.905343 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.905294 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 20 18:30:07 crc 
kubenswrapper[4773]: I0120 18:30:07.905475 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.905523 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.905569 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.905609 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.905665 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.905709 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: 
\"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.905729 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.905772 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.905772 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.905921 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.920676 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.922292 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.922348 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.922368 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:07 crc kubenswrapper[4773]: I0120 18:30:07.922409 4773 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 20 18:30:07 crc kubenswrapper[4773]: E0120 18:30:07.923073 4773 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.39:6443: connect: connection refused" node="crc" Jan 20 18:30:07 crc kubenswrapper[4773]: E0120 18:30:07.978334 4773 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.39:6443: connect: connection refused" interval="800ms" Jan 20 18:30:08 crc kubenswrapper[4773]: I0120 18:30:08.079195 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 20 18:30:08 crc kubenswrapper[4773]: I0120 18:30:08.101649 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 20 18:30:08 crc kubenswrapper[4773]: W0120 18:30:08.115519 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-572f120883df79b88c9a03771f9a45f9e819d74343812d8cc501879c97dd03f9 WatchSource:0}: Error finding container 572f120883df79b88c9a03771f9a45f9e819d74343812d8cc501879c97dd03f9: Status 404 returned error can't find the container with id 572f120883df79b88c9a03771f9a45f9e819d74343812d8cc501879c97dd03f9 Jan 20 18:30:08 crc kubenswrapper[4773]: I0120 18:30:08.116857 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 20 18:30:08 crc kubenswrapper[4773]: W0120 18:30:08.123260 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-be7dbf3ce998582a1427d2962be052de94e941e207347f47e0ad82768a201508 WatchSource:0}: Error finding container be7dbf3ce998582a1427d2962be052de94e941e207347f47e0ad82768a201508: Status 404 returned error can't find the container with id be7dbf3ce998582a1427d2962be052de94e941e207347f47e0ad82768a201508 Jan 20 18:30:08 crc kubenswrapper[4773]: I0120 18:30:08.124503 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Jan 20 18:30:08 crc kubenswrapper[4773]: I0120 18:30:08.133336 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 18:30:08 crc kubenswrapper[4773]: W0120 18:30:08.139493 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-495b390ee7050af340df6ba6dd2ef38ed2b01109f925c91fb0986e63d07f8c10 WatchSource:0}: Error finding container 495b390ee7050af340df6ba6dd2ef38ed2b01109f925c91fb0986e63d07f8c10: Status 404 returned error can't find the container with id 495b390ee7050af340df6ba6dd2ef38ed2b01109f925c91fb0986e63d07f8c10 Jan 20 18:30:08 crc kubenswrapper[4773]: W0120 18:30:08.156580 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-8be970e5c4b390912a7d6dccddc40c2dcc075837d9896820f2decbb38e6dfe25 WatchSource:0}: Error finding container 8be970e5c4b390912a7d6dccddc40c2dcc075837d9896820f2decbb38e6dfe25: Status 404 returned error can't find the container with id 
8be970e5c4b390912a7d6dccddc40c2dcc075837d9896820f2decbb38e6dfe25 Jan 20 18:30:08 crc kubenswrapper[4773]: W0120 18:30:08.163480 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-ce9fff300cded6700517f9c9e95aa1c9fcdaaf9045c0979e3d9523ee0f30d249 WatchSource:0}: Error finding container ce9fff300cded6700517f9c9e95aa1c9fcdaaf9045c0979e3d9523ee0f30d249: Status 404 returned error can't find the container with id ce9fff300cded6700517f9c9e95aa1c9fcdaaf9045c0979e3d9523ee0f30d249 Jan 20 18:30:08 crc kubenswrapper[4773]: W0120 18:30:08.304892 4773 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.39:6443: connect: connection refused Jan 20 18:30:08 crc kubenswrapper[4773]: E0120 18:30:08.304994 4773 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.39:6443: connect: connection refused" logger="UnhandledError" Jan 20 18:30:08 crc kubenswrapper[4773]: I0120 18:30:08.323754 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 18:30:08 crc kubenswrapper[4773]: I0120 18:30:08.325190 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:08 crc kubenswrapper[4773]: I0120 18:30:08.325236 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:08 crc kubenswrapper[4773]: I0120 18:30:08.325249 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 
20 18:30:08 crc kubenswrapper[4773]: I0120 18:30:08.325280 4773 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 20 18:30:08 crc kubenswrapper[4773]: E0120 18:30:08.325692 4773 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.39:6443: connect: connection refused" node="crc" Jan 20 18:30:08 crc kubenswrapper[4773]: I0120 18:30:08.353576 4773 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.39:6443: connect: connection refused Jan 20 18:30:08 crc kubenswrapper[4773]: I0120 18:30:08.354602 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-31 08:26:34.395298313 +0000 UTC Jan 20 18:30:08 crc kubenswrapper[4773]: I0120 18:30:08.449568 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"be7dbf3ce998582a1427d2962be052de94e941e207347f47e0ad82768a201508"} Jan 20 18:30:08 crc kubenswrapper[4773]: I0120 18:30:08.451004 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"572f120883df79b88c9a03771f9a45f9e819d74343812d8cc501879c97dd03f9"} Jan 20 18:30:08 crc kubenswrapper[4773]: I0120 18:30:08.454523 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ce9fff300cded6700517f9c9e95aa1c9fcdaaf9045c0979e3d9523ee0f30d249"} Jan 20 18:30:08 crc kubenswrapper[4773]: I0120 18:30:08.458233 4773 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"8be970e5c4b390912a7d6dccddc40c2dcc075837d9896820f2decbb38e6dfe25"} Jan 20 18:30:08 crc kubenswrapper[4773]: I0120 18:30:08.458996 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"495b390ee7050af340df6ba6dd2ef38ed2b01109f925c91fb0986e63d07f8c10"} Jan 20 18:30:08 crc kubenswrapper[4773]: W0120 18:30:08.643195 4773 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.39:6443: connect: connection refused Jan 20 18:30:08 crc kubenswrapper[4773]: E0120 18:30:08.643262 4773 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.39:6443: connect: connection refused" logger="UnhandledError" Jan 20 18:30:08 crc kubenswrapper[4773]: W0120 18:30:08.752463 4773 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.39:6443: connect: connection refused Jan 20 18:30:08 crc kubenswrapper[4773]: E0120 18:30:08.752566 4773 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.39:6443: connect: connection refused" 
logger="UnhandledError" Jan 20 18:30:08 crc kubenswrapper[4773]: W0120 18:30:08.764688 4773 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.39:6443: connect: connection refused Jan 20 18:30:08 crc kubenswrapper[4773]: E0120 18:30:08.764773 4773 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.39:6443: connect: connection refused" logger="UnhandledError" Jan 20 18:30:08 crc kubenswrapper[4773]: E0120 18:30:08.779734 4773 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.39:6443: connect: connection refused" interval="1.6s" Jan 20 18:30:09 crc kubenswrapper[4773]: I0120 18:30:09.125765 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 18:30:09 crc kubenswrapper[4773]: I0120 18:30:09.127891 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:09 crc kubenswrapper[4773]: I0120 18:30:09.127956 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:09 crc kubenswrapper[4773]: I0120 18:30:09.127969 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:09 crc kubenswrapper[4773]: I0120 18:30:09.127997 4773 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 20 18:30:09 crc kubenswrapper[4773]: E0120 18:30:09.128510 4773 
kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.39:6443: connect: connection refused" node="crc" Jan 20 18:30:09 crc kubenswrapper[4773]: I0120 18:30:09.338402 4773 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Jan 20 18:30:09 crc kubenswrapper[4773]: E0120 18:30:09.340286 4773 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.39:6443: connect: connection refused" logger="UnhandledError" Jan 20 18:30:09 crc kubenswrapper[4773]: I0120 18:30:09.353488 4773 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.39:6443: connect: connection refused Jan 20 18:30:09 crc kubenswrapper[4773]: I0120 18:30:09.355583 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 17:38:31.360476695 +0000 UTC Jan 20 18:30:09 crc kubenswrapper[4773]: I0120 18:30:09.465789 4773 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0" exitCode=0 Jan 20 18:30:09 crc kubenswrapper[4773]: I0120 18:30:09.465885 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0"} Jan 20 18:30:09 crc kubenswrapper[4773]: I0120 18:30:09.466015 4773 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 18:30:09 crc kubenswrapper[4773]: I0120 18:30:09.467962 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:09 crc kubenswrapper[4773]: I0120 18:30:09.468011 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:09 crc kubenswrapper[4773]: I0120 18:30:09.468031 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:09 crc kubenswrapper[4773]: I0120 18:30:09.468669 4773 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="a30a34145c90c9e4085aa6a6d5e9eb324c01b4b0ede8fb29198bef6137035672" exitCode=0 Jan 20 18:30:09 crc kubenswrapper[4773]: I0120 18:30:09.468787 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"a30a34145c90c9e4085aa6a6d5e9eb324c01b4b0ede8fb29198bef6137035672"} Jan 20 18:30:09 crc kubenswrapper[4773]: I0120 18:30:09.468984 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 18:30:09 crc kubenswrapper[4773]: I0120 18:30:09.470335 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:09 crc kubenswrapper[4773]: I0120 18:30:09.470374 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:09 crc kubenswrapper[4773]: I0120 18:30:09.470393 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:09 crc kubenswrapper[4773]: I0120 18:30:09.470799 4773 generic.go:334] "Generic (PLEG): container finished" 
podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="cadc50138b9abc5f9f038da3a5ba2f689019c713027f9cd589b28cee9d3dcfe4" exitCode=0 Jan 20 18:30:09 crc kubenswrapper[4773]: I0120 18:30:09.470903 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 18:30:09 crc kubenswrapper[4773]: I0120 18:30:09.470883 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"cadc50138b9abc5f9f038da3a5ba2f689019c713027f9cd589b28cee9d3dcfe4"} Jan 20 18:30:09 crc kubenswrapper[4773]: I0120 18:30:09.475501 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:09 crc kubenswrapper[4773]: I0120 18:30:09.475548 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:09 crc kubenswrapper[4773]: I0120 18:30:09.475564 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:09 crc kubenswrapper[4773]: I0120 18:30:09.478699 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"29039ffb258d5056b02a3ef4f48c732b8573d9437d032a225b59f0ba95527796"} Jan 20 18:30:09 crc kubenswrapper[4773]: I0120 18:30:09.478765 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"ff759d4098d334dfada2f6a14e8a5d363019e0da19879b5dca11ff48b6273b24"} Jan 20 18:30:09 crc kubenswrapper[4773]: I0120 18:30:09.478778 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"72535f55bb65c332cd4acb4ad0bf8b94a28a2167da3bba993de6ffb1d6dca1cf"} Jan 20 18:30:09 crc kubenswrapper[4773]: I0120 18:30:09.478790 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"c5be06527f116faac5a7ce7b2df804239b382a394e11c56eee69fa24739faab3"} Jan 20 18:30:09 crc kubenswrapper[4773]: I0120 18:30:09.478735 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 18:30:09 crc kubenswrapper[4773]: I0120 18:30:09.479771 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:09 crc kubenswrapper[4773]: I0120 18:30:09.479817 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:09 crc kubenswrapper[4773]: I0120 18:30:09.479835 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:09 crc kubenswrapper[4773]: I0120 18:30:09.480039 4773 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554" exitCode=0 Jan 20 18:30:09 crc kubenswrapper[4773]: I0120 18:30:09.480086 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554"} Jan 20 18:30:09 crc kubenswrapper[4773]: I0120 18:30:09.480192 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 18:30:09 crc kubenswrapper[4773]: I0120 18:30:09.481033 4773 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:09 crc kubenswrapper[4773]: I0120 18:30:09.481159 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:09 crc kubenswrapper[4773]: I0120 18:30:09.481244 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:09 crc kubenswrapper[4773]: I0120 18:30:09.493793 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 18:30:09 crc kubenswrapper[4773]: I0120 18:30:09.495017 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:09 crc kubenswrapper[4773]: I0120 18:30:09.495056 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:09 crc kubenswrapper[4773]: I0120 18:30:09.495069 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:10 crc kubenswrapper[4773]: I0120 18:30:10.353580 4773 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.39:6443: connect: connection refused Jan 20 18:30:10 crc kubenswrapper[4773]: I0120 18:30:10.356593 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-31 18:27:29.505038803 +0000 UTC Jan 20 18:30:10 crc kubenswrapper[4773]: E0120 18:30:10.380507 4773 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.39:6443: connect: connection refused" interval="3.2s" Jan 20 18:30:10 crc 
kubenswrapper[4773]: E0120 18:30:10.433901 4773 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.39:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.188c83ecf24addc4 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-20 18:30:07.350275524 +0000 UTC m=+0.272088558,LastTimestamp:2026-01-20 18:30:07.350275524 +0000 UTC m=+0.272088558,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 20 18:30:10 crc kubenswrapper[4773]: I0120 18:30:10.485422 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"06ac78bba5a5a697d2246518bb04b7503b37c2fafd2369e5826dd02b467715d0"} Jan 20 18:30:10 crc kubenswrapper[4773]: I0120 18:30:10.485464 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"af183a195b31fc8b191dc75db28863447424fb2763e1f51a5f37ce78c4b046e1"} Jan 20 18:30:10 crc kubenswrapper[4773]: I0120 18:30:10.485476 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"05f5db3e67f63a4ed226186eec7eebf95687a59c3d742a7d29b2be3f99543a0b"} Jan 20 18:30:10 crc kubenswrapper[4773]: I0120 18:30:10.485484 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"13ef504628d3252d01c26801cf91a4fabdfbd5d8a78e60e9666beafa709816a7"} Jan 20 18:30:10 crc kubenswrapper[4773]: I0120 18:30:10.486975 4773 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae" exitCode=0 Jan 20 18:30:10 crc kubenswrapper[4773]: I0120 18:30:10.487035 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae"} Jan 20 18:30:10 crc kubenswrapper[4773]: I0120 18:30:10.487081 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 18:30:10 crc kubenswrapper[4773]: I0120 18:30:10.487841 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:10 crc kubenswrapper[4773]: I0120 18:30:10.487871 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:10 crc kubenswrapper[4773]: I0120 18:30:10.487880 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:10 crc kubenswrapper[4773]: I0120 18:30:10.488718 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"cf6d2f31f31a43668f18b56290a5fc01f7f390dd941e4eb69323b5511b6d895f"} Jan 20 18:30:10 crc kubenswrapper[4773]: I0120 18:30:10.488786 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 18:30:10 crc kubenswrapper[4773]: I0120 18:30:10.489713 4773 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:10 crc kubenswrapper[4773]: I0120 18:30:10.489740 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:10 crc kubenswrapper[4773]: I0120 18:30:10.489749 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:10 crc kubenswrapper[4773]: I0120 18:30:10.490792 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"344d1e5f1fbcdf841c83e49f3932e95f086a05522139238c2743cf27c78bb77e"} Jan 20 18:30:10 crc kubenswrapper[4773]: I0120 18:30:10.490815 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"9b8695b8d17ea9a4d8e29018b9be5f70748b76e671e9ce50ce1e4f100f5e370f"} Jan 20 18:30:10 crc kubenswrapper[4773]: I0120 18:30:10.490828 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"7b665d0df790466d0543796901ca4d72cfb93cbaa3c6f751cd7283280636e2a5"} Jan 20 18:30:10 crc kubenswrapper[4773]: I0120 18:30:10.490833 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 18:30:10 crc kubenswrapper[4773]: I0120 18:30:10.491227 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 18:30:10 crc kubenswrapper[4773]: I0120 18:30:10.491412 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:10 crc kubenswrapper[4773]: I0120 18:30:10.491436 4773 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:10 crc kubenswrapper[4773]: I0120 18:30:10.491444 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:10 crc kubenswrapper[4773]: I0120 18:30:10.497915 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:10 crc kubenswrapper[4773]: I0120 18:30:10.497975 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:10 crc kubenswrapper[4773]: I0120 18:30:10.497992 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:10 crc kubenswrapper[4773]: W0120 18:30:10.711315 4773 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.39:6443: connect: connection refused Jan 20 18:30:10 crc kubenswrapper[4773]: E0120 18:30:10.711396 4773 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.39:6443: connect: connection refused" logger="UnhandledError" Jan 20 18:30:10 crc kubenswrapper[4773]: I0120 18:30:10.729730 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 18:30:10 crc kubenswrapper[4773]: I0120 18:30:10.731137 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:10 crc kubenswrapper[4773]: I0120 18:30:10.731173 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:10 crc 
kubenswrapper[4773]: I0120 18:30:10.731183 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:10 crc kubenswrapper[4773]: I0120 18:30:10.731204 4773 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 20 18:30:10 crc kubenswrapper[4773]: E0120 18:30:10.731711 4773 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.39:6443: connect: connection refused" node="crc" Jan 20 18:30:11 crc kubenswrapper[4773]: I0120 18:30:11.356779 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-13 10:32:25.209365633 +0000 UTC Jan 20 18:30:11 crc kubenswrapper[4773]: I0120 18:30:11.495111 4773 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4" exitCode=0 Jan 20 18:30:11 crc kubenswrapper[4773]: I0120 18:30:11.495169 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4"} Jan 20 18:30:11 crc kubenswrapper[4773]: I0120 18:30:11.495192 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 18:30:11 crc kubenswrapper[4773]: I0120 18:30:11.495839 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:11 crc kubenswrapper[4773]: I0120 18:30:11.495865 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:11 crc kubenswrapper[4773]: I0120 18:30:11.495882 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Jan 20 18:30:11 crc kubenswrapper[4773]: I0120 18:30:11.499699 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 18:30:11 crc kubenswrapper[4773]: I0120 18:30:11.500248 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 18:30:11 crc kubenswrapper[4773]: I0120 18:30:11.501735 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"fab6fab567cc35a379572c4ea5fb0f9472f3634fcede213202390e12c5cf6f2f"} Jan 20 18:30:11 crc kubenswrapper[4773]: I0120 18:30:11.501915 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:11 crc kubenswrapper[4773]: I0120 18:30:11.501952 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:11 crc kubenswrapper[4773]: I0120 18:30:11.501971 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:11 crc kubenswrapper[4773]: I0120 18:30:11.502118 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 18:30:11 crc kubenswrapper[4773]: I0120 18:30:11.502365 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:11 crc kubenswrapper[4773]: I0120 18:30:11.502418 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:11 crc kubenswrapper[4773]: I0120 18:30:11.502451 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:11 crc kubenswrapper[4773]: I0120 18:30:11.502089 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 20 18:30:11 crc kubenswrapper[4773]: I0120 18:30:11.503753 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:11 crc kubenswrapper[4773]: I0120 18:30:11.503779 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:11 crc kubenswrapper[4773]: I0120 18:30:11.503791 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:12 crc kubenswrapper[4773]: I0120 18:30:12.223649 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 18:30:12 crc kubenswrapper[4773]: I0120 18:30:12.261989 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 18:30:12 crc kubenswrapper[4773]: I0120 18:30:12.357594 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 17:04:54.606780821 +0000 UTC Jan 20 18:30:12 crc kubenswrapper[4773]: I0120 18:30:12.506633 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"f63cd12ee6d399e1a6f77a55f05a532a2de3b4cd05a5e10a93f521364473a403"} Jan 20 18:30:12 crc kubenswrapper[4773]: I0120 18:30:12.506697 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"1b89a8e0fd81a1ccb6675b46b9a7425fec5158a53cc7e4314068fa737d0becda"} Jan 20 18:30:12 crc kubenswrapper[4773]: I0120 18:30:12.506719 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"17efc7a160f4cb934a3927c68e65b9f1387ed13881b8c9ca676d30c0d241ace0"} Jan 20 18:30:12 crc kubenswrapper[4773]: I0120 18:30:12.506736 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 18:30:12 crc kubenswrapper[4773]: I0120 18:30:12.506751 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 18:30:12 crc kubenswrapper[4773]: I0120 18:30:12.506736 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"de08872865ebda9c9a4e1642d9c4d4f6c709908332ffbd091f992fad7722eca7"} Jan 20 18:30:12 crc kubenswrapper[4773]: I0120 18:30:12.506865 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 18:30:12 crc kubenswrapper[4773]: I0120 18:30:12.507778 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:12 crc kubenswrapper[4773]: I0120 18:30:12.507816 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:12 crc kubenswrapper[4773]: I0120 18:30:12.507832 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:12 crc kubenswrapper[4773]: I0120 18:30:12.507840 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:12 crc kubenswrapper[4773]: I0120 18:30:12.507866 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:12 crc kubenswrapper[4773]: I0120 18:30:12.507893 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 
18:30:13 crc kubenswrapper[4773]: I0120 18:30:13.358513 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-09 09:00:44.866752855 +0000 UTC Jan 20 18:30:13 crc kubenswrapper[4773]: I0120 18:30:13.422318 4773 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Jan 20 18:30:13 crc kubenswrapper[4773]: I0120 18:30:13.520408 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"d78e43bc7b09a1599f98e9e72fd0f49db94102e952a391f687d50abed277ceda"} Jan 20 18:30:13 crc kubenswrapper[4773]: I0120 18:30:13.520489 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 18:30:13 crc kubenswrapper[4773]: I0120 18:30:13.520498 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 18:30:13 crc kubenswrapper[4773]: I0120 18:30:13.521750 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:13 crc kubenswrapper[4773]: I0120 18:30:13.521777 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:13 crc kubenswrapper[4773]: I0120 18:30:13.521789 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:13 crc kubenswrapper[4773]: I0120 18:30:13.522565 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:13 crc kubenswrapper[4773]: I0120 18:30:13.522664 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:13 crc kubenswrapper[4773]: I0120 18:30:13.522695 4773 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:13 crc kubenswrapper[4773]: I0120 18:30:13.932165 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 18:30:13 crc kubenswrapper[4773]: I0120 18:30:13.934066 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:13 crc kubenswrapper[4773]: I0120 18:30:13.934131 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:13 crc kubenswrapper[4773]: I0120 18:30:13.934156 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:13 crc kubenswrapper[4773]: I0120 18:30:13.934196 4773 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 20 18:30:14 crc kubenswrapper[4773]: I0120 18:30:14.358991 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-18 01:11:21.982905387 +0000 UTC Jan 20 18:30:14 crc kubenswrapper[4773]: I0120 18:30:14.523769 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 18:30:14 crc kubenswrapper[4773]: I0120 18:30:14.525193 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Jan 20 18:30:14 crc kubenswrapper[4773]: I0120 18:30:14.525359 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:14 crc kubenswrapper[4773]: I0120 18:30:14.525407 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:14 crc kubenswrapper[4773]: I0120 18:30:14.525424 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:15 crc 
kubenswrapper[4773]: I0120 18:30:15.359668 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 02:40:13.026350435 +0000 UTC Jan 20 18:30:15 crc kubenswrapper[4773]: I0120 18:30:15.527111 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 18:30:15 crc kubenswrapper[4773]: I0120 18:30:15.528462 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:15 crc kubenswrapper[4773]: I0120 18:30:15.528489 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:15 crc kubenswrapper[4773]: I0120 18:30:15.528502 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:16 crc kubenswrapper[4773]: I0120 18:30:16.359850 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-27 14:45:48.363307144 +0000 UTC Jan 20 18:30:16 crc kubenswrapper[4773]: I0120 18:30:16.755207 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 20 18:30:16 crc kubenswrapper[4773]: I0120 18:30:16.755563 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 18:30:16 crc kubenswrapper[4773]: I0120 18:30:16.757645 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:16 crc kubenswrapper[4773]: I0120 18:30:16.757708 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:16 crc kubenswrapper[4773]: I0120 18:30:16.757728 4773 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Jan 20 18:30:17 crc kubenswrapper[4773]: I0120 18:30:17.279568 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 20 18:30:17 crc kubenswrapper[4773]: I0120 18:30:17.360959 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-02 08:34:57.29075926 +0000 UTC Jan 20 18:30:17 crc kubenswrapper[4773]: I0120 18:30:17.432287 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 20 18:30:17 crc kubenswrapper[4773]: I0120 18:30:17.440922 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 20 18:30:17 crc kubenswrapper[4773]: I0120 18:30:17.533461 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 18:30:17 crc kubenswrapper[4773]: I0120 18:30:17.534754 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:17 crc kubenswrapper[4773]: I0120 18:30:17.534811 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:17 crc kubenswrapper[4773]: I0120 18:30:17.534830 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:17 crc kubenswrapper[4773]: E0120 18:30:17.637436 4773 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Jan 20 18:30:17 crc kubenswrapper[4773]: I0120 18:30:17.838738 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 20 18:30:18 crc 
kubenswrapper[4773]: I0120 18:30:18.361274 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-14 23:02:04.199073323 +0000 UTC Jan 20 18:30:18 crc kubenswrapper[4773]: I0120 18:30:18.536209 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 18:30:18 crc kubenswrapper[4773]: I0120 18:30:18.538347 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:18 crc kubenswrapper[4773]: I0120 18:30:18.538393 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:18 crc kubenswrapper[4773]: I0120 18:30:18.538415 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:18 crc kubenswrapper[4773]: I0120 18:30:18.545424 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 20 18:30:19 crc kubenswrapper[4773]: I0120 18:30:19.362358 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-09 07:37:55.966396415 +0000 UTC Jan 20 18:30:19 crc kubenswrapper[4773]: I0120 18:30:19.539097 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 18:30:19 crc kubenswrapper[4773]: I0120 18:30:19.540018 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:19 crc kubenswrapper[4773]: I0120 18:30:19.540080 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:19 crc kubenswrapper[4773]: I0120 18:30:19.540099 4773 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Jan 20 18:30:19 crc kubenswrapper[4773]: I0120 18:30:19.798435 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Jan 20 18:30:19 crc kubenswrapper[4773]: I0120 18:30:19.798625 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 18:30:19 crc kubenswrapper[4773]: I0120 18:30:19.799784 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:19 crc kubenswrapper[4773]: I0120 18:30:19.799819 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:19 crc kubenswrapper[4773]: I0120 18:30:19.799829 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:20 crc kubenswrapper[4773]: I0120 18:30:20.363112 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-01 23:58:19.076771697 +0000 UTC Jan 20 18:30:20 crc kubenswrapper[4773]: I0120 18:30:20.541876 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 18:30:20 crc kubenswrapper[4773]: I0120 18:30:20.543114 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:20 crc kubenswrapper[4773]: I0120 18:30:20.543143 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:20 crc kubenswrapper[4773]: I0120 18:30:20.543152 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:20 crc kubenswrapper[4773]: I0120 18:30:20.838807 4773 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller 
namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 20 18:30:20 crc kubenswrapper[4773]: I0120 18:30:20.838880 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 20 18:30:20 crc kubenswrapper[4773]: W0120 18:30:20.876749 4773 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout Jan 20 18:30:20 crc kubenswrapper[4773]: I0120 18:30:20.876833 4773 trace.go:236] Trace[836220209]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (20-Jan-2026 18:30:10.875) (total time: 10001ms): Jan 20 18:30:20 crc kubenswrapper[4773]: Trace[836220209]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (18:30:20.876) Jan 20 18:30:20 crc kubenswrapper[4773]: Trace[836220209]: [10.001571701s] [10.001571701s] END Jan 20 18:30:20 crc kubenswrapper[4773]: E0120 18:30:20.876853 4773 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Jan 20 18:30:21 
crc kubenswrapper[4773]: W0120 18:30:21.002332 4773 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout Jan 20 18:30:21 crc kubenswrapper[4773]: I0120 18:30:21.002450 4773 trace.go:236] Trace[449536495]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (20-Jan-2026 18:30:11.001) (total time: 10001ms): Jan 20 18:30:21 crc kubenswrapper[4773]: Trace[449536495]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (18:30:21.002) Jan 20 18:30:21 crc kubenswrapper[4773]: Trace[449536495]: [10.001245205s] [10.001245205s] END Jan 20 18:30:21 crc kubenswrapper[4773]: E0120 18:30:21.002476 4773 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Jan 20 18:30:21 crc kubenswrapper[4773]: I0120 18:30:21.261731 4773 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Jan 20 18:30:21 crc kubenswrapper[4773]: I0120 18:30:21.261802 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Jan 20 
18:30:21 crc kubenswrapper[4773]: I0120 18:30:21.265628 4773 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Jan 20 18:30:21 crc kubenswrapper[4773]: I0120 18:30:21.265677 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Jan 20 18:30:21 crc kubenswrapper[4773]: I0120 18:30:21.364002 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-06 21:02:43.023646126 +0000 UTC Jan 20 18:30:22 crc kubenswrapper[4773]: I0120 18:30:22.279661 4773 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Jan 20 18:30:22 crc kubenswrapper[4773]: [+]log ok Jan 20 18:30:22 crc kubenswrapper[4773]: [+]etcd ok Jan 20 18:30:22 crc kubenswrapper[4773]: [+]poststarthook/openshift.io-startkubeinformers ok Jan 20 18:30:22 crc kubenswrapper[4773]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Jan 20 18:30:22 crc kubenswrapper[4773]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Jan 20 18:30:22 crc kubenswrapper[4773]: [+]poststarthook/start-apiserver-admission-initializer ok Jan 20 18:30:22 crc kubenswrapper[4773]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Jan 20 18:30:22 crc kubenswrapper[4773]: [+]poststarthook/openshift.io-api-request-count-filter ok 
Jan 20 18:30:22 crc kubenswrapper[4773]: [+]poststarthook/generic-apiserver-start-informers ok Jan 20 18:30:22 crc kubenswrapper[4773]: [+]poststarthook/priority-and-fairness-config-consumer ok Jan 20 18:30:22 crc kubenswrapper[4773]: [+]poststarthook/priority-and-fairness-filter ok Jan 20 18:30:22 crc kubenswrapper[4773]: [+]poststarthook/storage-object-count-tracker-hook ok Jan 20 18:30:22 crc kubenswrapper[4773]: [+]poststarthook/start-apiextensions-informers ok Jan 20 18:30:22 crc kubenswrapper[4773]: [+]poststarthook/start-apiextensions-controllers ok Jan 20 18:30:22 crc kubenswrapper[4773]: [+]poststarthook/crd-informer-synced ok Jan 20 18:30:22 crc kubenswrapper[4773]: [+]poststarthook/start-system-namespaces-controller ok Jan 20 18:30:22 crc kubenswrapper[4773]: [+]poststarthook/start-cluster-authentication-info-controller ok Jan 20 18:30:22 crc kubenswrapper[4773]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Jan 20 18:30:22 crc kubenswrapper[4773]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Jan 20 18:30:22 crc kubenswrapper[4773]: [+]poststarthook/start-legacy-token-tracking-controller ok Jan 20 18:30:22 crc kubenswrapper[4773]: [+]poststarthook/start-service-ip-repair-controllers ok Jan 20 18:30:22 crc kubenswrapper[4773]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Jan 20 18:30:22 crc kubenswrapper[4773]: [-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld Jan 20 18:30:22 crc kubenswrapper[4773]: [+]poststarthook/priority-and-fairness-config-producer ok Jan 20 18:30:22 crc kubenswrapper[4773]: [+]poststarthook/bootstrap-controller ok Jan 20 18:30:22 crc kubenswrapper[4773]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Jan 20 18:30:22 crc kubenswrapper[4773]: [+]poststarthook/start-kube-aggregator-informers ok Jan 20 18:30:22 crc kubenswrapper[4773]: [+]poststarthook/apiservice-status-local-available-controller ok Jan 20 18:30:22 crc 
kubenswrapper[4773]: [+]poststarthook/apiservice-status-remote-available-controller ok Jan 20 18:30:22 crc kubenswrapper[4773]: [+]poststarthook/apiservice-registration-controller ok Jan 20 18:30:22 crc kubenswrapper[4773]: [+]poststarthook/apiservice-wait-for-first-sync ok Jan 20 18:30:22 crc kubenswrapper[4773]: [+]poststarthook/apiservice-discovery-controller ok Jan 20 18:30:22 crc kubenswrapper[4773]: [+]poststarthook/kube-apiserver-autoregistration ok Jan 20 18:30:22 crc kubenswrapper[4773]: [+]autoregister-completion ok Jan 20 18:30:22 crc kubenswrapper[4773]: [+]poststarthook/apiservice-openapi-controller ok Jan 20 18:30:22 crc kubenswrapper[4773]: [+]poststarthook/apiservice-openapiv3-controller ok Jan 20 18:30:22 crc kubenswrapper[4773]: livez check failed Jan 20 18:30:22 crc kubenswrapper[4773]: I0120 18:30:22.279797 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 20 18:30:22 crc kubenswrapper[4773]: I0120 18:30:22.365085 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-13 12:30:14.470622816 +0000 UTC Jan 20 18:30:23 crc kubenswrapper[4773]: I0120 18:30:23.365272 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-09 00:00:26.103562578 +0000 UTC Jan 20 18:30:24 crc kubenswrapper[4773]: I0120 18:30:24.366420 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-13 06:56:38.061066167 +0000 UTC Jan 20 18:30:24 crc kubenswrapper[4773]: I0120 18:30:24.837436 4773 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Jan 20 
18:30:25 crc kubenswrapper[4773]: I0120 18:30:25.366677 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-12 08:53:20.863884946 +0000 UTC Jan 20 18:30:26 crc kubenswrapper[4773]: E0120 18:30:26.252203 4773 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.254036 4773 trace.go:236] Trace[38491623]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (20-Jan-2026 18:30:14.421) (total time: 11832ms): Jan 20 18:30:26 crc kubenswrapper[4773]: Trace[38491623]: ---"Objects listed" error: 11832ms (18:30:26.253) Jan 20 18:30:26 crc kubenswrapper[4773]: Trace[38491623]: [11.832718825s] [11.832718825s] END Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.254063 4773 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.254472 4773 trace.go:236] Trace[89087432]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (20-Jan-2026 18:30:11.458) (total time: 14795ms): Jan 20 18:30:26 crc kubenswrapper[4773]: Trace[89087432]: ---"Objects listed" error: 14795ms (18:30:26.254) Jan 20 18:30:26 crc kubenswrapper[4773]: Trace[89087432]: [14.795846736s] [14.795846736s] END Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.254495 4773 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.255560 4773 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Jan 20 18:30:26 crc kubenswrapper[4773]: E0120 18:30:26.256032 4773 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes 
\"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.258887 4773 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.274959 4773 csr.go:261] certificate signing request csr-l9ws6 is approved, waiting to be issued Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.283177 4773 csr.go:257] certificate signing request csr-l9ws6 is issued Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.356074 4773 apiserver.go:52] "Watching apiserver" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.359498 4773 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.359746 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf"] Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.360296 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:30:26 crc kubenswrapper[4773]: E0120 18:30:26.360357 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.360408 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.360723 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.360801 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.361011 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.361131 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:30:26 crc kubenswrapper[4773]: E0120 18:30:26.361206 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 18:30:26 crc kubenswrapper[4773]: E0120 18:30:26.361264 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.364611 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.364824 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.364948 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.366577 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.366727 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.366736 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.366815 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.366813 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.366845 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-01 12:37:42.880814694 +0000 UTC Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.378112 4773 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-network-node-identity"/"env-overrides" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.416342 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.434246 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.447498 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.456040 4773 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.456341 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.456403 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.456428 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.456451 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.456474 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.456496 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.456514 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.456534 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Jan 20 18:30:26 
crc kubenswrapper[4773]: I0120 18:30:26.456553 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.456574 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.456593 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.456614 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.456633 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.456652 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.456703 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.456706 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.456722 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.456744 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.456764 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod 
\"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.456758 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.456790 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.456813 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.456838 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.456830 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.456864 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.456888 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.456892 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.456963 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.456987 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.457010 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.457039 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.457056 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.457064 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.457092 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.457101 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.457127 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.457165 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.457188 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.457210 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.457233 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.457253 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.457275 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" 
(UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.457295 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.457312 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.457329 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.457349 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.457368 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: 
\"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.457387 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.457387 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.457410 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.457431 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.457449 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.457466 
4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.457483 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.457501 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.457517 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.457534 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.457553 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: 
\"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.457570 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.457588 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.457611 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.457633 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.457653 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.457670 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.457691 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.457712 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.457730 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.457803 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.457822 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: 
\"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.457843 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.457864 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.457881 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.457900 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.457920 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.457964 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.457997 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458027 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458049 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458071 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458095 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 
18:30:26.458113 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458140 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458161 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458180 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458207 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458226 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458245 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458268 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458288 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458311 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458339 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod 
\"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458363 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458381 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458399 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458417 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458437 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458455 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458482 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458502 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458520 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458538 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458556 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458574 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458591 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458608 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458629 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458650 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458668 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458687 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458710 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458727 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458745 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458762 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: 
\"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458783 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458803 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458820 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458839 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458858 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458875 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" 
(UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458892 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458911 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458954 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458976 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458992 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 20 
18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.459012 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.459030 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.459052 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.459075 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.459096 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.459114 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.459134 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.459153 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.459169 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.459186 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.459209 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.457583 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.459231 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.457609 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.457772 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.457817 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.457887 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458052 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458072 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458107 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458124 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458132 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458159 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458256 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458310 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458443 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458458 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458513 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458619 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458632 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458666 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458716 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458824 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.458886 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.459006 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.459312 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.459408 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.459466 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.459519 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.459688 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.459689 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.459251 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.459761 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.459769 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.459901 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.459913 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.459950 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.459981 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.460013 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.460041 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.460065 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.460089 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.460112 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.460138 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.460162 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.460185 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.460211 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Jan 20 18:30:26 crc kubenswrapper[4773]: 
I0120 18:30:26.460234 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.460254 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.460270 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.460287 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.460306 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.460323 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod 
\"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.460339 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.460358 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.460376 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.460396 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.460415 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.460435 4773 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.460454 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.460471 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.460487 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.460504 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.460521 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod 
\"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.460537 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.460554 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.460572 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.460590 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.460607 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.460623 4773 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.460693 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.460714 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.460732 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.460756 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.461276 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: 
\"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.461300 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.461317 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.461334 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.461351 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.461369 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.461391 
4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.461409 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.461425 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.461440 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.461456 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.461472 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod 
\"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.461691 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.461713 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.461730 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.461751 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.461770 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.461787 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" 
(UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.461804 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.461820 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.461838 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.461862 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.461881 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.461904 
4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.461920 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.461952 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.461995 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.462018 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.462039 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.462060 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.462087 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.460003 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.460017 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.460170 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.460184 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.460371 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.460447 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.460498 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: E0120 18:30:26.462160 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 18:30:26.962131115 +0000 UTC m=+19.883944249 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.466648 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.466880 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.466905 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.466981 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.466956 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:30:26 crc kubenswrapper[4773]: E0120 18:30:26.467063 4773 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.467073 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.467124 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.467126 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:30:26 crc kubenswrapper[4773]: E0120 18:30:26.467159 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-20 18:30:26.967137639 +0000 UTC m=+19.888950663 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.467238 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.467325 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.467391 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.467435 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.467471 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.467510 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") 
pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.467625 4773 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.467646 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.467671 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.467685 4773 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.467698 4773 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.467711 4773 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.467725 4773 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 
20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.467738 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.467752 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.467766 4773 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.467781 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.467810 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.467825 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.467840 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.467855 4773 reconciler_common.go:293] 
"Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.467868 4773 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.467882 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.467895 4773 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.467911 4773 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.467924 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.467966 4773 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.467979 4773 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" 
DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.467991 4773 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.468004 4773 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.468017 4773 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.468029 4773 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.468043 4773 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.468056 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.468069 4773 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.468082 4773 
reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.468095 4773 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.468118 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.468133 4773 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.468145 4773 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.468158 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.468171 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.468184 4773 reconciler_common.go:293] "Volume detached for volume \"cert\" 
(UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.468198 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.468213 4773 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.468228 4773 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.468241 4773 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.468256 4773 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.468269 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.468279 4773 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.468333 4773 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.468344 4773 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.468354 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.468366 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.468376 4773 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.468385 4773 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.469189 4773 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.457849 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.467394 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.467434 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.462658 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.462654 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.462685 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.463102 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.463139 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.463467 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.463485 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.463493 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.463590 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.463891 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.464022 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.474727 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.464237 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.464299 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.464455 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.464498 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.464805 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.465227 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.465292 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.465807 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.465958 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.466295 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.466438 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.466492 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.476331 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.467718 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.467823 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.468101 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.468116 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.468185 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.462430 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.476432 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.468307 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.468390 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.468418 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.468701 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.468793 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.468758 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.468834 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.469384 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.471765 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.472029 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.472598 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.472866 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.473290 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.473518 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.474737 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.475013 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: E0120 18:30:26.475289 4773 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.475438 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.475550 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.476107 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.476195 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.477113 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.477188 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.477772 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.478133 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.478686 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.478686 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.477456 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.478893 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.479163 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.479307 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.480016 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 20 18:30:26 crc kubenswrapper[4773]: E0120 18:30:26.482622 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-20 18:30:26.982588485 +0000 UTC m=+19.904401499 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.486011 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). 
InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.486048 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.486154 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod 
was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:26 crc kubenswrapper[4773]: E0120 18:30:26.486475 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 20 18:30:26 crc kubenswrapper[4773]: E0120 18:30:26.486578 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 20 18:30:26 crc kubenswrapper[4773]: E0120 18:30:26.486654 4773 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 20 18:30:26 crc kubenswrapper[4773]: E0120 18:30:26.486798 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2026-01-20 18:30:26.986774252 +0000 UTC m=+19.908587496 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.486906 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.488347 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.488354 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.488809 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.489068 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.489201 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.490447 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.489544 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.489545 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.489788 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.490019 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.490187 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.490211 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.490246 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.490316 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: E0120 18:30:26.491129 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 20 18:30:26 crc kubenswrapper[4773]: E0120 18:30:26.491173 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 20 18:30:26 crc kubenswrapper[4773]: E0120 18:30:26.491186 4773 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 20 18:30:26 crc kubenswrapper[4773]: E0120 18:30:26.491259 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-20 18:30:26.991240366 +0000 UTC m=+19.913053390 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.491760 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.492590 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.492945 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.493061 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.493129 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.493181 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.493229 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.493096 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.494220 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.494395 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.496638 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.496941 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.497191 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.497594 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.497629 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.497904 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.498231 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.498293 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.498473 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.498540 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.499030 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.499295 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.499565 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.499887 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.501223 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.503889 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.504543 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.504600 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.505844 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.506578 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.507093 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.507277 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.507414 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.507819 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.507761 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.507900 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.508114 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.508186 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.508240 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.508302 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.508549 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.508665 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.508700 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.510870 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.511001 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.511146 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.511422 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.511443 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.511540 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.512209 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.512285 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.512758 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.513061 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.513079 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.513317 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.513324 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.513690 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.514010 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.515463 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.515752 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.518590 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.521650 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.522650 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.528023 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.528379 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.528423 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.529402 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.533349 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.536705 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.538714 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.554146 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.555754 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.558434 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.567767 4773 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="fab6fab567cc35a379572c4ea5fb0f9472f3634fcede213202390e12c5cf6f2f" exitCode=255 Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.567823 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"fab6fab567cc35a379572c4ea5fb0f9472f3634fcede213202390e12c5cf6f2f"} Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.569295 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.569228 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.569377 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.569551 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.569678 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.569695 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.569713 4773 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.569725 4773 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.569736 4773 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.569752 4773 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.569764 4773 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.569775 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.569786 4773 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.569799 4773 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.569811 4773 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.569822 4773 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.569837 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node 
\"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.569864 4773 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.569876 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.569888 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.569898 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.569914 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.569926 4773 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.569955 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 20 
18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.569972 4773 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.569983 4773 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.569994 4773 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570008 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570038 4773 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570052 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570063 4773 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570074 4773 reconciler_common.go:293] "Volume detached 
for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570089 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570109 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570121 4773 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570136 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570153 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570165 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570178 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570205 4773 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570216 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570229 4773 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570242 4773 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570258 4773 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570270 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570281 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: 
\"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570293 4773 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570307 4773 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570318 4773 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570329 4773 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570346 4773 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570370 4773 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570382 4773 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" 
Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570394 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570409 4773 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570420 4773 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570431 4773 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570442 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570456 4773 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570467 4773 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570479 4773 reconciler_common.go:293] "Volume detached for volume 
\"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570494 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570506 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570517 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570529 4773 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570547 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570558 4773 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570570 4773 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570583 4773 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570598 4773 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570610 4773 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570621 4773 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570633 4773 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570648 4773 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570661 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" 
Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570674 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570690 4773 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570702 4773 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570715 4773 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570729 4773 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570746 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570760 4773 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc 
kubenswrapper[4773]: I0120 18:30:26.570781 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570794 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570811 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570823 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570837 4773 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570853 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570866 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570878 4773 reconciler_common.go:293] 
"Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570889 4773 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570906 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570918 4773 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570960 4773 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570973 4773 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.570989 4773 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.571001 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: 
\"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.571017 4773 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.571030 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.571045 4773 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.571057 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.571068 4773 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.571083 4773 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.571095 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.571107 
4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.571118 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.571134 4773 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.571145 4773 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.571158 4773 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.571169 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.571184 4773 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.571196 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: 
\"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.571209 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.571237 4773 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.571250 4773 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.571262 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.571274 4773 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.571289 4773 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.571301 4773 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node 
\"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.571312 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.571326 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.571340 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.571352 4773 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.571369 4773 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.571381 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.571396 4773 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.571409 4773 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.571421 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.571437 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.571452 4773 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.571464 4773 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.571476 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.571491 4773 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.571503 4773 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.571515 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.571527 4773 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.571542 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.571553 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.571564 4773 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.571579 4773 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.571591 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Jan 
20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.571604 4773 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.571615 4773 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.571631 4773 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.571646 4773 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.571659 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.571672 4773 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.571688 4773 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.594148 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.610029 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.621158 4773 scope.go:117] "RemoveContainer" containerID="fab6fab567cc35a379572c4ea5fb0f9472f3634fcede213202390e12c5cf6f2f" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.628455 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.631327 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.631570 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-gczfj"] Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.632027 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-gczfj" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.636650 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.636889 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.637050 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.656533 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.669212 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with 
unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.674267 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4b9d\" (UniqueName: \"kubernetes.io/projected/357ca347-8fa9-4f0b-9f49-a540f14e0198-kube-api-access-h4b9d\") pod \"node-resolver-gczfj\" (UID: \"357ca347-8fa9-4f0b-9f49-a540f14e0198\") " pod="openshift-dns/node-resolver-gczfj" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.674319 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/357ca347-8fa9-4f0b-9f49-a540f14e0198-hosts-file\") pod \"node-resolver-gczfj\" (UID: \"357ca347-8fa9-4f0b-9f49-a540f14e0198\") " pod="openshift-dns/node-resolver-gczfj" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.680053 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.684464 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.685265 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.689829 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.694267 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:26 crc kubenswrapper[4773]: W0120 18:30:26.707081 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-d61ad24c2c5cdc4877b12ad3276b745e397f53ce14b72dcdf4ce69c08c8e843e WatchSource:0}: Error finding container d61ad24c2c5cdc4877b12ad3276b745e397f53ce14b72dcdf4ce69c08c8e843e: Status 404 returned error can't find the container with id d61ad24c2c5cdc4877b12ad3276b745e397f53ce14b72dcdf4ce69c08c8e843e Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.709130 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:26 crc kubenswrapper[4773]: W0120 18:30:26.712727 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-1ec8e5f42c66c2301f37808211b82a13819dd6b0136a32c03e83458988c5e720 WatchSource:0}: Error finding container 1ec8e5f42c66c2301f37808211b82a13819dd6b0136a32c03e83458988c5e720: Status 404 returned error can't find the container with id 1ec8e5f42c66c2301f37808211b82a13819dd6b0136a32c03e83458988c5e720 Jan 20 18:30:26 crc kubenswrapper[4773]: W0120 18:30:26.717485 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-c6c2468ce835a67e0b9197a4f44130d0180322e851f7889cd0fd4329018a9be5 WatchSource:0}: Error finding container c6c2468ce835a67e0b9197a4f44130d0180322e851f7889cd0fd4329018a9be5: Status 404 returned error can't find the 
container with id c6c2468ce835a67e0b9197a4f44130d0180322e851f7889cd0fd4329018a9be5 Jan 20 18:30:26 crc kubenswrapper[4773]: E0120 18:30:26.720332 4773 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 20 18:30:26 crc kubenswrapper[4773]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Jan 20 18:30:26 crc kubenswrapper[4773]: set -o allexport Jan 20 18:30:26 crc kubenswrapper[4773]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Jan 20 18:30:26 crc kubenswrapper[4773]: source /etc/kubernetes/apiserver-url.env Jan 20 18:30:26 crc kubenswrapper[4773]: else Jan 20 18:30:26 crc kubenswrapper[4773]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Jan 20 18:30:26 crc kubenswrapper[4773]: exit 1 Jan 20 18:30:26 crc kubenswrapper[4773]: fi Jan 20 18:30:26 crc kubenswrapper[4773]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Jan 20 18:30:26 crc kubenswrapper[4773]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Jan 20 18:30:26 crc kubenswrapper[4773]: > logger="UnhandledError" Jan 20 18:30:26 crc kubenswrapper[4773]: E0120 18:30:26.721652 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.726840 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:26 crc kubenswrapper[4773]: E0120 18:30:26.732609 4773 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 20 18:30:26 crc kubenswrapper[4773]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Jan 20 18:30:26 crc kubenswrapper[4773]: if [[ -f "/env/_master" ]]; then Jan 20 18:30:26 crc kubenswrapper[4773]: set -o allexport Jan 20 18:30:26 crc kubenswrapper[4773]: source "/env/_master" Jan 20 18:30:26 crc kubenswrapper[4773]: set +o allexport Jan 20 18:30:26 crc kubenswrapper[4773]: fi Jan 20 18:30:26 crc kubenswrapper[4773]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. 
Jan 20 18:30:26 crc kubenswrapper[4773]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Jan 20 18:30:26 crc kubenswrapper[4773]: ho_enable="--enable-hybrid-overlay" Jan 20 18:30:26 crc kubenswrapper[4773]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Jan 20 18:30:26 crc kubenswrapper[4773]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Jan 20 18:30:26 crc kubenswrapper[4773]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Jan 20 18:30:26 crc kubenswrapper[4773]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Jan 20 18:30:26 crc kubenswrapper[4773]: --webhook-cert-dir="/etc/webhook-cert" \ Jan 20 18:30:26 crc kubenswrapper[4773]: --webhook-host=127.0.0.1 \ Jan 20 18:30:26 crc kubenswrapper[4773]: --webhook-port=9743 \ Jan 20 18:30:26 crc kubenswrapper[4773]: ${ho_enable} \ Jan 20 18:30:26 crc kubenswrapper[4773]: --enable-interconnect \ Jan 20 18:30:26 crc kubenswrapper[4773]: --disable-approver \ Jan 20 18:30:26 crc kubenswrapper[4773]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Jan 20 18:30:26 crc kubenswrapper[4773]: --wait-for-kubernetes-api=200s \ Jan 20 18:30:26 crc kubenswrapper[4773]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Jan 20 18:30:26 crc kubenswrapper[4773]: --loglevel="${LOGLEVEL}" Jan 20 18:30:26 crc kubenswrapper[4773]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: 
{{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Jan 20 18:30:26 crc kubenswrapper[4773]: > logger="UnhandledError" Jan 20 18:30:26 crc kubenswrapper[4773]: E0120 18:30:26.732984 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services 
have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Jan 20 18:30:26 crc kubenswrapper[4773]: E0120 18:30:26.734043 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Jan 20 18:30:26 crc kubenswrapper[4773]: E0120 18:30:26.739333 4773 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 20 18:30:26 crc kubenswrapper[4773]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Jan 20 18:30:26 crc kubenswrapper[4773]: if [[ -f "/env/_master" ]]; then Jan 20 18:30:26 crc kubenswrapper[4773]: set -o allexport Jan 20 18:30:26 crc kubenswrapper[4773]: source "/env/_master" Jan 20 18:30:26 crc kubenswrapper[4773]: set +o allexport Jan 20 18:30:26 crc kubenswrapper[4773]: fi Jan 20 18:30:26 crc kubenswrapper[4773]: Jan 20 18:30:26 crc kubenswrapper[4773]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Jan 20 18:30:26 crc kubenswrapper[4773]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Jan 20 18:30:26 crc kubenswrapper[4773]: --disable-webhook \ Jan 20 18:30:26 crc kubenswrapper[4773]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Jan 20 18:30:26 crc kubenswrapper[4773]: --loglevel="${LOGLEVEL}" Jan 20 18:30:26 crc kubenswrapper[4773]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Jan 20 18:30:26 crc kubenswrapper[4773]: > logger="UnhandledError" Jan 20 18:30:26 crc kubenswrapper[4773]: E0120 18:30:26.741200 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.747385 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9b329af-a6e2-4ba8-b70d-f1ad0cd67671\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver 
kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13ef504628d3252d01c26801cf91a4fabdfbd5d8a78e60e9666beafa709816a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af183a195b31fc8b191dc75db28863447424fb2763e1f51a5f37ce78c4b046e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cr
i-o://05f5db3e67f63a4ed226186eec7eebf95687a59c3d742a7d29b2be3f99543a0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab6fab567cc35a379572c4ea5fb0f9472f3634fcede213202390e12c5cf6f2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab6fab567cc35a379572c4ea5fb0f9472f3634fcede213202390e12c5cf6f2f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0120 18:30:20.840178 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0120 18:30:20.842359 1 dynamic_serving_content.go:116] 
\\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1141381009/tls.crt::/tmp/serving-cert-1141381009/tls.key\\\\\\\"\\\\nI0120 18:30:26.279650 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0120 18:30:26.285211 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0120 18:30:26.285271 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0120 18:30:26.285327 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0120 18:30:26.285335 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0120 18:30:26.291960 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0120 18:30:26.291983 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291990 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291998 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0120 18:30:26.292002 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0120 18:30:26.292006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0120 18:30:26.292010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0120 18:30:26.292142 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0120 18:30:26.296450 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06ac78bba5a5a697d2246518bb04b7503b37c2fafd2369e5826dd02b467715d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\
\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.768995 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.776420 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4b9d\" (UniqueName: \"kubernetes.io/projected/357ca347-8fa9-4f0b-9f49-a540f14e0198-kube-api-access-h4b9d\") pod \"node-resolver-gczfj\" (UID: \"357ca347-8fa9-4f0b-9f49-a540f14e0198\") " pod="openshift-dns/node-resolver-gczfj" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.776467 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/357ca347-8fa9-4f0b-9f49-a540f14e0198-hosts-file\") pod \"node-resolver-gczfj\" (UID: \"357ca347-8fa9-4f0b-9f49-a540f14e0198\") " pod="openshift-dns/node-resolver-gczfj" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.776545 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: 
\"kubernetes.io/host-path/357ca347-8fa9-4f0b-9f49-a540f14e0198-hosts-file\") pod \"node-resolver-gczfj\" (UID: \"357ca347-8fa9-4f0b-9f49-a540f14e0198\") " pod="openshift-dns/node-resolver-gczfj" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.787701 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.801810 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.813123 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4b9d\" (UniqueName: \"kubernetes.io/projected/357ca347-8fa9-4f0b-9f49-a540f14e0198-kube-api-access-h4b9d\") pod \"node-resolver-gczfj\" (UID: \"357ca347-8fa9-4f0b-9f49-a540f14e0198\") " pod="openshift-dns/node-resolver-gczfj" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.815960 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gczfj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"357ca347-8fa9-4f0b-9f49-a540f14e0198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4b9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gczfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.950118 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-gczfj" Jan 20 18:30:26 crc kubenswrapper[4773]: W0120 18:30:26.961745 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod357ca347_8fa9_4f0b_9f49_a540f14e0198.slice/crio-7eacb55e9190716b5ef650c1ade86dd203f9284ce9646e6bce36d8e33bf04e86 WatchSource:0}: Error finding container 7eacb55e9190716b5ef650c1ade86dd203f9284ce9646e6bce36d8e33bf04e86: Status 404 returned error can't find the container with id 7eacb55e9190716b5ef650c1ade86dd203f9284ce9646e6bce36d8e33bf04e86 Jan 20 18:30:26 crc kubenswrapper[4773]: E0120 18:30:26.964063 4773 kuberuntime_manager.go:1274] "Unhandled Error" err=< Jan 20 18:30:26 crc kubenswrapper[4773]: container &Container{Name:dns-node-resolver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/bin/bash -c #!/bin/bash Jan 20 18:30:26 crc kubenswrapper[4773]: set -uo pipefail Jan 20 18:30:26 crc kubenswrapper[4773]: Jan 20 18:30:26 crc kubenswrapper[4773]: trap 'jobs -p | xargs kill || true; wait; exit 0' TERM Jan 20 18:30:26 crc kubenswrapper[4773]: Jan 20 18:30:26 crc kubenswrapper[4773]: OPENSHIFT_MARKER="openshift-generated-node-resolver" Jan 20 18:30:26 crc kubenswrapper[4773]: HOSTS_FILE="/etc/hosts" Jan 20 18:30:26 crc kubenswrapper[4773]: TEMP_FILE="/etc/hosts.tmp" Jan 20 18:30:26 crc kubenswrapper[4773]: Jan 20 18:30:26 crc kubenswrapper[4773]: IFS=', ' read -r -a services <<< "${SERVICES}" Jan 20 18:30:26 crc kubenswrapper[4773]: Jan 20 18:30:26 crc kubenswrapper[4773]: # Make a temporary file with the old hosts file's attributes. Jan 20 18:30:26 crc kubenswrapper[4773]: if ! cp -f --attributes-only "${HOSTS_FILE}" "${TEMP_FILE}"; then Jan 20 18:30:26 crc kubenswrapper[4773]: echo "Failed to preserve hosts file. Exiting." 
Jan 20 18:30:26 crc kubenswrapper[4773]: exit 1 Jan 20 18:30:26 crc kubenswrapper[4773]: fi Jan 20 18:30:26 crc kubenswrapper[4773]: Jan 20 18:30:26 crc kubenswrapper[4773]: while true; do Jan 20 18:30:26 crc kubenswrapper[4773]: declare -A svc_ips Jan 20 18:30:26 crc kubenswrapper[4773]: for svc in "${services[@]}"; do Jan 20 18:30:26 crc kubenswrapper[4773]: # Fetch service IP from cluster dns if present. We make several tries Jan 20 18:30:26 crc kubenswrapper[4773]: # to do it: IPv4, IPv6, IPv4 over TCP and IPv6 over TCP. The two last ones Jan 20 18:30:26 crc kubenswrapper[4773]: # are for deployments with Kuryr on older OpenStack (OSP13) - those do not Jan 20 18:30:26 crc kubenswrapper[4773]: # support UDP loadbalancers and require reaching DNS through TCP. Jan 20 18:30:26 crc kubenswrapper[4773]: cmds=('dig -t A @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Jan 20 18:30:26 crc kubenswrapper[4773]: 'dig -t AAAA @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Jan 20 18:30:26 crc kubenswrapper[4773]: 'dig -t A +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Jan 20 18:30:26 crc kubenswrapper[4773]: 'dig -t AAAA +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"') Jan 20 18:30:26 crc kubenswrapper[4773]: for i in ${!cmds[*]} Jan 20 18:30:26 crc kubenswrapper[4773]: do Jan 20 18:30:26 crc kubenswrapper[4773]: ips=($(eval "${cmds[i]}")) Jan 20 18:30:26 crc kubenswrapper[4773]: if [[ "$?" 
-eq 0 && "${#ips[@]}" -ne 0 ]]; then Jan 20 18:30:26 crc kubenswrapper[4773]: svc_ips["${svc}"]="${ips[@]}" Jan 20 18:30:26 crc kubenswrapper[4773]: break Jan 20 18:30:26 crc kubenswrapper[4773]: fi Jan 20 18:30:26 crc kubenswrapper[4773]: done Jan 20 18:30:26 crc kubenswrapper[4773]: done Jan 20 18:30:26 crc kubenswrapper[4773]: Jan 20 18:30:26 crc kubenswrapper[4773]: # Update /etc/hosts only if we get valid service IPs Jan 20 18:30:26 crc kubenswrapper[4773]: # We will not update /etc/hosts when there is coredns service outage or api unavailability Jan 20 18:30:26 crc kubenswrapper[4773]: # Stale entries could exist in /etc/hosts if the service is deleted Jan 20 18:30:26 crc kubenswrapper[4773]: if [[ -n "${svc_ips[*]-}" ]]; then Jan 20 18:30:26 crc kubenswrapper[4773]: # Build a new hosts file from /etc/hosts with our custom entries filtered out Jan 20 18:30:26 crc kubenswrapper[4773]: if ! sed --silent "/# ${OPENSHIFT_MARKER}/d; w ${TEMP_FILE}" "${HOSTS_FILE}"; then Jan 20 18:30:26 crc kubenswrapper[4773]: # Only continue rebuilding the hosts entries if its original content is preserved Jan 20 18:30:26 crc kubenswrapper[4773]: sleep 60 & wait Jan 20 18:30:26 crc kubenswrapper[4773]: continue Jan 20 18:30:26 crc kubenswrapper[4773]: fi Jan 20 18:30:26 crc kubenswrapper[4773]: Jan 20 18:30:26 crc kubenswrapper[4773]: # Append resolver entries for services Jan 20 18:30:26 crc kubenswrapper[4773]: rc=0 Jan 20 18:30:26 crc kubenswrapper[4773]: for svc in "${!svc_ips[@]}"; do Jan 20 18:30:26 crc kubenswrapper[4773]: for ip in ${svc_ips[${svc}]}; do Jan 20 18:30:26 crc kubenswrapper[4773]: echo "${ip} ${svc} ${svc}.${CLUSTER_DOMAIN} # ${OPENSHIFT_MARKER}" >> "${TEMP_FILE}" || rc=$? 
Jan 20 18:30:26 crc kubenswrapper[4773]: done Jan 20 18:30:26 crc kubenswrapper[4773]: done Jan 20 18:30:26 crc kubenswrapper[4773]: if [[ $rc -ne 0 ]]; then Jan 20 18:30:26 crc kubenswrapper[4773]: sleep 60 & wait Jan 20 18:30:26 crc kubenswrapper[4773]: continue Jan 20 18:30:26 crc kubenswrapper[4773]: fi Jan 20 18:30:26 crc kubenswrapper[4773]: Jan 20 18:30:26 crc kubenswrapper[4773]: Jan 20 18:30:26 crc kubenswrapper[4773]: # TODO: Update /etc/hosts atomically to avoid any inconsistent behavior Jan 20 18:30:26 crc kubenswrapper[4773]: # Replace /etc/hosts with our modified version if needed Jan 20 18:30:26 crc kubenswrapper[4773]: cmp "${TEMP_FILE}" "${HOSTS_FILE}" || cp -f "${TEMP_FILE}" "${HOSTS_FILE}" Jan 20 18:30:26 crc kubenswrapper[4773]: # TEMP_FILE is not removed to avoid file create/delete and attributes copy churn Jan 20 18:30:26 crc kubenswrapper[4773]: fi Jan 20 18:30:26 crc kubenswrapper[4773]: sleep 60 & wait Jan 20 18:30:26 crc kubenswrapper[4773]: unset svc_ips Jan 20 18:30:26 crc kubenswrapper[4773]: done Jan 20 18:30:26 crc kubenswrapper[4773]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:SERVICES,Value:image-registry.openshift-image-registry.svc,ValueFrom:nil,},EnvVar{Name:NAMESERVER,Value:10.217.4.10,ValueFrom:nil,},EnvVar{Name:CLUSTER_DOMAIN,Value:cluster.local,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{22020096 0} {} 21Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:hosts-file,ReadOnly:false,MountPath:/etc/hosts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-h4b9d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-resolver-gczfj_openshift-dns(357ca347-8fa9-4f0b-9f49-a540f14e0198): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Jan 20 18:30:26 crc kubenswrapper[4773]: > logger="UnhandledError" Jan 20 18:30:26 crc kubenswrapper[4773]: E0120 18:30:26.967033 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dns-node-resolver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-dns/node-resolver-gczfj" podUID="357ca347-8fa9-4f0b-9f49-a540f14e0198" Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.978184 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 18:30:26 crc kubenswrapper[4773]: I0120 18:30:26.978260 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:30:26 crc kubenswrapper[4773]: E0120 18:30:26.978306 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 18:30:27.978275981 +0000 UTC m=+20.900089015 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:30:26 crc kubenswrapper[4773]: E0120 18:30:26.978382 4773 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 20 18:30:26 crc kubenswrapper[4773]: E0120 18:30:26.978437 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-20 18:30:27.978424635 +0000 UTC m=+20.900237659 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.079558 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.079614 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.079636 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:30:27 crc kubenswrapper[4773]: E0120 18:30:27.079765 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 20 18:30:27 crc kubenswrapper[4773]: E0120 18:30:27.079780 4773 projected.go:288] Couldn't get configMap 
openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 20 18:30:27 crc kubenswrapper[4773]: E0120 18:30:27.079790 4773 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 20 18:30:27 crc kubenswrapper[4773]: E0120 18:30:27.079845 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-20 18:30:28.079817991 +0000 UTC m=+21.001631015 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 20 18:30:27 crc kubenswrapper[4773]: E0120 18:30:27.080437 4773 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 20 18:30:27 crc kubenswrapper[4773]: E0120 18:30:27.080529 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-20 18:30:28.080510228 +0000 UTC m=+21.002323252 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 20 18:30:27 crc kubenswrapper[4773]: E0120 18:30:27.080449 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 20 18:30:27 crc kubenswrapper[4773]: E0120 18:30:27.080671 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 20 18:30:27 crc kubenswrapper[4773]: E0120 18:30:27.080733 4773 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 20 18:30:27 crc kubenswrapper[4773]: E0120 18:30:27.080833 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-20 18:30:28.080816534 +0000 UTC m=+21.002629558 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.167332 4773 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.255583 4773 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Jan 20 18:30:27 crc kubenswrapper[4773]: W0120 18:30:27.256011 4773 reflector.go:484] object-"openshift-network-node-identity"/"kube-root-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-node-identity"/"kube-root-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Jan 20 18:30:27 crc kubenswrapper[4773]: W0120 18:30:27.256016 4773 reflector.go:484] object-"openshift-network-node-identity"/"env-overrides": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-node-identity"/"env-overrides": Unexpected watch close - watch lasted less than a second and no items received Jan 20 18:30:27 crc kubenswrapper[4773]: W0120 18:30:27.256067 4773 reflector.go:484] object-"openshift-network-operator"/"kube-root-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-operator"/"kube-root-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Jan 20 18:30:27 crc kubenswrapper[4773]: W0120 18:30:27.256134 4773 reflector.go:484] object-"openshift-network-node-identity"/"ovnkube-identity-cm": watch of *v1.ConfigMap ended with: very short watch: 
object-"openshift-network-node-identity"/"ovnkube-identity-cm": Unexpected watch close - watch lasted less than a second and no items received Jan 20 18:30:27 crc kubenswrapper[4773]: W0120 18:30:27.256159 4773 reflector.go:484] k8s.io/client-go/informers/factory.go:160: watch of *v1.Service ended with: very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received Jan 20 18:30:27 crc kubenswrapper[4773]: W0120 18:30:27.256168 4773 reflector.go:484] object-"openshift-network-operator"/"metrics-tls": watch of *v1.Secret ended with: very short watch: object-"openshift-network-operator"/"metrics-tls": Unexpected watch close - watch lasted less than a second and no items received Jan 20 18:30:27 crc kubenswrapper[4773]: W0120 18:30:27.256223 4773 reflector.go:484] object-"openshift-network-operator"/"openshift-service-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-operator"/"openshift-service-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Jan 20 18:30:27 crc kubenswrapper[4773]: W0120 18:30:27.256252 4773 reflector.go:484] object-"openshift-network-node-identity"/"network-node-identity-cert": watch of *v1.Secret ended with: very short watch: object-"openshift-network-node-identity"/"network-node-identity-cert": Unexpected watch close - watch lasted less than a second and no items received Jan 20 18:30:27 crc kubenswrapper[4773]: W0120 18:30:27.256254 4773 reflector.go:484] object-"openshift-network-node-identity"/"openshift-service-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-node-identity"/"openshift-service-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Jan 20 18:30:27 crc kubenswrapper[4773]: W0120 18:30:27.256255 4773 reflector.go:484] object-"openshift-dns"/"kube-root-ca.crt": watch of *v1.ConfigMap ended with: very short 
watch: object-"openshift-dns"/"kube-root-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Jan 20 18:30:27 crc kubenswrapper[4773]: W0120 18:30:27.256272 4773 reflector.go:484] object-"openshift-dns"/"openshift-service-ca.crt": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-dns"/"openshift-service-ca.crt": Unexpected watch close - watch lasted less than a second and no items received Jan 20 18:30:27 crc kubenswrapper[4773]: W0120 18:30:27.256282 4773 reflector.go:484] object-"openshift-dns"/"node-resolver-dockercfg-kz9s7": watch of *v1.Secret ended with: very short watch: object-"openshift-dns"/"node-resolver-dockercfg-kz9s7": Unexpected watch close - watch lasted less than a second and no items received Jan 20 18:30:27 crc kubenswrapper[4773]: W0120 18:30:27.257196 4773 reflector.go:484] object-"openshift-network-operator"/"iptables-alerter-script": watch of *v1.ConfigMap ended with: very short watch: object-"openshift-network-operator"/"iptables-alerter-script": Unexpected watch close - watch lasted less than a second and no items received Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.267965 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.283334 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.284322 4773 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-01-20 18:25:26 +0000 UTC, rotation deadline is 2026-10-15 22:00:25.3154403 +0000 UTC Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.284384 4773 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6435h29m58.031058844s for next certificate rotation Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.294220 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.303458 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.313819 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9b329af-a6e2-4ba8-b70d-f1ad0cd67671\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13ef504628d3252d01c26801cf91a4fabdfbd5d8a78e60e9666beafa709816a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\
\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af183a195b31fc8b191dc75db28863447424fb2763e1f51a5f37ce78c4b046e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05f5db3e67f63a4ed226186eec7eebf95687a59c3d742a7d29b2be3f99543a0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab6fab567cc35a379572c4ea5fb0f9472f3634fcede213202390e12c5cf6f2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserve
r-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab6fab567cc35a379572c4ea5fb0f9472f3634fcede213202390e12c5cf6f2f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0120 18:30:20.840178 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0120 18:30:20.842359 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1141381009/tls.crt::/tmp/serving-cert-1141381009/tls.key\\\\\\\"\\\\nI0120 18:30:26.279650 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0120 18:30:26.285211 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0120 18:30:26.285271 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0120 18:30:26.285327 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0120 18:30:26.285335 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0120 18:30:26.291960 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0120 18:30:26.291983 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291990 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291998 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' 
detected.\\\\nW0120 18:30:26.292002 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0120 18:30:26.292006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0120 18:30:26.292010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0120 18:30:26.292142 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0120 18:30:26.296450 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06ac78bba5a5a697d2246518bb04b7503b37c2fafd2369e5826dd02b467715d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a
8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.326220 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.335290 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.346272 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.354573 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gczfj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"357ca347-8fa9-4f0b-9f49-a540f14e0198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4b9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gczfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.367704 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-05 12:49:16.588012406 +0000 UTC Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.371972 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-kjbfj"] Jan 20 
18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.372619 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-kjbfj" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.373020 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-sq4x7"] Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.373305 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-bccxn"] Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.373430 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.373502 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-bccxn" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.374503 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.374735 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.374742 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.375489 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.375736 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.376145 4773 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.376202 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.376290 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.376635 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.376753 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.377303 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.378333 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-qt89w"] Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.379022 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.381215 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.383182 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.383419 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.383569 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.383700 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.383798 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.384126 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.386573 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.390439 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.406537 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.415699 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.424424 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gczfj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"357ca347-8fa9-4f0b-9f49-a540f14e0198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4b9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gczfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.435114 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9b329af-a6e2-4ba8-b70d-f1ad0cd67671\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13ef504628d3252d01c26801cf91a4fabdfbd5d8a78e60e9666beafa709816a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\
\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af183a195b31fc8b191dc75db28863447424fb2763e1f51a5f37ce78c4b046e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05f5db3e67f63a4ed226186eec7eebf95687a59c3d742a7d29b2be3f99543a0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab6fab567cc35a379572c4ea5fb0f9472f3634fcede213202390e12c5cf6f2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserve
r-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab6fab567cc35a379572c4ea5fb0f9472f3634fcede213202390e12c5cf6f2f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0120 18:30:20.840178 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0120 18:30:20.842359 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1141381009/tls.crt::/tmp/serving-cert-1141381009/tls.key\\\\\\\"\\\\nI0120 18:30:26.279650 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0120 18:30:26.285211 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0120 18:30:26.285271 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0120 18:30:26.285327 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0120 18:30:26.285335 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0120 18:30:26.291960 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0120 18:30:26.291983 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291990 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291998 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' 
detected.\\\\nW0120 18:30:26.292002 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0120 18:30:26.292006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0120 18:30:26.292010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0120 18:30:26.292142 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0120 18:30:26.296450 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06ac78bba5a5a697d2246518bb04b7503b37c2fafd2369e5826dd02b467715d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a
8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.443145 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.446295 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:30:27 crc kubenswrapper[4773]: E0120 18:30:27.446547 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.450561 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.451105 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.451950 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.452594 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.453152 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.453663 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.455220 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.455566 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.455739 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.456891 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.457497 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.458514 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" 
path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.459291 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.459833 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.461348 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.461836 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.462677 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.463367 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.463725 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.464701 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" 
path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.465276 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.466089 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.466683 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.467129 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.468075 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.468112 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.468815 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" 
path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.470040 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.470756 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.471817 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.472483 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.473452 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.474016 4773 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.474136 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.476453 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.476909 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.477364 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.479447 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kjbfj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ddd5104-3112-413e-b908-2b7f336b41f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"
recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{
\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kjbfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.479570 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.481005 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.481628 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.482892 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.483658 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.483884 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-node-log\") pod \"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.483949 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-host-slash\") pod \"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.483973 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/061a607e-1868-4fcf-b3ea-d51157511d41-host-run-netns\") pod \"multus-bccxn\" (UID: \"061a607e-1868-4fcf-b3ea-d51157511d41\") " pod="openshift-multus/multus-bccxn" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.483995 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-run-openvswitch\") pod \"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.484045 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-host-cni-netd\") pod \"ovnkube-node-qt89w\" 
(UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.484071 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/061a607e-1868-4fcf-b3ea-d51157511d41-cni-binary-copy\") pod \"multus-bccxn\" (UID: \"061a607e-1868-4fcf-b3ea-d51157511d41\") " pod="openshift-multus/multus-bccxn" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.484091 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7ddd5104-3112-413e-b908-2b7f336b41f1-system-cni-dir\") pod \"multus-additional-cni-plugins-kjbfj\" (UID: \"7ddd5104-3112-413e-b908-2b7f336b41f1\") " pod="openshift-multus/multus-additional-cni-plugins-kjbfj" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.484114 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7ddd5104-3112-413e-b908-2b7f336b41f1-os-release\") pod \"multus-additional-cni-plugins-kjbfj\" (UID: \"7ddd5104-3112-413e-b908-2b7f336b41f1\") " pod="openshift-multus/multus-additional-cni-plugins-kjbfj" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.484138 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7ddd5104-3112-413e-b908-2b7f336b41f1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-kjbfj\" (UID: \"7ddd5104-3112-413e-b908-2b7f336b41f1\") " pod="openshift-multus/multus-additional-cni-plugins-kjbfj" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.484159 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: 
\"kubernetes.io/host-path/1ddd934f-f012-4083-b5e6-b99711071621-rootfs\") pod \"machine-config-daemon-sq4x7\" (UID: \"1ddd934f-f012-4083-b5e6-b99711071621\") " pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.484181 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/061a607e-1868-4fcf-b3ea-d51157511d41-host-run-multus-certs\") pod \"multus-bccxn\" (UID: \"061a607e-1868-4fcf-b3ea-d51157511d41\") " pod="openshift-multus/multus-bccxn" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.484202 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/061a607e-1868-4fcf-b3ea-d51157511d41-os-release\") pod \"multus-bccxn\" (UID: \"061a607e-1868-4fcf-b3ea-d51157511d41\") " pod="openshift-multus/multus-bccxn" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.484226 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/061a607e-1868-4fcf-b3ea-d51157511d41-host-run-k8s-cni-cncf-io\") pod \"multus-bccxn\" (UID: \"061a607e-1868-4fcf-b3ea-d51157511d41\") " pod="openshift-multus/multus-bccxn" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.484247 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/061a607e-1868-4fcf-b3ea-d51157511d41-host-var-lib-kubelet\") pod \"multus-bccxn\" (UID: \"061a607e-1868-4fcf-b3ea-d51157511d41\") " pod="openshift-multus/multus-bccxn" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.484267 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/f354424d-7f22-42d6-8bd9-00e32e78c3d3-ovn-node-metrics-cert\") pod \"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.484288 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-64lq8\" (UniqueName: \"kubernetes.io/projected/1ddd934f-f012-4083-b5e6-b99711071621-kube-api-access-64lq8\") pod \"machine-config-daemon-sq4x7\" (UID: \"1ddd934f-f012-4083-b5e6-b99711071621\") " pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.484310 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7ddd5104-3112-413e-b908-2b7f336b41f1-cni-binary-copy\") pod \"multus-additional-cni-plugins-kjbfj\" (UID: \"7ddd5104-3112-413e-b908-2b7f336b41f1\") " pod="openshift-multus/multus-additional-cni-plugins-kjbfj" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.484330 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-host-run-ovn-kubernetes\") pod \"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.484349 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-log-socket\") pod \"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.484368 4773 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-host-cni-bin\") pod \"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.484399 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7r5p\" (UniqueName: \"kubernetes.io/projected/7ddd5104-3112-413e-b908-2b7f336b41f1-kube-api-access-q7r5p\") pod \"multus-additional-cni-plugins-kjbfj\" (UID: \"7ddd5104-3112-413e-b908-2b7f336b41f1\") " pod="openshift-multus/multus-additional-cni-plugins-kjbfj" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.484420 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f354424d-7f22-42d6-8bd9-00e32e78c3d3-env-overrides\") pod \"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.484441 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/061a607e-1868-4fcf-b3ea-d51157511d41-system-cni-dir\") pod \"multus-bccxn\" (UID: \"061a607e-1868-4fcf-b3ea-d51157511d41\") " pod="openshift-multus/multus-bccxn" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.484458 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1ddd934f-f012-4083-b5e6-b99711071621-proxy-tls\") pod \"machine-config-daemon-sq4x7\" (UID: \"1ddd934f-f012-4083-b5e6-b99711071621\") " pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 
18:30:27.484478 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7ddd5104-3112-413e-b908-2b7f336b41f1-cnibin\") pod \"multus-additional-cni-plugins-kjbfj\" (UID: \"7ddd5104-3112-413e-b908-2b7f336b41f1\") " pod="openshift-multus/multus-additional-cni-plugins-kjbfj" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.484503 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-run-systemd\") pod \"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.484524 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.484544 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/061a607e-1868-4fcf-b3ea-d51157511d41-etc-kubernetes\") pod \"multus-bccxn\" (UID: \"061a607e-1868-4fcf-b3ea-d51157511d41\") " pod="openshift-multus/multus-bccxn" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.484564 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-var-lib-openvswitch\") pod \"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.484599 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/061a607e-1868-4fcf-b3ea-d51157511d41-host-var-lib-cni-bin\") pod \"multus-bccxn\" (UID: \"061a607e-1868-4fcf-b3ea-d51157511d41\") " pod="openshift-multus/multus-bccxn" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.484618 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/061a607e-1868-4fcf-b3ea-d51157511d41-hostroot\") pod \"multus-bccxn\" (UID: \"061a607e-1868-4fcf-b3ea-d51157511d41\") " pod="openshift-multus/multus-bccxn" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.484639 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f354424d-7f22-42d6-8bd9-00e32e78c3d3-ovnkube-config\") pod \"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.484660 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9flh4\" (UniqueName: \"kubernetes.io/projected/f354424d-7f22-42d6-8bd9-00e32e78c3d3-kube-api-access-9flh4\") pod \"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.484684 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f354424d-7f22-42d6-8bd9-00e32e78c3d3-ovnkube-script-lib\") pod \"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.484703 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/061a607e-1868-4fcf-b3ea-d51157511d41-cnibin\") pod \"multus-bccxn\" (UID: \"061a607e-1868-4fcf-b3ea-d51157511d41\") " pod="openshift-multus/multus-bccxn" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.484726 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/061a607e-1868-4fcf-b3ea-d51157511d41-host-var-lib-cni-multus\") pod \"multus-bccxn\" (UID: \"061a607e-1868-4fcf-b3ea-d51157511d41\") " pod="openshift-multus/multus-bccxn" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.484747 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-host-kubelet\") pod \"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.484768 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-host-run-netns\") pod \"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.484790 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1ddd934f-f012-4083-b5e6-b99711071621-mcd-auth-proxy-config\") pod \"machine-config-daemon-sq4x7\" (UID: \"1ddd934f-f012-4083-b5e6-b99711071621\") " 
pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.484813 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/061a607e-1868-4fcf-b3ea-d51157511d41-multus-socket-dir-parent\") pod \"multus-bccxn\" (UID: \"061a607e-1868-4fcf-b3ea-d51157511d41\") " pod="openshift-multus/multus-bccxn" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.484834 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwtw7\" (UniqueName: \"kubernetes.io/projected/061a607e-1868-4fcf-b3ea-d51157511d41-kube-api-access-mwtw7\") pod \"multus-bccxn\" (UID: \"061a607e-1868-4fcf-b3ea-d51157511d41\") " pod="openshift-multus/multus-bccxn" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.484860 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-etc-openvswitch\") pod \"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.484883 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-run-ovn\") pod \"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.484915 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/061a607e-1868-4fcf-b3ea-d51157511d41-multus-conf-dir\") pod \"multus-bccxn\" (UID: 
\"061a607e-1868-4fcf-b3ea-d51157511d41\") " pod="openshift-multus/multus-bccxn" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.484958 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7ddd5104-3112-413e-b908-2b7f336b41f1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-kjbfj\" (UID: \"7ddd5104-3112-413e-b908-2b7f336b41f1\") " pod="openshift-multus/multus-additional-cni-plugins-kjbfj" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.484981 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-systemd-units\") pod \"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.485000 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/061a607e-1868-4fcf-b3ea-d51157511d41-multus-cni-dir\") pod \"multus-bccxn\" (UID: \"061a607e-1868-4fcf-b3ea-d51157511d41\") " pod="openshift-multus/multus-bccxn" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.485020 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/061a607e-1868-4fcf-b3ea-d51157511d41-multus-daemon-config\") pod \"multus-bccxn\" (UID: \"061a607e-1868-4fcf-b3ea-d51157511d41\") " pod="openshift-multus/multus-bccxn" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.486426 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 
18:30:27.487162 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.488490 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.489865 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.490396 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.491594 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.491942 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.492487 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.493911 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.494542 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.495119 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.496187 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.496898 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.498080 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.498623 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.502410 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9b329af-a6e2-4ba8-b70d-f1ad0cd67671\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13ef504628d3252d01c26801cf91a4fabdfbd5d8a78e60e9666beafa709816a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\
\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af183a195b31fc8b191dc75db28863447424fb2763e1f51a5f37ce78c4b046e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05f5db3e67f63a4ed226186eec7eebf95687a59c3d742a7d29b2be3f99543a0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fab6fab567cc35a379572c4ea5fb0f9472f3634fcede213202390e12c5cf6f2f\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserve
r-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab6fab567cc35a379572c4ea5fb0f9472f3634fcede213202390e12c5cf6f2f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0120 18:30:20.840178 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0120 18:30:20.842359 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1141381009/tls.crt::/tmp/serving-cert-1141381009/tls.key\\\\\\\"\\\\nI0120 18:30:26.279650 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0120 18:30:26.285211 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0120 18:30:26.285271 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0120 18:30:26.285327 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0120 18:30:26.285335 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0120 18:30:26.291960 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0120 18:30:26.291983 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291990 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291998 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' 
detected.\\\\nW0120 18:30:26.292002 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0120 18:30:26.292006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0120 18:30:26.292010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0120 18:30:26.292142 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0120 18:30:26.296450 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06ac78bba5a5a697d2246518bb04b7503b37c2fafd2369e5826dd02b467715d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a
8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.511747 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.520460 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gczfj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"357ca347-8fa9-4f0b-9f49-a540f14e0198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4b9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gczfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.531375 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.540396 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.552277 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kjbfj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ddd5104-3112-413e-b908-2b7f336b41f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kjbfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.561027 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bccxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"061a607e-1868-4fcf-b3ea-d51157511d41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwtw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bccxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.570717 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"c6c2468ce835a67e0b9197a4f44130d0180322e851f7889cd0fd4329018a9be5"} Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.571769 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"1ec8e5f42c66c2301f37808211b82a13819dd6b0136a32c03e83458988c5e720"} Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.573679 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.577678 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f354424d-7f22-42d6-8bd9-00e32e78c3d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qt89w\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.577972 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c1a1628ccb969f4cdde3babedb802c79d7bba385260dcefc8140ed674fb423da"} Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.578184 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.581580 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-gczfj" event={"ID":"357ca347-8fa9-4f0b-9f49-a540f14e0198","Type":"ContainerStarted","Data":"7eacb55e9190716b5ef650c1ade86dd203f9284ce9646e6bce36d8e33bf04e86"} Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.582514 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.585698 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q7r5p\" (UniqueName: \"kubernetes.io/projected/7ddd5104-3112-413e-b908-2b7f336b41f1-kube-api-access-q7r5p\") pod \"multus-additional-cni-plugins-kjbfj\" (UID: \"7ddd5104-3112-413e-b908-2b7f336b41f1\") " pod="openshift-multus/multus-additional-cni-plugins-kjbfj" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.585729 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-log-socket\") pod \"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc 
kubenswrapper[4773]: I0120 18:30:27.585747 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-host-cni-bin\") pod \"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.585762 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/061a607e-1868-4fcf-b3ea-d51157511d41-system-cni-dir\") pod \"multus-bccxn\" (UID: \"061a607e-1868-4fcf-b3ea-d51157511d41\") " pod="openshift-multus/multus-bccxn" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.585777 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1ddd934f-f012-4083-b5e6-b99711071621-proxy-tls\") pod \"machine-config-daemon-sq4x7\" (UID: \"1ddd934f-f012-4083-b5e6-b99711071621\") " pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.585792 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f354424d-7f22-42d6-8bd9-00e32e78c3d3-env-overrides\") pod \"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.585807 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7ddd5104-3112-413e-b908-2b7f336b41f1-cnibin\") pod \"multus-additional-cni-plugins-kjbfj\" (UID: \"7ddd5104-3112-413e-b908-2b7f336b41f1\") " pod="openshift-multus/multus-additional-cni-plugins-kjbfj" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.585823 4773 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-run-systemd\") pod \"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.585837 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.585852 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/061a607e-1868-4fcf-b3ea-d51157511d41-etc-kubernetes\") pod \"multus-bccxn\" (UID: \"061a607e-1868-4fcf-b3ea-d51157511d41\") " pod="openshift-multus/multus-bccxn" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.585867 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-var-lib-openvswitch\") pod \"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.585881 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/061a607e-1868-4fcf-b3ea-d51157511d41-host-var-lib-cni-bin\") pod \"multus-bccxn\" (UID: \"061a607e-1868-4fcf-b3ea-d51157511d41\") " pod="openshift-multus/multus-bccxn" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.585894 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"hostroot\" (UniqueName: \"kubernetes.io/host-path/061a607e-1868-4fcf-b3ea-d51157511d41-hostroot\") pod \"multus-bccxn\" (UID: \"061a607e-1868-4fcf-b3ea-d51157511d41\") " pod="openshift-multus/multus-bccxn" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.585908 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f354424d-7f22-42d6-8bd9-00e32e78c3d3-ovnkube-config\") pod \"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.585921 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9flh4\" (UniqueName: \"kubernetes.io/projected/f354424d-7f22-42d6-8bd9-00e32e78c3d3-kube-api-access-9flh4\") pod \"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.585958 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f354424d-7f22-42d6-8bd9-00e32e78c3d3-ovnkube-script-lib\") pod \"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.585976 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/061a607e-1868-4fcf-b3ea-d51157511d41-cnibin\") pod \"multus-bccxn\" (UID: \"061a607e-1868-4fcf-b3ea-d51157511d41\") " pod="openshift-multus/multus-bccxn" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.585990 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: 
\"kubernetes.io/host-path/061a607e-1868-4fcf-b3ea-d51157511d41-host-var-lib-cni-multus\") pod \"multus-bccxn\" (UID: \"061a607e-1868-4fcf-b3ea-d51157511d41\") " pod="openshift-multus/multus-bccxn" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.586008 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-host-run-netns\") pod \"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.586023 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1ddd934f-f012-4083-b5e6-b99711071621-mcd-auth-proxy-config\") pod \"machine-config-daemon-sq4x7\" (UID: \"1ddd934f-f012-4083-b5e6-b99711071621\") " pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.586040 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-host-kubelet\") pod \"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.586054 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/061a607e-1868-4fcf-b3ea-d51157511d41-multus-socket-dir-parent\") pod \"multus-bccxn\" (UID: \"061a607e-1868-4fcf-b3ea-d51157511d41\") " pod="openshift-multus/multus-bccxn" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.586068 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mwtw7\" (UniqueName: 
\"kubernetes.io/projected/061a607e-1868-4fcf-b3ea-d51157511d41-kube-api-access-mwtw7\") pod \"multus-bccxn\" (UID: \"061a607e-1868-4fcf-b3ea-d51157511d41\") " pod="openshift-multus/multus-bccxn" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.593584 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-etc-openvswitch\") pod \"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.593631 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-run-ovn\") pod \"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.593687 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/061a607e-1868-4fcf-b3ea-d51157511d41-multus-conf-dir\") pod \"multus-bccxn\" (UID: \"061a607e-1868-4fcf-b3ea-d51157511d41\") " pod="openshift-multus/multus-bccxn" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.593715 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7ddd5104-3112-413e-b908-2b7f336b41f1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-kjbfj\" (UID: \"7ddd5104-3112-413e-b908-2b7f336b41f1\") " pod="openshift-multus/multus-additional-cni-plugins-kjbfj" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.593738 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-systemd-units\") pod 
\"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.593760 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/061a607e-1868-4fcf-b3ea-d51157511d41-multus-cni-dir\") pod \"multus-bccxn\" (UID: \"061a607e-1868-4fcf-b3ea-d51157511d41\") " pod="openshift-multus/multus-bccxn" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.593784 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/061a607e-1868-4fcf-b3ea-d51157511d41-multus-daemon-config\") pod \"multus-bccxn\" (UID: \"061a607e-1868-4fcf-b3ea-d51157511d41\") " pod="openshift-multus/multus-bccxn" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.593862 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-node-log\") pod \"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.593888 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/061a607e-1868-4fcf-b3ea-d51157511d41-host-run-netns\") pod \"multus-bccxn\" (UID: \"061a607e-1868-4fcf-b3ea-d51157511d41\") " pod="openshift-multus/multus-bccxn" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.593914 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-host-slash\") pod \"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc 
kubenswrapper[4773]: I0120 18:30:27.593964 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-run-openvswitch\") pod \"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.593990 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-host-cni-netd\") pod \"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.594011 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/061a607e-1868-4fcf-b3ea-d51157511d41-cni-binary-copy\") pod \"multus-bccxn\" (UID: \"061a607e-1868-4fcf-b3ea-d51157511d41\") " pod="openshift-multus/multus-bccxn" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.594039 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7ddd5104-3112-413e-b908-2b7f336b41f1-os-release\") pod \"multus-additional-cni-plugins-kjbfj\" (UID: \"7ddd5104-3112-413e-b908-2b7f336b41f1\") " pod="openshift-multus/multus-additional-cni-plugins-kjbfj" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.594065 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7ddd5104-3112-413e-b908-2b7f336b41f1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-kjbfj\" (UID: \"7ddd5104-3112-413e-b908-2b7f336b41f1\") " pod="openshift-multus/multus-additional-cni-plugins-kjbfj" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 
18:30:27.594089 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/1ddd934f-f012-4083-b5e6-b99711071621-rootfs\") pod \"machine-config-daemon-sq4x7\" (UID: \"1ddd934f-f012-4083-b5e6-b99711071621\") " pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.594110 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7ddd5104-3112-413e-b908-2b7f336b41f1-system-cni-dir\") pod \"multus-additional-cni-plugins-kjbfj\" (UID: \"7ddd5104-3112-413e-b908-2b7f336b41f1\") " pod="openshift-multus/multus-additional-cni-plugins-kjbfj" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.594135 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/061a607e-1868-4fcf-b3ea-d51157511d41-os-release\") pod \"multus-bccxn\" (UID: \"061a607e-1868-4fcf-b3ea-d51157511d41\") " pod="openshift-multus/multus-bccxn" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.594157 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/061a607e-1868-4fcf-b3ea-d51157511d41-host-run-k8s-cni-cncf-io\") pod \"multus-bccxn\" (UID: \"061a607e-1868-4fcf-b3ea-d51157511d41\") " pod="openshift-multus/multus-bccxn" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.594178 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/061a607e-1868-4fcf-b3ea-d51157511d41-host-var-lib-kubelet\") pod \"multus-bccxn\" (UID: \"061a607e-1868-4fcf-b3ea-d51157511d41\") " pod="openshift-multus/multus-bccxn" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.594199 4773 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/061a607e-1868-4fcf-b3ea-d51157511d41-host-run-multus-certs\") pod \"multus-bccxn\" (UID: \"061a607e-1868-4fcf-b3ea-d51157511d41\") " pod="openshift-multus/multus-bccxn" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.594223 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7ddd5104-3112-413e-b908-2b7f336b41f1-cni-binary-copy\") pod \"multus-additional-cni-plugins-kjbfj\" (UID: \"7ddd5104-3112-413e-b908-2b7f336b41f1\") " pod="openshift-multus/multus-additional-cni-plugins-kjbfj" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.594243 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-host-run-ovn-kubernetes\") pod \"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.594265 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f354424d-7f22-42d6-8bd9-00e32e78c3d3-ovn-node-metrics-cert\") pod \"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.594286 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-64lq8\" (UniqueName: \"kubernetes.io/projected/1ddd934f-f012-4083-b5e6-b99711071621-kube-api-access-64lq8\") pod \"machine-config-daemon-sq4x7\" (UID: \"1ddd934f-f012-4083-b5e6-b99711071621\") " pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.587990 4773 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-var-lib-openvswitch\") pod \"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.594679 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-etc-openvswitch\") pod \"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.588002 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/061a607e-1868-4fcf-b3ea-d51157511d41-cnibin\") pod \"multus-bccxn\" (UID: \"061a607e-1868-4fcf-b3ea-d51157511d41\") " pod="openshift-multus/multus-bccxn" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.594730 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-run-ovn\") pod \"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.588228 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-log-socket\") pod \"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.594775 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/061a607e-1868-4fcf-b3ea-d51157511d41-multus-conf-dir\") pod 
\"multus-bccxn\" (UID: \"061a607e-1868-4fcf-b3ea-d51157511d41\") " pod="openshift-multus/multus-bccxn" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.588257 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-host-cni-bin\") pod \"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.588946 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-host-run-netns\") pod \"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.588870 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-host-kubelet\") pod \"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.595023 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/061a607e-1868-4fcf-b3ea-d51157511d41-host-var-lib-kubelet\") pod \"multus-bccxn\" (UID: \"061a607e-1868-4fcf-b3ea-d51157511d41\") " pod="openshift-multus/multus-bccxn" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.595107 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7ddd5104-3112-413e-b908-2b7f336b41f1-os-release\") pod \"multus-additional-cni-plugins-kjbfj\" (UID: \"7ddd5104-3112-413e-b908-2b7f336b41f1\") " pod="openshift-multus/multus-additional-cni-plugins-kjbfj" Jan 20 
18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.588868 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f354424d-7f22-42d6-8bd9-00e32e78c3d3-env-overrides\") pod \"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.595125 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/061a607e-1868-4fcf-b3ea-d51157511d41-os-release\") pod \"multus-bccxn\" (UID: \"061a607e-1868-4fcf-b3ea-d51157511d41\") " pod="openshift-multus/multus-bccxn" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.589291 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1ddd934f-f012-4083-b5e6-b99711071621-mcd-auth-proxy-config\") pod \"machine-config-daemon-sq4x7\" (UID: \"1ddd934f-f012-4083-b5e6-b99711071621\") " pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.595180 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/061a607e-1868-4fcf-b3ea-d51157511d41-host-run-k8s-cni-cncf-io\") pod \"multus-bccxn\" (UID: \"061a607e-1868-4fcf-b3ea-d51157511d41\") " pod="openshift-multus/multus-bccxn" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.588749 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/061a607e-1868-4fcf-b3ea-d51157511d41-hostroot\") pod \"multus-bccxn\" (UID: \"061a607e-1868-4fcf-b3ea-d51157511d41\") " pod="openshift-multus/multus-bccxn" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.588283 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/061a607e-1868-4fcf-b3ea-d51157511d41-host-var-lib-cni-bin\") pod \"multus-bccxn\" (UID: \"061a607e-1868-4fcf-b3ea-d51157511d41\") " pod="openshift-multus/multus-bccxn" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.595395 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-systemd-units\") pod \"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.588301 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/061a607e-1868-4fcf-b3ea-d51157511d41-system-cni-dir\") pod \"multus-bccxn\" (UID: \"061a607e-1868-4fcf-b3ea-d51157511d41\") " pod="openshift-multus/multus-bccxn" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.595603 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/061a607e-1868-4fcf-b3ea-d51157511d41-multus-cni-dir\") pod \"multus-bccxn\" (UID: \"061a607e-1868-4fcf-b3ea-d51157511d41\") " pod="openshift-multus/multus-bccxn" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.588340 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-run-systemd\") pod \"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.595661 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/061a607e-1868-4fcf-b3ea-d51157511d41-cni-binary-copy\") pod \"multus-bccxn\" (UID: 
\"061a607e-1868-4fcf-b3ea-d51157511d41\") " pod="openshift-multus/multus-bccxn" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.595681 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7ddd5104-3112-413e-b908-2b7f336b41f1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-kjbfj\" (UID: \"7ddd5104-3112-413e-b908-2b7f336b41f1\") " pod="openshift-multus/multus-additional-cni-plugins-kjbfj" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.588905 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/061a607e-1868-4fcf-b3ea-d51157511d41-host-var-lib-cni-multus\") pod \"multus-bccxn\" (UID: \"061a607e-1868-4fcf-b3ea-d51157511d41\") " pod="openshift-multus/multus-bccxn" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.595714 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/061a607e-1868-4fcf-b3ea-d51157511d41-host-run-netns\") pod \"multus-bccxn\" (UID: \"061a607e-1868-4fcf-b3ea-d51157511d41\") " pod="openshift-multus/multus-bccxn" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.588786 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7ddd5104-3112-413e-b908-2b7f336b41f1-cnibin\") pod \"multus-additional-cni-plugins-kjbfj\" (UID: \"7ddd5104-3112-413e-b908-2b7f336b41f1\") " pod="openshift-multus/multus-additional-cni-plugins-kjbfj" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.595743 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/1ddd934f-f012-4083-b5e6-b99711071621-rootfs\") pod \"machine-config-daemon-sq4x7\" (UID: \"1ddd934f-f012-4083-b5e6-b99711071621\") " pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" 
Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.588909 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/061a607e-1868-4fcf-b3ea-d51157511d41-multus-socket-dir-parent\") pod \"multus-bccxn\" (UID: \"061a607e-1868-4fcf-b3ea-d51157511d41\") " pod="openshift-multus/multus-bccxn" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.595759 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-host-slash\") pod \"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.588816 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.595786 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7ddd5104-3112-413e-b908-2b7f336b41f1-system-cni-dir\") pod \"multus-additional-cni-plugins-kjbfj\" (UID: \"7ddd5104-3112-413e-b908-2b7f336b41f1\") " pod="openshift-multus/multus-additional-cni-plugins-kjbfj" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.588714 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f354424d-7f22-42d6-8bd9-00e32e78c3d3-ovnkube-config\") pod \"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 
18:30:27.595807 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-run-openvswitch\") pod \"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.588844 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/061a607e-1868-4fcf-b3ea-d51157511d41-etc-kubernetes\") pod \"multus-bccxn\" (UID: \"061a607e-1868-4fcf-b3ea-d51157511d41\") " pod="openshift-multus/multus-bccxn" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.595827 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-host-cni-netd\") pod \"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.588818 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f354424d-7f22-42d6-8bd9-00e32e78c3d3-ovnkube-script-lib\") pod \"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.595850 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-node-log\") pod \"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.595881 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/061a607e-1868-4fcf-b3ea-d51157511d41-host-run-multus-certs\") pod \"multus-bccxn\" (UID: \"061a607e-1868-4fcf-b3ea-d51157511d41\") " pod="openshift-multus/multus-bccxn" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.595915 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-host-run-ovn-kubernetes\") pod \"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.596204 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/061a607e-1868-4fcf-b3ea-d51157511d41-multus-daemon-config\") pod \"multus-bccxn\" (UID: \"061a607e-1868-4fcf-b3ea-d51157511d41\") " pod="openshift-multus/multus-bccxn" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.596392 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7ddd5104-3112-413e-b908-2b7f336b41f1-cni-binary-copy\") pod \"multus-additional-cni-plugins-kjbfj\" (UID: \"7ddd5104-3112-413e-b908-2b7f336b41f1\") " pod="openshift-multus/multus-additional-cni-plugins-kjbfj" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.596442 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7ddd5104-3112-413e-b908-2b7f336b41f1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-kjbfj\" (UID: \"7ddd5104-3112-413e-b908-2b7f336b41f1\") " pod="openshift-multus/multus-additional-cni-plugins-kjbfj" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.597531 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.598055 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/1ddd934f-f012-4083-b5e6-b99711071621-proxy-tls\") pod \"machine-config-daemon-sq4x7\" (UID: \"1ddd934f-f012-4083-b5e6-b99711071621\") " pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.599685 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f354424d-7f22-42d6-8bd9-00e32e78c3d3-ovn-node-metrics-cert\") pod \"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.606736 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" 
event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"d61ad24c2c5cdc4877b12ad3276b745e397f53ce14b72dcdf4ce69c08c8e843e"} Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.610655 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q7r5p\" (UniqueName: \"kubernetes.io/projected/7ddd5104-3112-413e-b908-2b7f336b41f1-kube-api-access-q7r5p\") pod \"multus-additional-cni-plugins-kjbfj\" (UID: \"7ddd5104-3112-413e-b908-2b7f336b41f1\") " pod="openshift-multus/multus-additional-cni-plugins-kjbfj" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.611394 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9flh4\" (UniqueName: \"kubernetes.io/projected/f354424d-7f22-42d6-8bd9-00e32e78c3d3-kube-api-access-9flh4\") pod \"ovnkube-node-qt89w\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.612384 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwtw7\" (UniqueName: \"kubernetes.io/projected/061a607e-1868-4fcf-b3ea-d51157511d41-kube-api-access-mwtw7\") pod \"multus-bccxn\" (UID: \"061a607e-1868-4fcf-b3ea-d51157511d41\") " pod="openshift-multus/multus-bccxn" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.616919 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.622050 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-64lq8\" (UniqueName: \"kubernetes.io/projected/1ddd934f-f012-4083-b5e6-b99711071621-kube-api-access-64lq8\") pod \"machine-config-daemon-sq4x7\" (UID: \"1ddd934f-f012-4083-b5e6-b99711071621\") " pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.630688 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ddd934f-f012-4083-b5e6-b99711071621\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sq4x7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.651177 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ddd934f-f012-4083-b5e6-b99711071621\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sq4x7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.667431 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f354424d-7f22-42d6-8bd9-00e32e78c3d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qt89w\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.677464 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.685847 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-kjbfj" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.689998 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.689983 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.697908 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-bccxn" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.699147 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.709342 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.732353 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gczfj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"357ca347-8fa9-4f0b-9f49-a540f14e0198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready 
status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4b9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gczfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:27 crc kubenswrapper[4773]: W0120 18:30:27.745011 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ddd934f_f012_4083_b5e6_b99711071621.slice/crio-a7070d5127f216a165b415262c3dcb42d697bb9807435e7df3df776e023e1b70 
WatchSource:0}: Error finding container a7070d5127f216a165b415262c3dcb42d697bb9807435e7df3df776e023e1b70: Status 404 returned error can't find the container with id a7070d5127f216a165b415262c3dcb42d697bb9807435e7df3df776e023e1b70 Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.771370 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9b329af-a6e2-4ba8-b70d-f1ad0cd67671\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13ef504628d3252d01c26801cf91a4fabdfbd5d8a78e60e9666beafa709816a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af183a195b31fc8b191dc75db28863447424fb2763e1f51a5f37ce78c4b046e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://05f5db3e67f63a4ed226186eec7eebf95687a59c3d742a7d29b2be3f99543a0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a1628ccb969f4cdde3babedb802c79d7bba385260dcefc8140ed674fb423da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab6fab567cc35a379572c4ea5fb0f9472f3634fcede213202390e12c5cf6f2f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0120 18:30:20.840178 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0120 18:30:20.842359 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1141381009/tls.crt::/tmp/serving-cert-1141381009/tls.key\\\\\\\"\\\\nI0120 18:30:26.279650 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0120 18:30:26.285211 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0120 18:30:26.285271 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0120 18:30:26.285327 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0120 18:30:26.285335 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0120 18:30:26.291960 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0120 18:30:26.291983 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291990 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291998 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0120 18:30:26.292002 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0120 18:30:26.292006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0120 18:30:26.292010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0120 18:30:26.292142 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0120 18:30:26.296450 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06ac78bba5a5a697d2246518bb04b7503b37c2fafd2369e5826dd02b467715d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.792314 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.811808 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.821466 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.833921 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kjbfj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ddd5104-3112-413e-b908-2b7f336b41f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy 
cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name
\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"
Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kjbfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.844819 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bccxn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"061a607e-1868-4fcf-b3ea-d51157511d41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwtw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bccxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.845891 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.850145 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.859697 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.860397 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.885986 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9b329af-a6e2-4ba8-b70d-f1ad0cd67671\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13ef504628d3252d01c26801cf91a4fabdfbd5d8a78e60e9666beafa709816a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af183a195b31fc8b191dc75db28863447424fb2763e1f51a5f37ce78c4b046e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05f5db3e67f63a4ed226186eec7eebf95687a59c3d742a7d29b2be3f99543a0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a1628ccb969f4cdde3babedb802c79d7bba385260dcefc8140ed674fb423da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab6fab567cc35a379572c4ea5fb0f9472f3634fcede213202390e12c5cf6f2f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0120 18:30:20.840178 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0120 18:30:20.842359 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1141381009/tls.crt::/tmp/serving-cert-1141381009/tls.key\\\\\\\"\\\\nI0120 18:30:26.279650 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0120 18:30:26.285211 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0120 18:30:26.285271 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0120 18:30:26.285327 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0120 18:30:26.285335 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0120 18:30:26.291960 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0120 18:30:26.291983 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291990 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291998 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0120 18:30:26.292002 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0120 18:30:26.292006 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0120 18:30:26.292010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0120 18:30:26.292142 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0120 18:30:26.296450 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06ac78bba5a5a697d2246518bb04b7503b37c2fafd2369e5826dd02b467715d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.922188 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:27 crc kubenswrapper[4773]: I0120 18:30:27.959547 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gczfj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"357ca347-8fa9-4f0b-9f49-a540f14e0198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4b9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gczfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:28 crc kubenswrapper[4773]: I0120 18:30:27.999997 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 18:30:28 crc kubenswrapper[4773]: I0120 18:30:28.000108 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:30:28 crc kubenswrapper[4773]: E0120 18:30:28.000176 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-01-20 18:30:30.000151193 +0000 UTC m=+22.921964237 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:30:28 crc kubenswrapper[4773]: E0120 18:30:28.000209 4773 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 20 18:30:28 crc kubenswrapper[4773]: E0120 18:30:28.000273 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-20 18:30:30.000256485 +0000 UTC m=+22.922069559 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 20 18:30:28 crc kubenswrapper[4773]: I0120 18:30:28.000433 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:28 crc kubenswrapper[4773]: I0120 18:30:28.039084 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 20 18:30:28 crc kubenswrapper[4773]: I0120 18:30:28.086338 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kjbfj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ddd5104-3112-413e-b908-2b7f336b41f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kjbfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:28Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:28 crc kubenswrapper[4773]: I0120 18:30:28.101124 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:30:28 crc kubenswrapper[4773]: I0120 18:30:28.101183 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:30:28 crc kubenswrapper[4773]: I0120 18:30:28.101211 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:30:28 crc kubenswrapper[4773]: E0120 
18:30:28.101326 4773 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 20 18:30:28 crc kubenswrapper[4773]: E0120 18:30:28.101339 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 20 18:30:28 crc kubenswrapper[4773]: E0120 18:30:28.101391 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 20 18:30:28 crc kubenswrapper[4773]: E0120 18:30:28.101405 4773 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 20 18:30:28 crc kubenswrapper[4773]: E0120 18:30:28.101420 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-20 18:30:30.101398477 +0000 UTC m=+23.023211501 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 20 18:30:28 crc kubenswrapper[4773]: E0120 18:30:28.101339 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 20 18:30:28 crc kubenswrapper[4773]: E0120 18:30:28.101472 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 20 18:30:28 crc kubenswrapper[4773]: E0120 18:30:28.101479 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-20 18:30:30.101462358 +0000 UTC m=+23.023275442 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 20 18:30:28 crc kubenswrapper[4773]: E0120 18:30:28.101484 4773 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 20 18:30:28 crc kubenswrapper[4773]: E0120 18:30:28.101525 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-20 18:30:30.10151389 +0000 UTC m=+23.023326984 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 20 18:30:28 crc kubenswrapper[4773]: I0120 18:30:28.123300 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bccxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"061a607e-1868-4fcf-b3ea-d51157511d41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwtw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bccxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:28Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:28 crc kubenswrapper[4773]: I0120 18:30:28.161757 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:28Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:28 crc kubenswrapper[4773]: I0120 18:30:28.174978 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Jan 20 18:30:28 crc kubenswrapper[4773]: I0120 18:30:28.214053 4773 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Jan 20 18:30:28 crc kubenswrapper[4773]: I0120 18:30:28.233919 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Jan 20 18:30:28 crc kubenswrapper[4773]: I0120 18:30:28.267988 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:28Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:28 crc kubenswrapper[4773]: I0120 18:30:28.274476 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Jan 20 18:30:28 crc kubenswrapper[4773]: I0120 18:30:28.323166 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ddd934f-f012-4083-b5e6-b99711071621\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sq4x7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:28Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:28 crc kubenswrapper[4773]: I0120 18:30:28.368213 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-10 06:56:51.38414991 +0000 UTC Jan 20 18:30:28 crc kubenswrapper[4773]: I0120 18:30:28.369759 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f354424d-7f22-42d6-8bd9-00e32e78c3d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb 
sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitiali
zing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\
\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\
"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qt89w\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:28Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:28 crc kubenswrapper[4773]: I0120 18:30:28.394332 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Jan 20 18:30:28 crc kubenswrapper[4773]: I0120 18:30:28.425838 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:28Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:28 crc kubenswrapper[4773]: I0120 18:30:28.446256 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:30:28 crc kubenswrapper[4773]: I0120 18:30:28.446288 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:30:28 crc kubenswrapper[4773]: E0120 18:30:28.446436 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 18:30:28 crc kubenswrapper[4773]: E0120 18:30:28.446519 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 18:30:28 crc kubenswrapper[4773]: I0120 18:30:28.462660 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could 
not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:28Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:28 crc kubenswrapper[4773]: I0120 18:30:28.503785 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ddd934f-f012-4083-b5e6-b99711071621\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sq4x7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:28Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:28 crc kubenswrapper[4773]: I0120 18:30:28.514081 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Jan 20 18:30:28 crc kubenswrapper[4773]: I0120 18:30:28.574072 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f354424d-7f22-42d6-8bd9-00e32e78c3d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qt89w\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:28Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:28 crc kubenswrapper[4773]: I0120 18:30:28.605702 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfcadb32-181d-4e84-8078-53323164fc49\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72535f55bb65c332cd4acb4ad0bf8b94a28a2167da3bba993de6ffb1d6dca1cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"
mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5be06527f116faac5a7ce7b2df804239b382a394e11c56eee69fa24739faab3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff759d4098d334dfada2f6a14e8a5d363019e0da19879b5dca11ff48b6273b24\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29039ffb258d5056b02a3ef4f48c732b8573d9437d032a225b59f0ba95527796\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:28Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:28 crc kubenswrapper[4773]: I0120 18:30:28.610841 4773 generic.go:334] "Generic (PLEG): container finished" podID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" containerID="a67308b5676ba4c68391eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3" exitCode=0 Jan 20 18:30:28 crc kubenswrapper[4773]: I0120 18:30:28.610965 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" event={"ID":"f354424d-7f22-42d6-8bd9-00e32e78c3d3","Type":"ContainerDied","Data":"a67308b5676ba4c68391eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3"} Jan 20 18:30:28 crc 
kubenswrapper[4773]: I0120 18:30:28.610997 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" event={"ID":"f354424d-7f22-42d6-8bd9-00e32e78c3d3","Type":"ContainerStarted","Data":"410001a1d3881fa68033cb522fb1036ff5be18d13872f61c3fe53b410c458aa8"} Jan 20 18:30:28 crc kubenswrapper[4773]: I0120 18:30:28.612843 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" event={"ID":"1ddd934f-f012-4083-b5e6-b99711071621","Type":"ContainerStarted","Data":"63daf0eeb8fcb1c4b47e205c8ea4ca5071153c573cce6d1f4638e0d44a96d47e"} Jan 20 18:30:28 crc kubenswrapper[4773]: I0120 18:30:28.612900 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" event={"ID":"1ddd934f-f012-4083-b5e6-b99711071621","Type":"ContainerStarted","Data":"3e8755cb7894ae2c1832f3b0b8d8a6e33d3d336d00ce68a2afe004d82304dd24"} Jan 20 18:30:28 crc kubenswrapper[4773]: I0120 18:30:28.612914 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" event={"ID":"1ddd934f-f012-4083-b5e6-b99711071621","Type":"ContainerStarted","Data":"a7070d5127f216a165b415262c3dcb42d697bb9807435e7df3df776e023e1b70"} Jan 20 18:30:28 crc kubenswrapper[4773]: I0120 18:30:28.615367 4773 generic.go:334] "Generic (PLEG): container finished" podID="7ddd5104-3112-413e-b908-2b7f336b41f1" containerID="cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745" exitCode=0 Jan 20 18:30:28 crc kubenswrapper[4773]: I0120 18:30:28.615431 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kjbfj" event={"ID":"7ddd5104-3112-413e-b908-2b7f336b41f1","Type":"ContainerDied","Data":"cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745"} Jan 20 18:30:28 crc kubenswrapper[4773]: I0120 18:30:28.615454 4773 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-multus/multus-additional-cni-plugins-kjbfj" event={"ID":"7ddd5104-3112-413e-b908-2b7f336b41f1","Type":"ContainerStarted","Data":"6fcca40afa0a2739c1fcbd5cffd85ebe7836e640caaaf2f50d873e009384d4a3"} Jan 20 18:30:28 crc kubenswrapper[4773]: I0120 18:30:28.620560 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"965d3f61a7d29b4a851d13ea906f7d4d1809f89d314cd0e9b08ba897108625f9"} Jan 20 18:30:28 crc kubenswrapper[4773]: I0120 18:30:28.620598 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"48a81307df70a49c4cf901ede80a07616e64babebf760b7fb3f51fc71743812a"} Jan 20 18:30:28 crc kubenswrapper[4773]: I0120 18:30:28.622839 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bccxn" event={"ID":"061a607e-1868-4fcf-b3ea-d51157511d41","Type":"ContainerStarted","Data":"5980e4a9cdd846b335d853a6c7c4fa7c9862c37c4858418919eae03855d094f5"} Jan 20 18:30:28 crc kubenswrapper[4773]: I0120 18:30:28.622904 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bccxn" event={"ID":"061a607e-1868-4fcf-b3ea-d51157511d41","Type":"ContainerStarted","Data":"375987ce103d4dda6ae8622d9203ab8286d343e5a762d0690e4f438712cdf1f0"} Jan 20 18:30:28 crc kubenswrapper[4773]: I0120 18:30:28.625317 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-gczfj" event={"ID":"357ca347-8fa9-4f0b-9f49-a540f14e0198","Type":"ContainerStarted","Data":"48c821d7f174268680f0e1197457126c88076eded93bbcddb5a0d60b1fa9fc80"} Jan 20 18:30:28 crc kubenswrapper[4773]: I0120 18:30:28.626977 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"de86ad0ce5469489a5f3eac4940913be3a897c3ec5f48526c7396b21bca92fda"} Jan 20 18:30:28 crc kubenswrapper[4773]: I0120 18:30:28.648676 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:28Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:28 crc kubenswrapper[4773]: I0120 18:30:28.674188 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Jan 20 18:30:28 crc kubenswrapper[4773]: I0120 18:30:28.694879 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Jan 20 18:30:28 crc kubenswrapper[4773]: I0120 18:30:28.733899 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Jan 20 18:30:28 crc kubenswrapper[4773]: I0120 18:30:28.738007 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9b329af-a6e2-4ba8-b70d-f1ad0cd67671\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13ef504628d3252d01c26801cf91a4fabdfbd5d8a78e60e9666beafa709816a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af183a195b31fc8b191dc75db28863447424fb2763e1f51a5f37ce78c4b046e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05f5db3e67f63a4ed226186eec7eebf95687a59c3d742a7d29b2be3f99543a0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a1628ccb969f4cdde3babedb802c79d7bba385260dcefc8140ed674fb423da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab6fab567cc35a379572c4ea5fb0f9472f3634fcede213202390e12c5cf6f2f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0120 18:30:20.840178 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0120 18:30:20.842359 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1141381009/tls.crt::/tmp/serving-cert-1141381009/tls.key\\\\\\\"\\\\nI0120 18:30:26.279650 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0120 18:30:26.285211 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0120 18:30:26.285271 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0120 18:30:26.285327 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0120 18:30:26.285335 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0120 18:30:26.291960 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0120 18:30:26.291983 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291990 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291998 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0120 18:30:26.292002 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0120 18:30:26.292006 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0120 18:30:26.292010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0120 18:30:26.292142 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0120 18:30:26.296450 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06ac78bba5a5a697d2246518bb04b7503b37c2fafd2369e5826dd02b467715d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:28Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:28 crc kubenswrapper[4773]: I0120 18:30:28.782406 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:28Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:28 crc kubenswrapper[4773]: I0120 18:30:28.813891 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Jan 20 18:30:28 crc kubenswrapper[4773]: I0120 18:30:28.854545 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Jan 20 
18:30:28 crc kubenswrapper[4773]: I0120 18:30:28.866061 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gczfj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"357ca347-8fa9-4f0b-9f49-a540f14e0198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4b9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gczfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:28Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:28 crc kubenswrapper[4773]: I0120 18:30:28.874806 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Jan 20 18:30:28 crc kubenswrapper[4773]: I0120 18:30:28.924269 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kjbfj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ddd5104-3112-413e-b908-2b7f336b41f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy 
cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name
\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"
Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kjbfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:28Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:28 crc kubenswrapper[4773]: I0120 18:30:28.962309 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bccxn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"061a607e-1868-4fcf-b3ea-d51157511d41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwtw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bccxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:28Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:29 crc kubenswrapper[4773]: I0120 18:30:29.004620 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:29Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:29 crc kubenswrapper[4773]: I0120 18:30:29.045397 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:29Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:29 crc kubenswrapper[4773]: I0120 18:30:29.086690 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9b329af-a6e2-4ba8-b70d-f1ad0cd67671\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13ef504628d3252d01c26801cf91a4fabdfbd5d8a78e60e9666beafa709816a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af183a195b31fc8b191dc75db28863447424fb2763e1f51a5f37ce78c4b046e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05f5db3e67f63a4ed226186eec7eebf95687a59c3d742a7d29b2be3f99543a0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a1628ccb969f4cdde3babedb802c79d7bba385260dcefc8140ed674fb423da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab6fab567cc35a379572c4ea5fb0f9472f3634fcede213202390e12c5cf6f2f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0120 18:30:20.840178 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0120 18:30:20.842359 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1141381009/tls.crt::/tmp/serving-cert-1141381009/tls.key\\\\\\\"\\\\nI0120 18:30:26.279650 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0120 18:30:26.285211 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0120 18:30:26.285271 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0120 18:30:26.285327 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0120 18:30:26.285335 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0120 18:30:26.291960 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0120 18:30:26.291983 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291990 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291998 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0120 18:30:26.292002 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0120 18:30:26.292006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0120 18:30:26.292010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0120 18:30:26.292142 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0120 18:30:26.296450 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06ac78bba5a5a697d2246518bb04b7503b37c2fafd2369e5826dd02b467715d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:29Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:29 crc kubenswrapper[4773]: I0120 18:30:29.126551 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:29Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:29 crc kubenswrapper[4773]: I0120 18:30:29.182446 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gczfj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"357ca347-8fa9-4f0b-9f49-a540f14e0198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48c821d7f174268680f0e1197457126c88076eded93bbcddb5a0d60b1fa9fc80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4b9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gczfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:29Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:29 crc kubenswrapper[4773]: I0120 18:30:29.206123 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bccxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"061a607e-1868-4fcf-b3ea-d51157511d41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5980e4a9cdd846b335d853a6c7c4fa7c9862c37c4858418919eae03855d094f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\
\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwtw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bccxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:29Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:29 crc kubenswrapper[4773]: I0120 18:30:29.245983 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:29Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:29 crc kubenswrapper[4773]: I0120 18:30:29.277671 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-5sv79"] Jan 20 18:30:29 crc kubenswrapper[4773]: I0120 18:30:29.278108 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-5sv79" Jan 20 18:30:29 crc kubenswrapper[4773]: I0120 18:30:29.285344 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:29Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:29 crc kubenswrapper[4773]: I0120 18:30:29.296351 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Jan 20 18:30:29 crc kubenswrapper[4773]: I0120 18:30:29.314241 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Jan 20 18:30:29 crc kubenswrapper[4773]: I0120 18:30:29.319404 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5a565d2f-43a1-41f5-b7a6-85d7d0aea0a7-serviceca\") pod \"node-ca-5sv79\" (UID: \"5a565d2f-43a1-41f5-b7a6-85d7d0aea0a7\") " pod="openshift-image-registry/node-ca-5sv79" Jan 20 18:30:29 crc 
kubenswrapper[4773]: I0120 18:30:29.319452 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dls8p\" (UniqueName: \"kubernetes.io/projected/5a565d2f-43a1-41f5-b7a6-85d7d0aea0a7-kube-api-access-dls8p\") pod \"node-ca-5sv79\" (UID: \"5a565d2f-43a1-41f5-b7a6-85d7d0aea0a7\") " pod="openshift-image-registry/node-ca-5sv79" Jan 20 18:30:29 crc kubenswrapper[4773]: I0120 18:30:29.319485 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5a565d2f-43a1-41f5-b7a6-85d7d0aea0a7-host\") pod \"node-ca-5sv79\" (UID: \"5a565d2f-43a1-41f5-b7a6-85d7d0aea0a7\") " pod="openshift-image-registry/node-ca-5sv79" Jan 20 18:30:29 crc kubenswrapper[4773]: I0120 18:30:29.334557 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Jan 20 18:30:29 crc kubenswrapper[4773]: I0120 18:30:29.353257 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Jan 20 18:30:29 crc kubenswrapper[4773]: I0120 18:30:29.368331 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-24 22:47:10.49736871 +0000 UTC Jan 20 18:30:29 crc kubenswrapper[4773]: I0120 18:30:29.405958 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kjbfj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ddd5104-3112-413e-b908-2b7f336b41f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kjbfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:29Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:29 crc kubenswrapper[4773]: I0120 18:30:29.420214 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dls8p\" (UniqueName: \"kubernetes.io/projected/5a565d2f-43a1-41f5-b7a6-85d7d0aea0a7-kube-api-access-dls8p\") pod \"node-ca-5sv79\" (UID: \"5a565d2f-43a1-41f5-b7a6-85d7d0aea0a7\") " pod="openshift-image-registry/node-ca-5sv79" Jan 20 18:30:29 crc kubenswrapper[4773]: I0120 18:30:29.420266 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5a565d2f-43a1-41f5-b7a6-85d7d0aea0a7-host\") pod \"node-ca-5sv79\" (UID: \"5a565d2f-43a1-41f5-b7a6-85d7d0aea0a7\") " pod="openshift-image-registry/node-ca-5sv79" Jan 20 18:30:29 crc kubenswrapper[4773]: I0120 18:30:29.420307 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/5a565d2f-43a1-41f5-b7a6-85d7d0aea0a7-serviceca\") pod \"node-ca-5sv79\" (UID: \"5a565d2f-43a1-41f5-b7a6-85d7d0aea0a7\") " pod="openshift-image-registry/node-ca-5sv79" Jan 20 18:30:29 crc kubenswrapper[4773]: I0120 18:30:29.420367 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5a565d2f-43a1-41f5-b7a6-85d7d0aea0a7-host\") pod \"node-ca-5sv79\" (UID: \"5a565d2f-43a1-41f5-b7a6-85d7d0aea0a7\") " pod="openshift-image-registry/node-ca-5sv79" Jan 20 18:30:29 crc kubenswrapper[4773]: I0120 18:30:29.421280 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5a565d2f-43a1-41f5-b7a6-85d7d0aea0a7-serviceca\") pod \"node-ca-5sv79\" (UID: \"5a565d2f-43a1-41f5-b7a6-85d7d0aea0a7\") " pod="openshift-image-registry/node-ca-5sv79" Jan 20 18:30:29 crc kubenswrapper[4773]: I0120 18:30:29.443608 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de86ad0ce5469489a5f3eac4940913be3a897c3ec5f48526c7396b21bca92fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:29Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:29 crc kubenswrapper[4773]: I0120 18:30:29.446828 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:30:29 crc kubenswrapper[4773]: E0120 18:30:29.446956 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 18:30:29 crc kubenswrapper[4773]: I0120 18:30:29.471515 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dls8p\" (UniqueName: \"kubernetes.io/projected/5a565d2f-43a1-41f5-b7a6-85d7d0aea0a7-kube-api-access-dls8p\") pod \"node-ca-5sv79\" (UID: \"5a565d2f-43a1-41f5-b7a6-85d7d0aea0a7\") " pod="openshift-image-registry/node-ca-5sv79" Jan 20 18:30:29 crc kubenswrapper[4773]: I0120 18:30:29.503105 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965d3f61a7d29b4a851d13ea906f7d4d1809f89d314cd0e9b08ba897108625f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48a81307df70a49c4cf901ede80a07616e64babebf760b7fb3f51fc71743812a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:29Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:29 crc kubenswrapper[4773]: I0120 18:30:29.544131 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ddd934f-f012-4083-b5e6-b99711071621\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63daf0eeb8fcb1c4b47e205c8ea4ca5071153c573cce6d1f4638e0d44a96d47e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e8755cb7894ae2c1832f3b0b8d8a6e33d3d336d
00ce68a2afe004d82304dd24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sq4x7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:29Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:29 crc kubenswrapper[4773]: I0120 18:30:29.590216 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f354424d-7f22-42d6-8bd9-00e32e78c3d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c68391eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c68391eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qt89w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:29Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:29 crc kubenswrapper[4773]: I0120 18:30:29.607900 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-5sv79" Jan 20 18:30:29 crc kubenswrapper[4773]: I0120 18:30:29.623888 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfcadb32-181d-4e84-8078-53323164fc49\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72535f55bb65c332cd4acb4ad0bf8b94a28a2167da3bba993de6ffb1d6dca1cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5be06527f116faac5a7ce7b2df804239b382a394e11c56eee69fa24739faab3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff759d4098d334dfada2f6a14e8a5d363019e0da19879b5dca11ff48b6273b24\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29039ffb258d5056b02a3ef4f48c732b8573d9437d032a225b59f0ba95527796\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:29Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:29 crc kubenswrapper[4773]: I0120 18:30:29.633654 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" event={"ID":"f354424d-7f22-42d6-8bd9-00e32e78c3d3","Type":"ContainerStarted","Data":"8c6fa189877752f110d0f9539d08a9693872f9ee0350c63384acae1dea6afbc1"} Jan 20 18:30:29 crc kubenswrapper[4773]: I0120 18:30:29.633697 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" event={"ID":"f354424d-7f22-42d6-8bd9-00e32e78c3d3","Type":"ContainerStarted","Data":"6d36ecbc289d1bc2ebcba42b6cfe938692dd9b7e2b4e1adac18ac86a273d6bf5"} Jan 20 18:30:29 crc kubenswrapper[4773]: I0120 18:30:29.633707 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" event={"ID":"f354424d-7f22-42d6-8bd9-00e32e78c3d3","Type":"ContainerStarted","Data":"a000cfa332372a402d4d81eafc6f06b988196a3a847236a94d8166f0d745b07e"} Jan 20 18:30:29 crc kubenswrapper[4773]: I0120 18:30:29.633715 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" event={"ID":"f354424d-7f22-42d6-8bd9-00e32e78c3d3","Type":"ContainerStarted","Data":"7767620b251e4c63c9397573c7915bd85831b833817ac6b939fc47d6304dc2bc"} Jan 20 18:30:29 crc kubenswrapper[4773]: I0120 18:30:29.633723 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" event={"ID":"f354424d-7f22-42d6-8bd9-00e32e78c3d3","Type":"ContainerStarted","Data":"dc96ff67e107830b7e2fe8c3c7460ca9c7b22dfadab07ea69968162e9a1b4455"} Jan 20 18:30:29 crc kubenswrapper[4773]: I0120 18:30:29.635488 4773 generic.go:334] "Generic (PLEG): container finished" podID="7ddd5104-3112-413e-b908-2b7f336b41f1" containerID="1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e" exitCode=0 Jan 20 18:30:29 crc kubenswrapper[4773]: I0120 18:30:29.635543 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kjbfj" event={"ID":"7ddd5104-3112-413e-b908-2b7f336b41f1","Type":"ContainerDied","Data":"1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e"} Jan 20 18:30:29 crc kubenswrapper[4773]: I0120 18:30:29.662226 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:29Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:29 crc kubenswrapper[4773]: W0120 18:30:29.690033 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a565d2f_43a1_41f5_b7a6_85d7d0aea0a7.slice/crio-08e407ee5d3c613bc9cd922baa1a3e8005e962004ac4422e700af5f9e99e4c1d WatchSource:0}: Error 
finding container 08e407ee5d3c613bc9cd922baa1a3e8005e962004ac4422e700af5f9e99e4c1d: Status 404 returned error can't find the container with id 08e407ee5d3c613bc9cd922baa1a3e8005e962004ac4422e700af5f9e99e4c1d Jan 20 18:30:29 crc kubenswrapper[4773]: I0120 18:30:29.702650 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:29Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:29 crc kubenswrapper[4773]: I0120 18:30:29.750190 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kjbfj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ddd5104-3112-413e-b908-2b7f336b41f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kjbfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:29Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:29 crc kubenswrapper[4773]: I0120 18:30:29.782178 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bccxn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"061a607e-1868-4fcf-b3ea-d51157511d41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5980e4a9cdd846b335d853a6c7c4fa7c9862c37c4858418919eae03855d094f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwtw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bccxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:29Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:29 crc kubenswrapper[4773]: I0120 18:30:29.827607 4773 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:29Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:29 crc kubenswrapper[4773]: I0120 18:30:29.866186 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de86ad0ce5469489a5f3eac4940913be3a897c3ec5f48526c7396b21bca92fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:29Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:29 crc kubenswrapper[4773]: I0120 18:30:29.887564 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Jan 20 18:30:29 crc kubenswrapper[4773]: I0120 18:30:29.900769 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Jan 20 18:30:29 crc kubenswrapper[4773]: I0120 18:30:29.906029 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965d3f61a7d29b4a851d13ea906f7d4d1809f89d314cd0e9b08ba897108625f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\
\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48a81307df70a49c4cf901ede80a07616e64babebf760b7fb3f51fc71743812a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:29Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:29 crc kubenswrapper[4773]: I0120 18:30:29.927443 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Jan 20 18:30:29 crc kubenswrapper[4773]: I0120 18:30:29.963394 4773 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ddd934f-f012-4083-b5e6-b99711071621\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63daf0eeb8fcb1c4b47e205c8ea4ca5071153c573cce6d1f4638e0d44a96d47e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"rec
ursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e8755cb7894ae2c1832f3b0b8d8a6e33d3d336d00ce68a2afe004d82304dd24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sq4x7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:29Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:30 crc kubenswrapper[4773]: I0120 18:30:30.028152 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 18:30:30 crc kubenswrapper[4773]: 
E0120 18:30:30.028249 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 18:30:34.028233087 +0000 UTC m=+26.950046111 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:30:30 crc kubenswrapper[4773]: I0120 18:30:30.028378 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:30:30 crc kubenswrapper[4773]: E0120 18:30:30.028473 4773 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 20 18:30:30 crc kubenswrapper[4773]: E0120 18:30:30.028517 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-20 18:30:34.028505663 +0000 UTC m=+26.950318687 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 20 18:30:30 crc kubenswrapper[4773]: I0120 18:30:30.030369 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f354424d-7f22-42d6-8bd9-00e32e78c3d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c68391eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c68391eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qt89w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:30Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:30 crc kubenswrapper[4773]: I0120 18:30:30.063370 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfcadb32-181d-4e84-8078-53323164fc49\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72535f55bb65c332cd4acb4ad0bf8b94a28a2167da3bba993de6ffb1d6dca1cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a7
9379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5be06527f116faac5a7ce7b2df804239b382a394e11c56eee69fa24739faab3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff759d4098d334dfada2f6a14e8a5d363019e0da19879b5dca11ff48b6273b24\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29039ffb258d5056b02a3ef4f48c732b8573d9437d032a225b59f0ba95527796\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:30Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:30 crc kubenswrapper[4773]: I0120 18:30:30.083925 4773 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:30Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:30 crc kubenswrapper[4773]: I0120 18:30:30.124482 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9b329af-a6e2-4ba8-b70d-f1ad0cd67671\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13ef504628d3252d01c26801cf91a4fabdfbd5d8a78e60e9666beafa709816a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af183a195b31fc8b191dc75db28863447424fb2763e1f51a5f37ce78c4b046e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://05f5db3e67f63a4ed226186eec7eebf95687a59c3d742a7d29b2be3f99543a0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a1628ccb969f4cdde3babedb802c79d7bba385260dcefc8140ed674fb423da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab6fab567cc35a379572c4ea5fb0f9472f3634fcede213202390e12c5cf6f2f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0120 18:30:20.840178 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0120 18:30:20.842359 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1141381009/tls.crt::/tmp/serving-cert-1141381009/tls.key\\\\\\\"\\\\nI0120 18:30:26.279650 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0120 18:30:26.285211 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0120 18:30:26.285271 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0120 18:30:26.285327 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0120 18:30:26.285335 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0120 18:30:26.291960 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0120 18:30:26.291983 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291990 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291998 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0120 18:30:26.292002 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0120 18:30:26.292006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0120 18:30:26.292010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0120 18:30:26.292142 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0120 18:30:26.296450 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06ac78bba5a5a697d2246518bb04b7503b37c2fafd2369e5826dd02b467715d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:30Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:30 crc kubenswrapper[4773]: I0120 18:30:30.128950 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:30:30 crc kubenswrapper[4773]: I0120 18:30:30.128999 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:30:30 crc kubenswrapper[4773]: I0120 18:30:30.129025 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:30:30 
crc kubenswrapper[4773]: E0120 18:30:30.129146 4773 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 20 18:30:30 crc kubenswrapper[4773]: E0120 18:30:30.129170 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 20 18:30:30 crc kubenswrapper[4773]: E0120 18:30:30.129192 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 20 18:30:30 crc kubenswrapper[4773]: E0120 18:30:30.129190 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 20 18:30:30 crc kubenswrapper[4773]: E0120 18:30:30.129206 4773 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 20 18:30:30 crc kubenswrapper[4773]: E0120 18:30:30.129217 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-20 18:30:34.129199204 +0000 UTC m=+27.051012228 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 20 18:30:30 crc kubenswrapper[4773]: E0120 18:30:30.129221 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 20 18:30:30 crc kubenswrapper[4773]: E0120 18:30:30.129238 4773 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 20 18:30:30 crc kubenswrapper[4773]: E0120 18:30:30.129258 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-20 18:30:34.129241995 +0000 UTC m=+27.051055059 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 20 18:30:30 crc kubenswrapper[4773]: E0120 18:30:30.129286 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-20 18:30:34.129271545 +0000 UTC m=+27.051084569 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 20 18:30:30 crc kubenswrapper[4773]: I0120 18:30:30.161449 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:30Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:30 crc kubenswrapper[4773]: I0120 18:30:30.202250 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gczfj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"357ca347-8fa9-4f0b-9f49-a540f14e0198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48c821d7f174268680f0e1197457126c88076eded93bbcddb5a0d60b1fa9fc80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4b9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gczfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:30Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:30 crc kubenswrapper[4773]: I0120 18:30:30.240681 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5sv79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a565d2f-43a1-41f5-b7a6-85d7d0aea0a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dls8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5sv79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:30Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:30 crc kubenswrapper[4773]: I0120 18:30:30.285472 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfcadb32-181d-4e84-8078-53323164fc49\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72535f55bb65c332cd4acb4ad0bf8b94a28a2167da3bba993de6ffb1d6dca1cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5be06527f116faac5a7ce7b2df804239b382a394e11c56eee69fa24739faab3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff759d4098d334dfada2f6a14e8a5d363019e0da19879b5dca11ff48b6273b24\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29039ffb258d5056b02a3ef4f48c732b8573d9437d032a225b59f0ba95527796\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:30Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:30 crc kubenswrapper[4773]: I0120 18:30:30.322660 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:30Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:30 crc kubenswrapper[4773]: I0120 18:30:30.361689 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gczfj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"357ca347-8fa9-4f0b-9f49-a540f14e0198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48c821d7f174268680f0e1197457126c88076eded93bbcddb5a0d60b1fa9fc80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4b9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gczfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:30Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:30 crc kubenswrapper[4773]: I0120 18:30:30.368788 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-18 14:31:27.096957975 +0000 UTC Jan 20 18:30:30 crc kubenswrapper[4773]: I0120 18:30:30.400886 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5sv79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a565d2f-43a1-41f5-b7a6-85d7d0aea0a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dls8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5sv79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:30Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:30 crc kubenswrapper[4773]: I0120 18:30:30.441897 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9b329af-a6e2-4ba8-b70d-f1ad0cd67671\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13ef504628d3252d01c26801cf91a4fabdfbd5d8a78e60e9666beafa709816a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af183a195b31fc8b191dc75db28863447424fb2763e1f51a5f37ce78c4b046e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05f5db3e67f63a4ed226186eec7eebf95687a59c3d742a7d29b2be3f99543a0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a1628ccb969f4cdde3babedb802c79d7bba385260dcefc8140ed674fb423da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab6fab567cc35a379572c4ea5fb0f9472f3634fcede213202390e12c5cf6f2f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0120 18:30:20.840178 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0120 18:30:20.842359 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1141381009/tls.crt::/tmp/serving-cert-1141381009/tls.key\\\\\\\"\\\\nI0120 18:30:26.279650 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0120 18:30:26.285211 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0120 18:30:26.285271 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0120 18:30:26.285327 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0120 18:30:26.285335 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0120 18:30:26.291960 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0120 18:30:26.291983 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291990 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291998 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0120 18:30:26.292002 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0120 18:30:26.292006 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0120 18:30:26.292010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0120 18:30:26.292142 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0120 18:30:26.296450 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06ac78bba5a5a697d2246518bb04b7503b37c2fafd2369e5826dd02b467715d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:30Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:30 crc kubenswrapper[4773]: I0120 18:30:30.446983 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:30:30 crc kubenswrapper[4773]: I0120 18:30:30.447090 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:30:30 crc kubenswrapper[4773]: E0120 18:30:30.447120 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 18:30:30 crc kubenswrapper[4773]: E0120 18:30:30.447291 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 18:30:30 crc kubenswrapper[4773]: I0120 18:30:30.484229 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could 
not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:30Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:30 crc kubenswrapper[4773]: I0120 18:30:30.527864 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee8e82f1-ac2b-4e32-ace8-c3e6edfb55b4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17efc7a160f4cb934a3927c68e65b9f1387ed13881b8c9ca676d30c0d241ace0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b89a8e0fd81a1ccb6675b46b9a7425fec5158a53cc7e4314068fa737d0becda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f63cd12ee6d399e1a6f77a55f05a532a2de3b4cd05a5e10a93f521364473a403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d78e43bc7b09a1599f98e9e72fd0f49db94102e952a391f687d50abed277ceda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de08872865ebda9c9a4e1642d9c4d4f6c709908332ffbd091f992fad7722eca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:30Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:30 crc kubenswrapper[4773]: I0120 18:30:30.564200 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:30Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:30 crc kubenswrapper[4773]: I0120 18:30:30.603243 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:30Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:30 crc kubenswrapper[4773]: I0120 18:30:30.639923 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-5sv79" event={"ID":"5a565d2f-43a1-41f5-b7a6-85d7d0aea0a7","Type":"ContainerStarted","Data":"d4eb26301ab8154925399887204867f95a36003c308ac49adb8a2867ae29ac39"} Jan 20 18:30:30 crc kubenswrapper[4773]: I0120 18:30:30.639995 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-5sv79" event={"ID":"5a565d2f-43a1-41f5-b7a6-85d7d0aea0a7","Type":"ContainerStarted","Data":"08e407ee5d3c613bc9cd922baa1a3e8005e962004ac4422e700af5f9e99e4c1d"} Jan 20 18:30:30 crc kubenswrapper[4773]: I0120 18:30:30.641610 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"443153c40f503abca43f8d62276e852dfc67a4ad50beb817bcc4e7c9f1893d4e"} Jan 20 18:30:30 crc kubenswrapper[4773]: I0120 18:30:30.643735 4773 generic.go:334] "Generic (PLEG): container finished" podID="7ddd5104-3112-413e-b908-2b7f336b41f1" containerID="7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f" exitCode=0 Jan 20 18:30:30 crc kubenswrapper[4773]: I0120 18:30:30.643776 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kjbfj" event={"ID":"7ddd5104-3112-413e-b908-2b7f336b41f1","Type":"ContainerDied","Data":"7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f"} Jan 20 18:30:30 crc kubenswrapper[4773]: I0120 18:30:30.645267 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kjbfj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ddd5104-3112-413e-b908-2b7f336b41f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"re
ason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kjbfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:30Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:30 crc kubenswrapper[4773]: I0120 18:30:30.651077 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" 
event={"ID":"f354424d-7f22-42d6-8bd9-00e32e78c3d3","Type":"ContainerStarted","Data":"aeac2ae7ead2dd1144c00709ea7162dfe1e360dc304726b031970bc989f11467"} Jan 20 18:30:30 crc kubenswrapper[4773]: I0120 18:30:30.685648 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bccxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"061a607e-1868-4fcf-b3ea-d51157511d41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5980e4a9cdd846b335d853a6c7c4fa7c9862c37c4858418919eae03855d094f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountP
ath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwtw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bccxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:30Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:30 crc kubenswrapper[4773]: I0120 18:30:30.724016 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ddd934f-f012-4083-b5e6-b99711071621\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63daf0eeb8fcb1c4b47e205c8ea4ca5071153c573cce6d1f4638e0d44a96d47e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\
\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e8755cb7894ae2c1832f3b0b8d8a6e33d3d336d00ce68a2afe004d82304dd24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sq4x7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:30Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:30 crc kubenswrapper[4773]: I0120 18:30:30.772672 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f354424d-7f22-42d6-8bd9-00e32e78c3d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c68391eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c68391eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qt89w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:30Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:30 crc kubenswrapper[4773]: I0120 18:30:30.807252 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de86ad0ce5469489a5f3eac4940913be3a897c3ec5f48526c7396b21bca92fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:30Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:30 crc kubenswrapper[4773]: I0120 18:30:30.844534 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965d3f61a7d29b4a851d13ea906f7d4d1809f89d314cd0e9b08ba897108625f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48a81307df70a49c4cf901ede80a07616e64babebf760b7fb3f51fc71743812a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:30Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:30 crc kubenswrapper[4773]: I0120 18:30:30.883155 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfcadb32-181d-4e84-8078-53323164fc49\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72535f55bb65c332cd4acb4ad0bf8b94a28a2167da3bba993de6ffb1d6dca1cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5be06527f116faac5a7ce7b2df804239b382a394e11c56eee69fa24739faab3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff759d4098d334dfada2f6a14e8a5d363019e0da19879b5dca11ff48b6273b24\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29039ffb258d5056b02a3ef4f48c732b8573d9437d032a225b59f0ba95527796\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:30Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:30 crc kubenswrapper[4773]: I0120 18:30:30.924872 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:30Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:30 crc kubenswrapper[4773]: I0120 18:30:30.966755 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9b329af-a6e2-4ba8-b70d-f1ad0cd67671\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13ef504628d3252d01c26801cf91a4fabdfbd5d8a78e60e9666beafa709816a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af183a195b31fc8b191dc75db28863447424fb2763e1f51a5f37ce78c4b046e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05f5db3e67f63a4ed226186eec7eebf95687a59c3d742a7d29b2be3f99543a0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a1628ccb969f4cdde3babedb802c79d7bba385260dcefc8140ed674fb423da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab6fab567cc35a379572c4ea5fb0f9472f3634fcede213202390e12c5cf6f2f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0120 18:30:20.840178 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0120 18:30:20.842359 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1141381009/tls.crt::/tmp/serving-cert-1141381009/tls.key\\\\\\\"\\\\nI0120 18:30:26.279650 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0120 18:30:26.285211 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0120 18:30:26.285271 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0120 18:30:26.285327 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0120 18:30:26.285335 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0120 18:30:26.291960 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0120 18:30:26.291983 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291990 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291998 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0120 18:30:26.292002 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0120 18:30:26.292006 1 
secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0120 18:30:26.292010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0120 18:30:26.292142 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0120 18:30:26.296450 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06ac78bba5a5a697d2246518bb04b7503b37c2fafd2369e5826dd02b467715d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:30Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:31 crc kubenswrapper[4773]: I0120 18:30:31.006756 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:31Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:31 crc kubenswrapper[4773]: I0120 18:30:31.043036 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gczfj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"357ca347-8fa9-4f0b-9f49-a540f14e0198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48c821d7f174268680f0e1197457126c88076eded93bbcddb5a0d60b1fa9fc80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4b9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gczfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:31Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:31 crc kubenswrapper[4773]: I0120 18:30:31.081988 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5sv79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a565d2f-43a1-41f5-b7a6-85d7d0aea0a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4eb26301ab8154925399887204867f95a36003c308ac49adb8a2867ae29ac39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dls8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5sv79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:31Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:31 crc kubenswrapper[4773]: I0120 18:30:31.129795 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kjbfj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ddd5104-3112-413e-b908-2b7f336b41f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kjbfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:31Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:31 crc kubenswrapper[4773]: I0120 
18:30:31.168148 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bccxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"061a607e-1868-4fcf-b3ea-d51157511d41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5980e4a9cdd846b335d853a6c7c4fa7c9862c37c4858418919eae03855d094f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/n
et.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwtw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bccxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:31Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:31 crc kubenswrapper[4773]: I0120 
18:30:31.222084 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee8e82f1-ac2b-4e32-ace8-c3e6edfb55b4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17efc7a160f4cb934a3927c68e65b9f1387ed13881b8c9ca676d30c0d241ace0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\
"cri-o://1b89a8e0fd81a1ccb6675b46b9a7425fec5158a53cc7e4314068fa737d0becda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f63cd12ee6d399e1a6f77a55f05a532a2de3b4cd05a5e10a93f521364473a403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d78e43bc7b09a1599f98e9e72fd0f49db94102e952a391f687d50abed277ceda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93
\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de08872865ebda9c9a4e1642d9c4d4f6c709908332ffbd091f992fad7722eca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":
{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernet
es/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:31Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:31 crc kubenswrapper[4773]: I0120 18:30:31.244176 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:31Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:31 crc kubenswrapper[4773]: I0120 18:30:31.283672 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://443153c40f503abca43f8d62276e852dfc67a4ad50beb817bcc4e7c9f1893d4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-20T18:30:31Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:31 crc kubenswrapper[4773]: I0120 18:30:31.322293 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de86ad0ce5469489a5f3eac4940913be3a897c3ec5f48526c7396b21bca92fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:31Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:31 crc kubenswrapper[4773]: I0120 18:30:31.364158 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965d3f61a7d29b4a851d13ea906f7d4d1809f89d314cd0e9b08ba897108625f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\
\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48a81307df70a49c4cf901ede80a07616e64babebf760b7fb3f51fc71743812a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:31Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:31 crc kubenswrapper[4773]: I0120 18:30:31.369276 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-27 23:03:17.68404031 +0000 UTC Jan 20 18:30:31 crc 
kubenswrapper[4773]: I0120 18:30:31.405034 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ddd934f-f012-4083-b5e6-b99711071621\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63daf0eeb8fcb1c4b47e205c8ea4ca5071153c573cce6d1f4638e0d44a96d47e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/se
rviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e8755cb7894ae2c1832f3b0b8d8a6e33d3d336d00ce68a2afe004d82304dd24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sq4x7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:31Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:31 crc kubenswrapper[4773]: I0120 18:30:31.445695 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f354424d-7f22-42d6-8bd9-00e32e78c3d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c68391eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c68391eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qt89w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:31Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:31 crc kubenswrapper[4773]: I0120 18:30:31.446082 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:30:31 crc kubenswrapper[4773]: E0120 18:30:31.446240 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 18:30:31 crc kubenswrapper[4773]: I0120 18:30:31.655166 4773 generic.go:334] "Generic (PLEG): container finished" podID="7ddd5104-3112-413e-b908-2b7f336b41f1" containerID="8b981d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2" exitCode=0 Jan 20 18:30:31 crc kubenswrapper[4773]: I0120 18:30:31.655216 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kjbfj" event={"ID":"7ddd5104-3112-413e-b908-2b7f336b41f1","Type":"ContainerDied","Data":"8b981d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2"} Jan 20 18:30:31 crc kubenswrapper[4773]: I0120 18:30:31.670565 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:31Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:31 crc kubenswrapper[4773]: I0120 18:30:31.682698 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://443153c40f503abca43f8d62276e852dfc67a4ad50beb817bcc4e7c9f1893d4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-20T18:30:31Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:31 crc kubenswrapper[4773]: I0120 18:30:31.701780 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kjbfj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ddd5104-3112-413e-b908-2b7f336b41f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b981d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b981d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-kjbfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:31Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:31 crc kubenswrapper[4773]: I0120 18:30:31.722423 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bccxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"061a607e-1868-4fcf-b3ea-d51157511d41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5980e4a9cdd846b335d853a6c7c4fa7c9862c37c4858418919eae03855d094f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"resta
rtCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwtw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\
\\"}}\" for pod \"openshift-multus\"/\"multus-bccxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:31Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:31 crc kubenswrapper[4773]: I0120 18:30:31.747322 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee8e82f1-ac2b-4e32-ace8-c3e6edfb55b4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17efc7a160f4cb934a3927c68e65b9f1387ed13881b8c9ca676d30c0d241ace0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20
T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b89a8e0fd81a1ccb6675b46b9a7425fec5158a53cc7e4314068fa737d0becda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f63cd12ee6d399e1a6f77a55f05a532a2de3b4cd05a5e10a93f521364473a403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\
\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d78e43bc7b09a1599f98e9e72fd0f49db94102e952a391f687d50abed277ceda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de08872865ebda9c9a4e1642d9c4d4f6c709908332ffbd091f992fad7722eca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"image\\\
":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resource
s-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:31Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:31 crc kubenswrapper[4773]: I0120 18:30:31.770249 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f354424d-7f22-42d6-8bd9-00e32e78c3d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c68391eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c68391eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qt89w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:31Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:31 crc kubenswrapper[4773]: I0120 18:30:31.785955 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de86ad0ce5469489a5f3eac4940913be3a897c3ec5f48526c7396b21bca92fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:31Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:31 crc kubenswrapper[4773]: I0120 18:30:31.797525 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965d3f61a7d29b4a851d13ea906f7d4d1809f89d314cd0e9b08ba897108625f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48a81307df70a49c4cf901ede80a07616e64babebf760b7fb3f51fc71743812a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:31Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:31 crc kubenswrapper[4773]: I0120 18:30:31.807909 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ddd934f-f012-4083-b5e6-b99711071621\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63daf0eeb8fcb1c4b47e205c8ea4ca5071153c573cce6d1f4638e0d44a96d47e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e8755cb7894ae2c1832f3b0b8d8a6e33d3d336d
00ce68a2afe004d82304dd24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sq4x7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:31Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:31 crc kubenswrapper[4773]: I0120 18:30:31.842034 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfcadb32-181d-4e84-8078-53323164fc49\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72535f55bb65c332cd4acb4ad0bf8b94a28a2167da3bba993de6ffb1d6dca1cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5be06527f116faac5a7ce7b2df804239b382a394e11c56eee69fa24739faab3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff759d4098d334dfada2f6a14e8a5d363019e0da19879b5dca11ff48b6273b24\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29039ffb258d5056b02a3ef4f48c732b8573d9437d032a225b59f0ba95527796\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:31Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:31 crc kubenswrapper[4773]: I0120 18:30:31.886883 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:31Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:31 crc kubenswrapper[4773]: I0120 18:30:31.921623 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5sv79" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a565d2f-43a1-41f5-b7a6-85d7d0aea0a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4eb26301ab8154925399887204867f95a36003c308ac49adb8a2867ae29ac39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dls8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5sv79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:31Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:31 crc kubenswrapper[4773]: I0120 18:30:31.964843 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9b329af-a6e2-4ba8-b70d-f1ad0cd67671\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13ef504628d3252d01c26801cf91a4fabdfbd5d8a78e60e9666beafa709816a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af183a195b31fc8b191dc75db28863447424fb2763e1f51a5f37ce78c4b046e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://05f5db3e67f63a4ed226186eec7eebf95687a59c3d742a7d29b2be3f99543a0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a1628ccb969f4cdde3babedb802c79d7bba385260dcefc8140ed674fb423da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab6fab567cc35a379572c4ea5fb0f9472f3634fcede213202390e12c5cf6f2f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0120 18:30:20.840178 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0120 18:30:20.842359 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1141381009/tls.crt::/tmp/serving-cert-1141381009/tls.key\\\\\\\"\\\\nI0120 18:30:26.279650 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0120 18:30:26.285211 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0120 18:30:26.285271 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0120 18:30:26.285327 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0120 18:30:26.285335 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0120 18:30:26.291960 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0120 18:30:26.291983 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291990 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291998 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0120 18:30:26.292002 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0120 18:30:26.292006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0120 18:30:26.292010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0120 18:30:26.292142 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0120 18:30:26.296450 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06ac78bba5a5a697d2246518bb04b7503b37c2fafd2369e5826dd02b467715d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:31Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.001794 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:31Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.040577 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gczfj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"357ca347-8fa9-4f0b-9f49-a540f14e0198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48c821d7f174268680f0e1197457126c88076eded93bbcddb5a0d60b1fa9fc80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4b9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gczfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:32Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.370400 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-18 07:01:28.984350076 +0000 UTC Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.446084 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.446226 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:30:32 crc kubenswrapper[4773]: E0120 18:30:32.446249 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 18:30:32 crc kubenswrapper[4773]: E0120 18:30:32.446472 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.656306 4773 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.658166 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.658196 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.658204 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.658299 4773 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.661221 4773 generic.go:334] "Generic (PLEG): container finished" podID="7ddd5104-3112-413e-b908-2b7f336b41f1" containerID="53e0dfb1a04051b30484f98897d86ff5adb9ef680437d01e43cd37b4a2404bce" exitCode=0 Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.661292 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kjbfj" event={"ID":"7ddd5104-3112-413e-b908-2b7f336b41f1","Type":"ContainerDied","Data":"53e0dfb1a04051b30484f98897d86ff5adb9ef680437d01e43cd37b4a2404bce"} Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.666579 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" event={"ID":"f354424d-7f22-42d6-8bd9-00e32e78c3d3","Type":"ContainerStarted","Data":"68d98d68d006df3d69e253299966bef755778e0cdab71a9ff91d2829ae761672"} Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.669912 4773 kubelet_node_status.go:115] "Node was previously registered" node="crc" Jan 20 18:30:32 crc 
kubenswrapper[4773]: I0120 18:30:32.670655 4773 kubelet_node_status.go:79] "Successfully registered node" node="crc" Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.671766 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.671868 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.671961 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.672040 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.672110 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:32Z","lastTransitionTime":"2026-01-20T18:30:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.678559 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de86ad0ce5469489a5f3eac4940913be3a897c3ec5f48526c7396b21bca92fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:32Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:32 crc kubenswrapper[4773]: E0120 18:30:32.686452 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"
registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\
"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb617
3ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"reg
istry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@s
ha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ac96020-64a6-43b4-8bf4-975de5898510\\\",\\\"systemUUID\\\":\\\"3435f284-a40d-4f32-a1fa-55cd3339f30e\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:32Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.689975 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.690028 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.690040 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.690057 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.690069 4773 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:32Z","lastTransitionTime":"2026-01-20T18:30:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.692828 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965d3f61a7d29b4a851d13ea906f7d4d1809f89d314cd0e9b08ba897108625f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-ide
ntity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48a81307df70a49c4cf901ede80a07616e64babebf760b7fb3f51fc71743812a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:32Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:32 crc kubenswrapper[4773]: E0120 18:30:32.703328 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ac96020-64a6-43b4-8bf4-975de5898510\\\",\\\"systemUUID\\\":\\\"3435f284-a40d-4f32-a1fa-55cd3339f30e\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:32Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.706633 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ddd934f-f012-4083-b5e6-b99711071621\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63daf0eeb8fcb1c4b47e205c8ea4ca5071153c573cce6d1f4638e0d44a96d47e\\\",\\\"image\\\":\\\"quay.io/o
penshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e8755cb7894ae2c1832f3b0b8d8a6e33d3d336d00ce68a2afe004d82304dd24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-sq4x7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:32Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.707390 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.707431 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.707442 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.707458 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.707469 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:32Z","lastTransitionTime":"2026-01-20T18:30:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:32 crc kubenswrapper[4773]: E0120 18:30:32.720697 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ac96020-64a6-43b4-8bf4-975de5898510\\\",\\\"systemUUID\\\":\\\"3435f284-a40d-4f32-a1fa-55cd3339f30e\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:32Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.723901 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.723941 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.723950 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.723964 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.723973 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:32Z","lastTransitionTime":"2026-01-20T18:30:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.735171 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f354424d-7f22-42d6-8bd9-00e32e78c3d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c68391eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c68391eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qt89w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:32Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:32 crc kubenswrapper[4773]: E0120 18:30:32.738660 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ac96020-64a6-43b4-8bf4-975de5898510\\\",\\\"systemUUID\\\":\\\"3435f284-a40d-4f32-a1fa-55cd3339f30e\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:32Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.741034 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.741067 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.741077 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.741089 4773 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.741097 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:32Z","lastTransitionTime":"2026-01-20T18:30:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.749493 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfcadb32-181d-4e84-8078-53323164fc49\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72535f55bb65c332cd4acb4ad0bf8b94a28a2167da3bba993de6ffb1d6dca1cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/o
cp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5be06527f116faac5a7ce7b2df804239b382a394e11c56eee69fa24739faab3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff759d4098d334dfada2f6a14e8a5d363019e0da19879b5dca11ff48b6273b24\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\
\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29039ffb258d5056b02a3ef4f48c732b8573d9437d032a225b59f0ba95527796\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:32Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:32 crc kubenswrapper[4773]: E0120 18:30:32.752216 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ac96020-64a6-43b4-8bf4-975de5898510\\\",\\\"systemUUID\\\":\\\"3435f284-a40d-4f32-a1fa-55cd3339f30e\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:32Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:32 crc kubenswrapper[4773]: E0120 18:30:32.755574 4773 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.758152 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.758188 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.758197 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.758210 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.758218 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:32Z","lastTransitionTime":"2026-01-20T18:30:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.763146 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:32Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.776827 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9b329af-a6e2-4ba8-b70d-f1ad0cd67671\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13ef504628d3252d01c26801cf91a4fabdfbd5d8a78e60e9666beafa709816a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af183a195b31fc8b191dc75db28863447424fb2763e1f51a5f37ce78c4b046e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://05f5db3e67f63a4ed226186eec7eebf95687a59c3d742a7d29b2be3f99543a0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a1628ccb969f4cdde3babedb802c79d7bba385260dcefc8140ed674fb423da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab6fab567cc35a379572c4ea5fb0f9472f3634fcede213202390e12c5cf6f2f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0120 18:30:20.840178 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0120 18:30:20.842359 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1141381009/tls.crt::/tmp/serving-cert-1141381009/tls.key\\\\\\\"\\\\nI0120 18:30:26.279650 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0120 18:30:26.285211 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0120 18:30:26.285271 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0120 18:30:26.285327 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0120 18:30:26.285335 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0120 18:30:26.291960 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0120 18:30:26.291983 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291990 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291998 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0120 18:30:26.292002 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0120 18:30:26.292006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0120 18:30:26.292010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0120 18:30:26.292142 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0120 18:30:26.296450 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06ac78bba5a5a697d2246518bb04b7503b37c2fafd2369e5826dd02b467715d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:32Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.789444 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:32Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.798264 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gczfj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"357ca347-8fa9-4f0b-9f49-a540f14e0198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48c821d7f174268680f0e1197457126c88076eded93bbcddb5a0d60b1fa9fc80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4b9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gczfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:32Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.806744 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5sv79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a565d2f-43a1-41f5-b7a6-85d7d0aea0a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4eb26301ab8154925399887204867f95a36003c308ac49adb8a2867ae29ac39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dls8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5sv79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:32Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.820742 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kjbfj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ddd5104-3112-413e-b908-2b7f336b41f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b981d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b981d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53e0dfb1a04051b30484f98897d86ff5adb9ef680437d01e43cd37b4a2404bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53e0dfb1a04051b30484f98897d86ff5adb9ef680437d01e43cd37b4a2404bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kjbfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:32Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.833908 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bccxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"061a607e-1868-4fcf-b3ea-d51157511d41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5980e4
a9cdd846b335d853a6c7c4fa7c9862c37c4858418919eae03855d094f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountP
ath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwtw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bccxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:32Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.852562 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee8e82f1-ac2b-4e32-ace8-c3e6edfb55b4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17efc7a160f4cb934a3927c68e65b9f1387ed13881b8c9ca676d30c0d241ace0\\\",\\\"image\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b89a8e0fd81a1ccb6675b46b9a7425fec5158a53cc7e4314068fa737d0becda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f63cd12ee6d399e1a6f77a55f05a532a2de3b4cd05a5e10a93f521364473a403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d78e43bc7b09a1599f98e9e72fd0f49db94102e952a391f687d50abed277ceda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de08872865ebda9c9a4e1642d9c4d4f6c709908332ffbd091f992fad7722eca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\
\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}}},{\\\"con
tainerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:32Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.860347 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.860390 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:32 crc kubenswrapper[4773]: 
I0120 18:30:32.860402 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.860418 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.860429 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:32Z","lastTransitionTime":"2026-01-20T18:30:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.870782 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:32Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.886140 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://443153c40f503abca43f8d62276e852dfc67a4ad50beb817bcc4e7c9f1893d4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-20T18:30:32Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.963485 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.963527 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.963541 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.963560 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:32 crc kubenswrapper[4773]: I0120 18:30:32.963574 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:32Z","lastTransitionTime":"2026-01-20T18:30:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.066263 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.066309 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.066320 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.066333 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.066342 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:33Z","lastTransitionTime":"2026-01-20T18:30:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.168664 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.168695 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.168703 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.168715 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.168724 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:33Z","lastTransitionTime":"2026-01-20T18:30:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.271049 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.271095 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.271109 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.271127 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.271138 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:33Z","lastTransitionTime":"2026-01-20T18:30:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.370777 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-04 16:23:28.091367264 +0000 UTC Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.373744 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.373782 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.373794 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.373811 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.373844 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:33Z","lastTransitionTime":"2026-01-20T18:30:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.446438 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:30:33 crc kubenswrapper[4773]: E0120 18:30:33.446690 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.476812 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.476870 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.476881 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.476902 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.477280 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:33Z","lastTransitionTime":"2026-01-20T18:30:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.580376 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.580429 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.580441 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.580460 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.580471 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:33Z","lastTransitionTime":"2026-01-20T18:30:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.682235 4773 generic.go:334] "Generic (PLEG): container finished" podID="7ddd5104-3112-413e-b908-2b7f336b41f1" containerID="a42063e8c09492c5670e90e5ced6405536f594cb5ae490cdacb00320e735d157" exitCode=0 Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.682333 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kjbfj" event={"ID":"7ddd5104-3112-413e-b908-2b7f336b41f1","Type":"ContainerDied","Data":"a42063e8c09492c5670e90e5ced6405536f594cb5ae490cdacb00320e735d157"} Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.682827 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.682874 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.682887 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.682912 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.682925 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:33Z","lastTransitionTime":"2026-01-20T18:30:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.709208 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfcadb32-181d-4e84-8078-53323164fc49\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72535f55bb65c332cd4acb4ad0bf8b94a28a2167da3bba993de6ffb1d6dca1cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5be06527f1
16faac5a7ce7b2df804239b382a394e11c56eee69fa24739faab3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff759d4098d334dfada2f6a14e8a5d363019e0da19879b5dca11ff48b6273b24\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29039ffb258d5056b02a3ef4f48c732b8573d9437d032a225b59f0ba95527796\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:850
6ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:33Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.726882 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:33Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.744304 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gczfj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"357ca347-8fa9-4f0b-9f49-a540f14e0198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48c821d7f174268680f0e1197457126c88076eded93bbcddb5a0d60b1fa9fc80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4b9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gczfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:33Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.758992 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5sv79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a565d2f-43a1-41f5-b7a6-85d7d0aea0a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4eb26301ab8154925399887204867f95a36003c308ac49adb8a2867ae29ac39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dls8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5sv79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:33Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.777385 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9b329af-a6e2-4ba8-b70d-f1ad0cd67671\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13ef504628d3252d01c26801cf91a4fabdfbd5d8a78e60e9666beafa709816a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af183a195b31fc8b191dc75db28863447424fb2763e1f51a5f37ce78c4b046e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05f5db3e67f63a4ed226186eec7eebf95687a59c3d742a7d29b2be3f99543a0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a1628ccb969f4cdde3babedb802c79d7bba385260dcefc8140ed674fb423da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab6fab567cc35a379572c4ea5fb0f9472f3634fcede213202390e12c5cf6f2f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0120 18:30:20.840178 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0120 18:30:20.842359 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1141381009/tls.crt::/tmp/serving-cert-1141381009/tls.key\\\\\\\"\\\\nI0120 18:30:26.279650 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0120 18:30:26.285211 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0120 18:30:26.285271 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0120 18:30:26.285327 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0120 18:30:26.285335 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0120 18:30:26.291960 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0120 18:30:26.291983 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291990 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291998 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0120 18:30:26.292002 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0120 18:30:26.292006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0120 18:30:26.292010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0120 18:30:26.292142 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0120 18:30:26.296450 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06ac78bba5a5a697d2246518bb04b7503b37c2fafd2369e5826dd02b467715d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:33Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.786211 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.786265 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.786280 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.786299 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.786313 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:33Z","lastTransitionTime":"2026-01-20T18:30:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.794304 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:33Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.820447 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee8e82f1-ac2b-4e32-ace8-c3e6edfb55b4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17efc7a160f4cb934a3927c68e65b9f1387ed13881b8c9ca676d30c0d241ace0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b89a8e0fd81a1ccb6675b46b9a7425fec5158a53cc7e4314068fa737d0becda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f63cd12ee6d399e1a6f77a55f05a532a2de3b4cd05a5e10a93f521364473a403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d78e43bc7b09a1599f98e9e72fd0f49db94102e952a391f687d50abed277ceda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de08872865ebda9c9a4e1642d9c4d4f6c709908332ffbd091f992fad7722eca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:33Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.839232 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:33Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.855218 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://443153c40f503abca43f8d62276e852dfc67a4ad50beb817bcc4e7c9f1893d4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-20T18:30:33Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.872843 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kjbfj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ddd5104-3112-413e-b908-2b7f336b41f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b981d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b981d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53e0dfb1a04051b30484f98897d86ff5adb9ef680437d01e43cd37b4a2404bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53e0dfb1a04051b30484f98897d86ff5adb9ef680437d01e43cd37b4a2404bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42063e8c09492c5670e90e5ced6405536f594cb5ae490cdacb00320e735d157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42063e8c09492c5670e90e5ced6405536f594cb5ae490cdacb00320e735d157\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kjbfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:33Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.890426 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bccxn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"061a607e-1868-4fcf-b3ea-d51157511d41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5980e4a9cdd846b335d853a6c7c4fa7c9862c37c4858418919eae03855d094f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwtw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bccxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:33Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.898473 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:33 crc 
kubenswrapper[4773]: I0120 18:30:33.898524 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.898535 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.898554 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.898564 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:33Z","lastTransitionTime":"2026-01-20T18:30:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.909059 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ddd934f-f012-4083-b5e6-b99711071621\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63daf0eeb8fcb1c4b47e205c8ea4ca5071153c573cce6d1f4638e0d44a96d47e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e8755cb7894ae2c1832f3b0b8d8a6e33d3d336d
00ce68a2afe004d82304dd24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sq4x7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:33Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.931612 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f354424d-7f22-42d6-8bd9-00e32e78c3d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c68391eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c68391eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qt89w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:33Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.948210 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de86ad0ce5469489a5f3eac4940913be3a897c3ec5f48526c7396b21bca92fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\"
:{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:33Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:33 crc kubenswrapper[4773]: I0120 18:30:33.965155 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965d3f61a7d29b4a851d13ea906f7d4d1809f89d314cd0e9b08ba897108625f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48a81307df70a49c4cf901ede80a07616e64babebf760b7fb3f51fc71743812a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:33Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.001740 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.001774 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.001801 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.001818 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.001828 4773 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:34Z","lastTransitionTime":"2026-01-20T18:30:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.072626 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.072834 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:30:34 crc kubenswrapper[4773]: E0120 18:30:34.072974 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 18:30:42.072891229 +0000 UTC m=+34.994704253 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:30:34 crc kubenswrapper[4773]: E0120 18:30:34.073045 4773 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 20 18:30:34 crc kubenswrapper[4773]: E0120 18:30:34.073121 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-20 18:30:42.073109184 +0000 UTC m=+34.994922208 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.105082 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.105136 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.105147 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.105169 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.105182 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:34Z","lastTransitionTime":"2026-01-20T18:30:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.174108 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.174182 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.174220 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:30:34 crc kubenswrapper[4773]: E0120 18:30:34.174347 4773 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 20 18:30:34 crc kubenswrapper[4773]: E0120 18:30:34.174406 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 20 18:30:34 crc kubenswrapper[4773]: E0120 18:30:34.174432 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 20 18:30:34 crc kubenswrapper[4773]: E0120 18:30:34.174449 4773 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 20 18:30:34 crc kubenswrapper[4773]: E0120 18:30:34.174465 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 20 18:30:34 crc kubenswrapper[4773]: E0120 18:30:34.174516 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 20 18:30:34 crc kubenswrapper[4773]: E0120 18:30:34.174538 4773 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 20 18:30:34 crc kubenswrapper[4773]: E0120 18:30:34.174478 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-20 18:30:42.174450119 +0000 UTC m=+35.096263153 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 20 18:30:34 crc kubenswrapper[4773]: E0120 18:30:34.174601 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-20 18:30:42.174586472 +0000 UTC m=+35.096399516 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 20 18:30:34 crc kubenswrapper[4773]: E0120 18:30:34.174624 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-20 18:30:42.174612133 +0000 UTC m=+35.096425177 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.209178 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.209223 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.209236 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.209261 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.209278 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:34Z","lastTransitionTime":"2026-01-20T18:30:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.311968 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.312037 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.312052 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.312076 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.312090 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:34Z","lastTransitionTime":"2026-01-20T18:30:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.371031 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-09 16:16:02.595114926 +0000 UTC Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.416276 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.416611 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.416622 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.416638 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.416648 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:34Z","lastTransitionTime":"2026-01-20T18:30:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.446709 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:30:34 crc kubenswrapper[4773]: E0120 18:30:34.446918 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.447595 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:30:34 crc kubenswrapper[4773]: E0120 18:30:34.447756 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.519628 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.519711 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.519730 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.519760 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.519779 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:34Z","lastTransitionTime":"2026-01-20T18:30:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.626911 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.626980 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.627000 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.627024 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.627041 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:34Z","lastTransitionTime":"2026-01-20T18:30:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.689818 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" event={"ID":"f354424d-7f22-42d6-8bd9-00e32e78c3d3","Type":"ContainerStarted","Data":"704f7cbd7bbc550f70fb00a77c807764dfbc4b7be701efa1b8730900d51a8f85"} Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.690201 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.693689 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-kjbfj" event={"ID":"7ddd5104-3112-413e-b908-2b7f336b41f1","Type":"ContainerStarted","Data":"52a5c1f8407893fc5542044f2a612cb402c4529818efa2a76e03ecc09dafd07d"} Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.710196 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de86ad0ce5469489a5f3eac4940913be3a897c3ec5f48526c7396b21bca92fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:34Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.715142 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.728844 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965d3f61a7d29b4a851d13ea906f7d4d1809f89d314cd0e9b08ba897108625f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"
/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48a81307df70a49c4cf901ede80a07616e64babebf760b7fb3f51fc71743812a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:34Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.729751 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.729806 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 
18:30:34.729822 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.729844 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.729865 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:34Z","lastTransitionTime":"2026-01-20T18:30:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.743656 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ddd934f-f012-4083-b5e6-b99711071621\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63daf0eeb8fcb1c4b47e205c8ea4ca5071153c573cce6d1f4638e0d44a96d47e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e8755cb7894ae2c1832f3b0b8d8a6e33d3d336d
00ce68a2afe004d82304dd24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sq4x7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:34Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.761317 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f354424d-7f22-42d6-8bd9-00e32e78c3d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a000cfa332372a402d4d81eafc6f06b988196a3a847236a94d8166f0d745b07e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d36ecbc289d1bc2ebcba42b6cfe938692dd9b7e2b4e1adac18ac86a273d6bf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeac2ae7ead2dd1144c00709ea7162dfe1e360dc304726b031970bc989f11467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c6fa189877752f110d0f9539d08a9693872f9ee0350c63384acae1dea6afbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7767620b251e4c63c9397573c7915bd85831b833817ac6b939fc47d6304dc2bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc96ff67e107830b7e2fe8c3c7460ca9c7b22dfadab07ea69968162e9a1b4455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://704f7cbd7bbc550f70fb00a77c807764dfbc4b7be701efa1b8730900d51a8f85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68d98d68d006df3d69e253299966bef755778e0cdab71a9ff91d2829ae761672\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c68391eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c68391eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Ru
nning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qt89w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:34Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.775858 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfcadb32-181d-4e84-8078-53323164fc49\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72535f55bb65c332cd4acb4ad0bf8b94a28a2167da3bba993de6ffb1d6dca1cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c02
6b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5be06527f116faac5a7ce7b2df804239b382a394e11c56eee69fa24739faab3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff759d4098d334dfada2f6a14e8a5d363019e0da19879b5dca11ff48b6273b24\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\
"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29039ffb258d5056b02a3ef4f48c732b8573d9437d032a225b59f0ba95527796\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:34Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.792746 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:34Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.808900 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9b329af-a6e2-4ba8-b70d-f1ad0cd67671\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13ef504628d3252d01c26801cf91a4fabdfbd5d8a78e60e9666beafa709816a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af183a195b31fc8b191dc75db28863447424fb2763e1f51a5f37ce78c4b046e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://05f5db3e67f63a4ed226186eec7eebf95687a59c3d742a7d29b2be3f99543a0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a1628ccb969f4cdde3babedb802c79d7bba385260dcefc8140ed674fb423da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab6fab567cc35a379572c4ea5fb0f9472f3634fcede213202390e12c5cf6f2f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0120 18:30:20.840178 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0120 18:30:20.842359 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1141381009/tls.crt::/tmp/serving-cert-1141381009/tls.key\\\\\\\"\\\\nI0120 18:30:26.279650 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0120 18:30:26.285211 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0120 18:30:26.285271 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0120 18:30:26.285327 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0120 18:30:26.285335 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0120 18:30:26.291960 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0120 18:30:26.291983 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291990 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291998 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0120 18:30:26.292002 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0120 18:30:26.292006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0120 18:30:26.292010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0120 18:30:26.292142 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0120 18:30:26.296450 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06ac78bba5a5a697d2246518bb04b7503b37c2fafd2369e5826dd02b467715d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:34Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.823077 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:34Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.832804 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.832963 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.833037 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 
18:30:34.833127 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.833198 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:34Z","lastTransitionTime":"2026-01-20T18:30:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.836856 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gczfj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"357ca347-8fa9-4f0b-9f49-a540f14e0198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48c821d7f174268680f0e1197457126c88076eded93bbcddb5a0d60b1fa9fc80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888c
f2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4b9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gczfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:34Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.852548 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5sv79" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a565d2f-43a1-41f5-b7a6-85d7d0aea0a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4eb26301ab8154925399887204867f95a36003c308ac49adb8a2867ae29ac39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dls8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5sv79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:34Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.872809 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee8e82f1-ac2b-4e32-ace8-c3e6edfb55b4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17efc7a160f4cb934a3927c68e65b9f1387ed13881b8c9ca676d30c0d241ace0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastSt
ate\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b89a8e0fd81a1ccb6675b46b9a7425fec5158a53cc7e4314068fa737d0becda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f63cd12ee6d399e1a6f77a55f05a532a2de3b4cd05a5e10a93f521364473a403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18
:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d78e43bc7b09a1599f98e9e72fd0f49db94102e952a391f687d50abed277ceda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de08872865ebda9c9a4e1642d9c4d4f6c709908332ffbd091f992fad7722eca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:34Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.890048 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:34Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.906237 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://443153c40f503abca43f8d62276e852dfc67a4ad50beb817bcc4e7c9f1893d4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-20T18:30:34Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.925779 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kjbfj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ddd5104-3112-413e-b908-2b7f336b41f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b981d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b981d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53e0dfb1a04051b30484f98897d86ff5adb9ef680437d01e43cd37b4a2404bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53e0dfb1a04051b30484f98897d86ff5adb9ef680437d01e43cd37b4a2404bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42063e8c09492c5670e90e5ced6405536f594cb5ae490cdacb00320e735d157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42063e8c09492c5670e90e5ced6405536f594cb5ae490cdacb00320e735d157\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kjbfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:34Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.935966 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.936009 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.936022 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.936040 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.936052 4773 
setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:34Z","lastTransitionTime":"2026-01-20T18:30:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.942872 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bccxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"061a607e-1868-4fcf-b3ea-d51157511d41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5980e4a9cdd846b335d853a6c7c4fa7c9862c37c4858418919eae03855d094f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwtw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.16
8.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bccxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:34Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.962320 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee8e82f1-ac2b-4e32-ace8-c3e6edfb55b4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17efc7a160f4cb934a3927c68e65b9f1387ed13881b8c9ca676d30c0d241ace0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"st
ate\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b89a8e0fd81a1ccb6675b46b9a7425fec5158a53cc7e4314068fa737d0becda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f63cd12ee6d399e1a6f77a55f05a532a2de3b4cd05a5e10a93f521364473a403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"
},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d78e43bc7b09a1599f98e9e72fd0f49db94102e952a391f687d50abed277ceda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de08872865ebda9c9a4e1642d9c4d4f6c709908332ffbd091f992fad7722eca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d27
2b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731
ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:34Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.977132 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:34Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:34 crc kubenswrapper[4773]: I0120 18:30:34.990878 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://443153c40f503abca43f8d62276e852dfc67a4ad50beb817bcc4e7c9f1893d4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-20T18:30:34Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.007831 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kjbfj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ddd5104-3112-413e-b908-2b7f336b41f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a5c1f8407893fc5542044f2a612cb402c4529818efa2a76e03ecc09dafd07d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b981d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b981d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53e0dfb1a04051b30484f98897d86ff5adb9ef680437d01e43cd37b4a2404bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53e0dfb1a04051b30484f98897d86ff5adb9ef680437d01e43cd37b4a2404bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42063e8c09492c5670e90e5ced6405536f594cb5ae490cdacb00320e735d157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42063e8c09492c5670e90e5ced6405536f594cb5ae490cdacb00320e735d157\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kjbfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:35Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.024280 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bccxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"061a607e-1868-4fcf-b3ea-d51157511d41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5980e4a9cdd846b335d853a6c7c4fa7c9862c37c4858418919eae03855d094f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwtw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIP
s\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bccxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:35Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.036754 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ddd934f-f012-4083-b5e6-b99711071621\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63daf0eeb8fcb1c4b47e205c8ea4ca5071153c573cce6d1f4638e0d44a96d47e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf
09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e8755cb7894ae2c1832f3b0b8d8a6e33d3d336d00ce68a2afe004d82304dd24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sq4x7\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:35Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.038814 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.038847 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.038858 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.038874 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.038884 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:35Z","lastTransitionTime":"2026-01-20T18:30:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.058496 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f354424d-7f22-42d6-8bd9-00e32e78c3d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a000cfa332372a402d4d81eafc6f06b988196a3a847236a94d8166f0d745b07e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d36ecbc289d1bc2ebcba42b6cfe938692dd9b7e2b4e1adac18ac86a273d6bf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeac2ae7ead2dd1144c00709ea7162dfe1e360dc304726b031970bc989f11467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c6fa189877752f110d0f9539d08a9693872f9ee0350c63384acae1dea6afbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7767620b251e4c63c9397573c7915bd85831b833817ac6b939fc47d6304dc2bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc96ff67e107830b7e2fe8c3c7460ca9c7b22dfadab07ea69968162e9a1b4455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://704f7cbd7bbc550f70fb00a77c807764dfbc4b7be701efa1b8730900d51a8f85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68d98d68d006df3d69e253299966bef755778e0cdab71a9ff91d2829ae761672\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c68391eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c68391eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qt89w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:35Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.072425 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de86ad0ce5469489a5f3eac4940913be3a897c3ec5f48526c7396b21bca92fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-2
0T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:35Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.088183 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965d3f61a7d29b4a851d13ea906f7d4d1809f89d314cd0e9b08ba897108625f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48a81307df70a49c4cf901ede80a07616e64babebf760b7fb3f51fc71743812a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:35Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.104504 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfcadb32-181d-4e84-8078-53323164fc49\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72535f55bb65c332cd4acb4ad0bf8b94a28a2167da3bba993de6ffb1d6dca1cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5be06527f116faac5a7ce7b2df804239b382a394e11c56eee69fa24739faab3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff759d4098d334dfada2f6a14e8a5d363019e0da19879b5dca11ff48b6273b24\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29039ffb258d5056b02a3ef4f48c732b8573d9437d032a225b59f0ba95527796\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:35Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.116117 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:35Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.125675 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gczfj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"357ca347-8fa9-4f0b-9f49-a540f14e0198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48c821d7f174268680f0e1197457126c88076eded93bbcddb5a0d60b1fa9fc80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4b9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gczfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:35Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.135743 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5sv79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a565d2f-43a1-41f5-b7a6-85d7d0aea0a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4eb26301ab8154925399887204867f95a36003c308ac49adb8a2867ae29ac39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dls8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5sv79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:35Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.140677 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.140712 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.140725 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.140742 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.140755 4773 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:35Z","lastTransitionTime":"2026-01-20T18:30:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.151441 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9b329af-a6e2-4ba8-b70d-f1ad0cd67671\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13ef504628d3252d01c26801cf91a4fabdfbd5d8a78e60e9666beafa709816a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af183a195b31fc8b191dc75db28863447424fb2763e1f51a5f37ce78c4b046e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://05f5db3e67f63a4ed226186eec7eebf95687a59c3d742a7d29b2be3f99543a0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a1628ccb969f4cdde3babedb802c79d7bba385260dcefc8140ed674fb423da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab6fab567cc35a379572c4ea5fb0f9472f3634fcede213202390e12c5cf6f2f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0120 18:30:20.840178 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0120 18:30:20.842359 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1141381009/tls.crt::/tmp/serving-cert-1141381009/tls.key\\\\\\\"\\\\nI0120 18:30:26.279650 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0120 18:30:26.285211 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0120 18:30:26.285271 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0120 18:30:26.285327 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0120 18:30:26.285335 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0120 18:30:26.291960 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0120 18:30:26.291983 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291990 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291998 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0120 18:30:26.292002 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0120 18:30:26.292006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0120 18:30:26.292010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0120 18:30:26.292142 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0120 18:30:26.296450 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06ac78bba5a5a697d2246518bb04b7503b37c2fafd2369e5826dd02b467715d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:35Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.170417 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:35Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.242956 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.243009 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.243024 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 
18:30:35.243045 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.243058 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:35Z","lastTransitionTime":"2026-01-20T18:30:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.345156 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.345415 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.345498 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.345583 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.345662 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:35Z","lastTransitionTime":"2026-01-20T18:30:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.371539 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-24 09:42:27.231848579 +0000 UTC Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.446811 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:30:35 crc kubenswrapper[4773]: E0120 18:30:35.446996 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.448524 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.448555 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.448569 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.448584 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.448596 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:35Z","lastTransitionTime":"2026-01-20T18:30:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.551186 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.551258 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.551269 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.551283 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.551292 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:35Z","lastTransitionTime":"2026-01-20T18:30:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.653583 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.653619 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.653629 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.653643 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.653656 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:35Z","lastTransitionTime":"2026-01-20T18:30:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.696105 4773 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.696280 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.721713 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.734042 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de86ad0ce5469489a5f3eac4940913be3a897c3ec5f48526c7396b21bca92fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"starte
dAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:35Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.745353 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965d3f61a7d29b4a851d13ea906f7d4d1809f89d314cd0e9b08ba897108625f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48a81307df70a49c4cf901ede80a07616e64babebf760b7fb3f51fc71743812a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:35Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.758310 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ddd934f-f012-4083-b5e6-b99711071621\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63daf0eeb8fcb1c4b47e205c8ea4ca5071153c573cce6d1f4638e0d44a96d47e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e8755cb7894ae2c1832f3b0b8d8a6e33d3d336d
00ce68a2afe004d82304dd24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sq4x7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:35Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.759586 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.759639 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.759657 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:35 crc 
kubenswrapper[4773]: I0120 18:30:35.759679 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.759696 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:35Z","lastTransitionTime":"2026-01-20T18:30:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.780257 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f354424d-7f22-42d6-8bd9-00e32e78c3d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a000cfa332372a402d4d81eafc6f06b988196a3a847236a94d8166f0d745b07e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d36ecbc289d1bc2ebcba42b6cfe938692dd9b7e2b4e1adac18ac86a273d6bf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeac2ae7ead2dd1144c00709ea7162dfe1e360dc304726b031970bc989f11467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c6fa189877752f110d0f9539d08a9693872f9ee0350c63384acae1dea6afbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7767620b251e4c63c9397573c7915bd85831b833817ac6b939fc47d6304dc2bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc96ff67e107830b7e2fe8c3c7460ca9c7b22dfadab07ea69968162e9a1b4455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://704f7cbd7bbc550f70fb00a77c807764dfbc4b7be701efa1b8730900d51a8f85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68d98d68d006df3d69e253299966bef755778e0cdab71a9ff91d2829ae761672\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c68391eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c68391eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qt89w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:35Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.795711 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfcadb32-181d-4e84-8078-53323164fc49\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72535f55bb65c332cd4acb4ad0bf8b94a28a2167da3bba993de6ffb1d6dca1cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026
b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5be06527f116faac5a7ce7b2df804239b382a394e11c56eee69fa24739faab3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff759d4098d334dfada2f6a14e8a5d363019e0da19879b5dca11ff48b6273b24\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"
name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29039ffb258d5056b02a3ef4f48c732b8573d9437d032a225b59f0ba95527796\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:35Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.807731 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:35Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.822194 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9b329af-a6e2-4ba8-b70d-f1ad0cd67671\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13ef504628d3252d01c26801cf91a4fabdfbd5d8a78e60e9666beafa709816a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af183a195b31fc8b191dc75db28863447424fb2763e1f51a5f37ce78c4b046e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://05f5db3e67f63a4ed226186eec7eebf95687a59c3d742a7d29b2be3f99543a0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a1628ccb969f4cdde3babedb802c79d7bba385260dcefc8140ed674fb423da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab6fab567cc35a379572c4ea5fb0f9472f3634fcede213202390e12c5cf6f2f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0120 18:30:20.840178 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0120 18:30:20.842359 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1141381009/tls.crt::/tmp/serving-cert-1141381009/tls.key\\\\\\\"\\\\nI0120 18:30:26.279650 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0120 18:30:26.285211 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0120 18:30:26.285271 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0120 18:30:26.285327 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0120 18:30:26.285335 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0120 18:30:26.291960 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0120 18:30:26.291983 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291990 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291998 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0120 18:30:26.292002 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0120 18:30:26.292006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0120 18:30:26.292010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0120 18:30:26.292142 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0120 18:30:26.296450 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06ac78bba5a5a697d2246518bb04b7503b37c2fafd2369e5826dd02b467715d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:35Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.835563 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:35Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.845993 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gczfj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"357ca347-8fa9-4f0b-9f49-a540f14e0198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48c821d7f174268680f0e1197457126c88076eded93bbcddb5a0d60b1fa9fc80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4b9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gczfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:35Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.855536 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5sv79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a565d2f-43a1-41f5-b7a6-85d7d0aea0a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4eb26301ab8154925399887204867f95a36003c308ac49adb8a2867ae29ac39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dls8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5sv79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:35Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.862273 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.862310 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.862321 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.862335 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.862345 4773 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:35Z","lastTransitionTime":"2026-01-20T18:30:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.872405 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kjbfj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ddd5104-3112-413e-b908-2b7f336b41f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a5c1f8407893fc5542044f2a612cb402c4529818efa2a76e03ecc09dafd07d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b981d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b981d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53e0dfb1a04051b30484f98897d86ff5adb9ef680437d01e43cd37b4a2404bce\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53e0dfb1a04051b30484f98897d86ff5adb9ef680437d01e43cd37b4a2404bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42063e8c09492c5670e90e5ced6405536f594cb5ae490cdacb00320e735d157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42063e8c09492c5670e90e5ced6405536f594cb5ae490cdacb00320e735d157\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kjbfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:35Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.885348 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bccxn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"061a607e-1868-4fcf-b3ea-d51157511d41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5980e4a9cdd846b335d853a6c7c4fa7c9862c37c4858418919eae03855d094f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwtw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bccxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:35Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.910357 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee8e82f1-ac2b-4e32-ace8-c3e6edfb55b4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17efc7a160f4cb934a3927c68e65b9f1387ed13881b8c9ca676d30c0d241ace0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b89a8e0fd81a1ccb6675b46b9a7425fec5158a53cc7e4314068fa737d0becda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f63cd12ee6d399e1a6f77a55f05a532a2de3b4cd05a5e10a93f521364473a403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d78e43bc7b09a1599f98e9e72fd0f49db94102e952a391f687d50abed277ceda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de08872865ebda9c9a4e1642d9c4d4f6c709908332ffbd091f992fad7722eca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:35Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.926567 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:35Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.941784 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://443153c40f503abca43f8d62276e852dfc67a4ad50beb817bcc4e7c9f1893d4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-20T18:30:35Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.965071 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.965117 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.965129 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.965148 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:35 crc kubenswrapper[4773]: I0120 18:30:35.965161 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:35Z","lastTransitionTime":"2026-01-20T18:30:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:36 crc kubenswrapper[4773]: I0120 18:30:36.067804 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:36 crc kubenswrapper[4773]: I0120 18:30:36.067849 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:36 crc kubenswrapper[4773]: I0120 18:30:36.067859 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:36 crc kubenswrapper[4773]: I0120 18:30:36.067873 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:36 crc kubenswrapper[4773]: I0120 18:30:36.067881 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:36Z","lastTransitionTime":"2026-01-20T18:30:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:36 crc kubenswrapper[4773]: I0120 18:30:36.169984 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:36 crc kubenswrapper[4773]: I0120 18:30:36.170033 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:36 crc kubenswrapper[4773]: I0120 18:30:36.170047 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:36 crc kubenswrapper[4773]: I0120 18:30:36.170065 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:36 crc kubenswrapper[4773]: I0120 18:30:36.170076 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:36Z","lastTransitionTime":"2026-01-20T18:30:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:36 crc kubenswrapper[4773]: I0120 18:30:36.272767 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:36 crc kubenswrapper[4773]: I0120 18:30:36.272837 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:36 crc kubenswrapper[4773]: I0120 18:30:36.272850 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:36 crc kubenswrapper[4773]: I0120 18:30:36.272867 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:36 crc kubenswrapper[4773]: I0120 18:30:36.272879 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:36Z","lastTransitionTime":"2026-01-20T18:30:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:36 crc kubenswrapper[4773]: I0120 18:30:36.372815 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 09:44:01.416023939 +0000 UTC Jan 20 18:30:36 crc kubenswrapper[4773]: I0120 18:30:36.375894 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:36 crc kubenswrapper[4773]: I0120 18:30:36.376056 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:36 crc kubenswrapper[4773]: I0120 18:30:36.376134 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:36 crc kubenswrapper[4773]: I0120 18:30:36.376205 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:36 crc kubenswrapper[4773]: I0120 18:30:36.376259 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:36Z","lastTransitionTime":"2026-01-20T18:30:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:36 crc kubenswrapper[4773]: I0120 18:30:36.446410 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:30:36 crc kubenswrapper[4773]: I0120 18:30:36.446893 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:30:36 crc kubenswrapper[4773]: E0120 18:30:36.447102 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 18:30:36 crc kubenswrapper[4773]: E0120 18:30:36.447330 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 18:30:36 crc kubenswrapper[4773]: I0120 18:30:36.478893 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:36 crc kubenswrapper[4773]: I0120 18:30:36.479216 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:36 crc kubenswrapper[4773]: I0120 18:30:36.479450 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:36 crc kubenswrapper[4773]: I0120 18:30:36.479577 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:36 crc kubenswrapper[4773]: I0120 18:30:36.479737 4773 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:36Z","lastTransitionTime":"2026-01-20T18:30:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:36 crc kubenswrapper[4773]: I0120 18:30:36.583013 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:36 crc kubenswrapper[4773]: I0120 18:30:36.583397 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:36 crc kubenswrapper[4773]: I0120 18:30:36.583536 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:36 crc kubenswrapper[4773]: I0120 18:30:36.583712 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:36 crc kubenswrapper[4773]: I0120 18:30:36.583816 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:36Z","lastTransitionTime":"2026-01-20T18:30:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:36 crc kubenswrapper[4773]: I0120 18:30:36.687110 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:36 crc kubenswrapper[4773]: I0120 18:30:36.687156 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:36 crc kubenswrapper[4773]: I0120 18:30:36.687170 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:36 crc kubenswrapper[4773]: I0120 18:30:36.687195 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:36 crc kubenswrapper[4773]: I0120 18:30:36.687206 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:36Z","lastTransitionTime":"2026-01-20T18:30:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:36 crc kubenswrapper[4773]: I0120 18:30:36.699200 4773 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 20 18:30:36 crc kubenswrapper[4773]: I0120 18:30:36.789022 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:36 crc kubenswrapper[4773]: I0120 18:30:36.789284 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:36 crc kubenswrapper[4773]: I0120 18:30:36.789471 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:36 crc kubenswrapper[4773]: I0120 18:30:36.789631 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:36 crc kubenswrapper[4773]: I0120 18:30:36.789767 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:36Z","lastTransitionTime":"2026-01-20T18:30:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:36 crc kubenswrapper[4773]: I0120 18:30:36.892990 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:36 crc kubenswrapper[4773]: I0120 18:30:36.893023 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:36 crc kubenswrapper[4773]: I0120 18:30:36.893032 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:36 crc kubenswrapper[4773]: I0120 18:30:36.893047 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:36 crc kubenswrapper[4773]: I0120 18:30:36.893058 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:36Z","lastTransitionTime":"2026-01-20T18:30:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:36 crc kubenswrapper[4773]: I0120 18:30:36.965706 4773 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Jan 20 18:30:36 crc kubenswrapper[4773]: I0120 18:30:36.996314 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:36 crc kubenswrapper[4773]: I0120 18:30:36.996372 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:36 crc kubenswrapper[4773]: I0120 18:30:36.996387 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:36 crc kubenswrapper[4773]: I0120 18:30:36.996413 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:36 crc kubenswrapper[4773]: I0120 18:30:36.996429 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:36Z","lastTransitionTime":"2026-01-20T18:30:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.099490 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.099570 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.099596 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.099629 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.099654 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:37Z","lastTransitionTime":"2026-01-20T18:30:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.203218 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.203261 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.203273 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.203291 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.203304 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:37Z","lastTransitionTime":"2026-01-20T18:30:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.306035 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.306075 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.306086 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.306103 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.306115 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:37Z","lastTransitionTime":"2026-01-20T18:30:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.373046 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-05 08:12:21.008170054 +0000 UTC Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.409822 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.409879 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.409900 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.409922 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.409962 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:37Z","lastTransitionTime":"2026-01-20T18:30:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.446749 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:30:37 crc kubenswrapper[4773]: E0120 18:30:37.446907 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.463385 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be 
located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:37Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.475879 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gczfj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"357ca347-8fa9-4f0b-9f49-a540f14e0198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48c821d7f174268680f0e1197457126c88076eded93bbcddb5a0d60b1fa9fc80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4b9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gczfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:37Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.492299 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5sv79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a565d2f-43a1-41f5-b7a6-85d7d0aea0a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4eb26301ab8154925399887204867f95a36003c308ac49adb8a2867ae29ac39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dls8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5sv79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:37Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.511960 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9b329af-a6e2-4ba8-b70d-f1ad0cd67671\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13ef504628d3252d01c26801cf91a4fabdfbd5d8a78e60e9666beafa709816a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af183a195b31fc8b191dc75db28863447424fb2763e1f51a5f37ce78c4b046e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05f5db3e67f63a4ed226186eec7eebf95687a59c3d742a7d29b2be3f99543a0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a1628ccb969f4cdde3babedb802c79d7bba385260dcefc8140ed674fb423da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab6fab567cc35a379572c4ea5fb0f9472f3634fcede213202390e12c5cf6f2f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0120 18:30:20.840178 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0120 18:30:20.842359 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1141381009/tls.crt::/tmp/serving-cert-1141381009/tls.key\\\\\\\"\\\\nI0120 18:30:26.279650 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0120 18:30:26.285211 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0120 18:30:26.285271 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0120 18:30:26.285327 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0120 18:30:26.285335 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0120 18:30:26.291960 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0120 18:30:26.291983 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291990 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291998 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0120 18:30:26.292002 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0120 18:30:26.292006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0120 18:30:26.292010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0120 18:30:26.292142 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0120 18:30:26.296450 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06ac78bba5a5a697d2246518bb04b7503b37c2fafd2369e5826dd02b467715d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:37Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.512466 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.512576 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.512596 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.512621 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.512640 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:37Z","lastTransitionTime":"2026-01-20T18:30:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.543617 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee8e82f1-ac2b-4e32-ace8-c3e6edfb55b4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17efc7a160f4cb934a3927c68e65b9f1387ed13881b8c9ca676d30c0d241ace0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b89a8e0fd81a1ccb6675b46b9a7425fec5158a53cc7e4314068fa737d0becda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f63cd12ee6d399e1a6f77a55f05a532a2de3b4cd05a5e10a93f521364473a403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d78e43bc7b09a1599f98e9e72fd0f49db94102e952a391f687d50abed277ceda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de08872865ebda9c9a4e1642d9c4d4f6c709908332ffbd091f992fad7722eca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:37Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.563465 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:37Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.582570 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://443153c40f503abca43f8d62276e852dfc67a4ad50beb817bcc4e7c9f1893d4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-20T18:30:37Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.606495 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kjbfj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ddd5104-3112-413e-b908-2b7f336b41f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a5c1f8407893fc5542044f2a612cb402c4529818efa2a76e03ecc09dafd07d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b981d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b981d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53e0dfb1a04051b30484f98897d86ff5adb9ef680437d01e43cd37b4a2404bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53e0dfb1a04051b30484f98897d86ff5adb9ef680437d01e43cd37b4a2404bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42063e8c09492c5670e90e5ced6405536f594cb5ae490cdacb00320e735d157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42063e8c09492c5670e90e5ced6405536f594cb5ae490cdacb00320e735d157\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kjbfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:37Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.615094 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.615136 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.615150 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.615169 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.615182 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:37Z","lastTransitionTime":"2026-01-20T18:30:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.625181 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bccxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"061a607e-1868-4fcf-b3ea-d51157511d41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5980e4a9cdd846b335d853a6c7c4fa7c9862c37c4858418919eae03855d094f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwtw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bccxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:37Z 
is after 2025-08-24T17:21:41Z" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.649721 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965d3f61a7d29b4a851d13ea906f7d4d1809f89d314cd0e9b08ba897108625f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48a81307df70a49c4cf901ede80a07616e64babebf760b7fb3f51fc71743812a\\\",\\\"image
\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:37Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.668455 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ddd934f-f012-4083-b5e6-b99711071621\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63daf0eeb8fcb1c4b47e205c8ea4ca5071153c573cce6d1f4638e0d44a96d47e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e8755cb7894ae2c1832f3b0b8d8a6e33d3d336d
00ce68a2afe004d82304dd24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sq4x7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:37Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.701715 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f354424d-7f22-42d6-8bd9-00e32e78c3d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a000cfa332372a402d4d81eafc6f06b988196a3a847236a94d8166f0d745b07e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d36ecbc289d1bc2ebcba42b6cfe938692dd9b7e2b4e1adac18ac86a273d6bf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeac2ae7ead2dd1144c00709ea7162dfe1e360dc304726b031970bc989f11467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c6fa189877752f110d0f9539d08a9693872f9ee0350c63384acae1dea6afbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7767620b251e4c63c9397573c7915bd85831b833817ac6b939fc47d6304dc2bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc96ff67e107830b7e2fe8c3c7460ca9c7b22dfadab07ea69968162e9a1b4455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://704f7cbd7bbc550f70fb00a77c807764dfbc4b7be701efa1b8730900d51a8f85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68d98d68d006df3d69e253299966bef755778e0cdab71a9ff91d2829ae761672\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c68391eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c68391eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qt89w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:37Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.704570 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qt89w_f354424d-7f22-42d6-8bd9-00e32e78c3d3/ovnkube-controller/0.log" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.708708 4773 generic.go:334] "Generic (PLEG): container finished" podID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" containerID="704f7cbd7bbc550f70fb00a77c807764dfbc4b7be701efa1b8730900d51a8f85" exitCode=1 Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.708753 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" event={"ID":"f354424d-7f22-42d6-8bd9-00e32e78c3d3","Type":"ContainerDied","Data":"704f7cbd7bbc550f70fb00a77c807764dfbc4b7be701efa1b8730900d51a8f85"} Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.709434 4773 scope.go:117] "RemoveContainer" containerID="704f7cbd7bbc550f70fb00a77c807764dfbc4b7be701efa1b8730900d51a8f85" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.721519 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.721553 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.721562 4773 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.721576 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.721585 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:37Z","lastTransitionTime":"2026-01-20T18:30:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.722867 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de86ad0ce5469489a5f3eac4940913be3a897c3ec5f48526c7396b21bca92fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:37Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.740048 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:37Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.759803 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfcadb32-181d-4e84-8078-53323164fc49\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72535f55bb65c332cd4acb4ad0bf8b94a28a2167da3bba993de6ffb1d6dca1cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5be06527f116faac5a7ce7b2df804239b382a394e11c56eee69fa24739faab3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff759d4098d334dfada2f6a14e8a5d363019e0da19879b5dca11ff48b6273b24\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29039ffb258d5056b02a3ef4f48c732b8573d9437d032a225b59f0ba95527796\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:37Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.782334 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kjbfj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ddd5104-3112-413e-b908-2b7f336b41f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a5c1f8407893fc5542044f2a612cb402c4529818efa2a76e03ecc09dafd07d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b981
d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b981d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53e0dfb1a04051b30484f98897d86ff5adb9ef680437d01e43cd37b4a2404bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53e0dfb1a04051b30484f98897d86ff5adb9ef680437d01e43cd37b4a2404bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42063e8c09492c5670e90e5ced6405536f594cb5ae490cdacb00320e735d157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42063e8c09492c5670e90e5ced6405536f594cb5ae490cdacb00320e735d157\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kjbfj\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:37Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.797815 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bccxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"061a607e-1868-4fcf-b3ea-d51157511d41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5980e4a9cdd846b335d853a6c7c4fa7c9862c37c4858418919eae03855d094f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-
01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwtw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bccxn\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:37Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.815571 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee8e82f1-ac2b-4e32-ace8-c3e6edfb55b4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17efc7a160f4cb934a3927c68e65b9f1387ed13881b8c9ca676d30c0d241ace0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests
\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b89a8e0fd81a1ccb6675b46b9a7425fec5158a53cc7e4314068fa737d0becda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f63cd12ee6d399e1a6f77a55f05a532a2de3b4cd05a5e10a93f521364473a403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d78e43bc7b09a1599
f98e9e72fd0f49db94102e952a391f687d50abed277ceda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de08872865ebda9c9a4e1642d9c4d4f6c709908332ffbd091f992fad7722eca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e3
3e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\
":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:37Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.829543 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.829598 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.829607 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.829626 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.829635 4773 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:37Z","lastTransitionTime":"2026-01-20T18:30:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.834818 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:37Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.850903 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://443153c40f503abca43f8d62276e852dfc67a4ad50beb817bcc4e7c9f1893d4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-20T18:30:37Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.864688 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de86ad0ce5469489a5f3eac4940913be3a897c3ec5f48526c7396b21bca92fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:37Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.880104 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965d3f61a7d29b4a851d13ea906f7d4d1809f89d314cd0e9b08ba897108625f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\
\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48a81307df70a49c4cf901ede80a07616e64babebf760b7fb3f51fc71743812a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:37Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.894283 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ddd934f-f012-4083-b5e6-b99711071621\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63daf0eeb8fcb1c4b47e205c8ea4ca5071153c573cce6d1f4638e0d44a96d47e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e8755cb7894ae2c1832f3b0b8d8a6e33d3d336d
00ce68a2afe004d82304dd24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sq4x7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:37Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.918911 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f354424d-7f22-42d6-8bd9-00e32e78c3d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a000cfa332372a402d4d81eafc6f06b988196a3a847236a94d8166f0d745b07e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d36ecbc289d1bc2ebcba42b6cfe938692dd9b7e2b4e1adac18ac86a273d6bf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeac2ae7ead2dd1144c00709ea7162dfe1e360dc304726b031970bc989f11467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c6fa189877752f110d0f9539d08a9693872f9ee0350c63384acae1dea6afbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7767620b251e4c63c9397573c7915bd85831b833817ac6b939fc47d6304dc2bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc96ff67e107830b7e2fe8c3c7460ca9c7b22dfadab07ea69968162e9a1b4455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://704f7cbd7bbc550f70fb00a77c807764dfbc4b7be701efa1b8730900d51a8f85\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://704f7cbd7bbc550f70fb00a77c807764dfbc4b7be701efa1b8730900d51a8f85\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-20T18:30:36Z\\\",\\\"message\\\":\\\"Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0120 18:30:36.720069 6124 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0120 18:30:36.720003 6124 reflector.go:311] 
Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0120 18:30:36.720435 6124 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0120 18:30:36.720630 6124 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0120 18:30:36.720978 6124 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0120 18:30:36.721099 6124 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0120 18:30:36.721115 6124 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0120 18:30:36.721130 6124 factory.go:656] Stopping watch factory\\\\nI0120 18:30:36.721173 6124 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0120 18:30:36.721183 6124 handler.go:208] Removed *v1.Namespace 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68d98d68d006df3d69e253299966bef755778e0cdab71a9ff91d2829ae761672\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c68391eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c68391eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qt89w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:37Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.933165 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.933226 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.933238 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.933260 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.933274 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:37Z","lastTransitionTime":"2026-01-20T18:30:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.935883 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfcadb32-181d-4e84-8078-53323164fc49\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72535f55bb65c332cd4acb4ad0bf8b94a28a2167da3bba993de6ffb1d6dca1cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir
\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5be06527f116faac5a7ce7b2df804239b382a394e11c56eee69fa24739faab3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff759d4098d334dfada2f6a14e8a5d363019e0da19879b5dca11ff48b6273b24\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29039ffb258d5056b02a3ef4f48c732b8573d9437d032a225b59f0ba95527796\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b8
2799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:37Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.951408 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:37Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.964557 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9b329af-a6e2-4ba8-b70d-f1ad0cd67671\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13ef504628d3252d01c26801cf91a4fabdfbd5d8a78e60e9666beafa709816a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af183a195b31fc8b191dc75db28863447424fb2763e1f51a5f37ce78c4b046e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://05f5db3e67f63a4ed226186eec7eebf95687a59c3d742a7d29b2be3f99543a0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a1628ccb969f4cdde3babedb802c79d7bba385260dcefc8140ed674fb423da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab6fab567cc35a379572c4ea5fb0f9472f3634fcede213202390e12c5cf6f2f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0120 18:30:20.840178 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0120 18:30:20.842359 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1141381009/tls.crt::/tmp/serving-cert-1141381009/tls.key\\\\\\\"\\\\nI0120 18:30:26.279650 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0120 18:30:26.285211 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0120 18:30:26.285271 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0120 18:30:26.285327 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0120 18:30:26.285335 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0120 18:30:26.291960 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0120 18:30:26.291983 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291990 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291998 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0120 18:30:26.292002 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0120 18:30:26.292006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0120 18:30:26.292010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0120 18:30:26.292142 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0120 18:30:26.296450 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06ac78bba5a5a697d2246518bb04b7503b37c2fafd2369e5826dd02b467715d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:37Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.978275 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:37Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:37 crc kubenswrapper[4773]: I0120 18:30:37.991908 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gczfj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"357ca347-8fa9-4f0b-9f49-a540f14e0198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48c821d7f174268680f0e1197457126c88076eded93bbcddb5a0d60b1fa9fc80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4b9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gczfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:37Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.002891 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5sv79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a565d2f-43a1-41f5-b7a6-85d7d0aea0a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4eb26301ab8154925399887204867f95a36003c308ac49adb8a2867ae29ac39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dls8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5sv79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:38Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.036824 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.036910 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.036955 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.036981 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.036997 4773 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:38Z","lastTransitionTime":"2026-01-20T18:30:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.140012 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.140040 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.140049 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.140062 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.140072 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:38Z","lastTransitionTime":"2026-01-20T18:30:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.242567 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.242656 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.242679 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.242708 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.242731 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:38Z","lastTransitionTime":"2026-01-20T18:30:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.344956 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.345007 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.345018 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.345033 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.345044 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:38Z","lastTransitionTime":"2026-01-20T18:30:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.373867 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-01 07:58:03.898779605 +0000 UTC Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.446060 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:30:38 crc kubenswrapper[4773]: E0120 18:30:38.446232 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.446284 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:30:38 crc kubenswrapper[4773]: E0120 18:30:38.446428 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.447860 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.447891 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.447903 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.447917 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.447934 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:38Z","lastTransitionTime":"2026-01-20T18:30:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.550492 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.550547 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.550559 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.550582 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.550596 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:38Z","lastTransitionTime":"2026-01-20T18:30:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.652573 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.652607 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.652617 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.652631 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.652641 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:38Z","lastTransitionTime":"2026-01-20T18:30:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.713928 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qt89w_f354424d-7f22-42d6-8bd9-00e32e78c3d3/ovnkube-controller/0.log" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.721702 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" event={"ID":"f354424d-7f22-42d6-8bd9-00e32e78c3d3","Type":"ContainerStarted","Data":"04c27d9920ca807f9dedb59eb78d9e2a7bd2ec92d8a5a61180ee528fdee3ef07"} Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.721843 4773 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.740399 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9b329af-a6e2-4ba8-b70d-f1ad0cd67671\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13ef504628d3252d01c26801cf91a4fabdfbd5d8a78e60e9666beafa709816a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af183a195b31fc8b191dc75db28863447424fb2763e1f51a5f37ce78c4b046e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://05f5db3e67f63a4ed226186eec7eebf95687a59c3d742a7d29b2be3f99543a0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a1628ccb969f4cdde3babedb802c79d7bba385260dcefc8140ed674fb423da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab6fab567cc35a379572c4ea5fb0f9472f3634fcede213202390e12c5cf6f2f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0120 18:30:20.840178 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0120 18:30:20.842359 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-1141381009/tls.crt::/tmp/serving-cert-1141381009/tls.key\\\\\\\"\\\\nI0120 18:30:26.279650 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0120 18:30:26.285211 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0120 18:30:26.285271 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0120 18:30:26.285327 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0120 18:30:26.285335 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0120 18:30:26.291960 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0120 18:30:26.291983 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291990 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291998 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0120 18:30:26.292002 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0120 18:30:26.292006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0120 18:30:26.292010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0120 18:30:26.292142 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0120 18:30:26.296450 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06ac78bba5a5a697d2246518bb04b7503b37c2fafd2369e5826dd02b467715d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:38Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.754903 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.754934 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.754943 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.754957 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.754981 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:38Z","lastTransitionTime":"2026-01-20T18:30:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.757703 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:38Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.774781 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gczfj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"357ca347-8fa9-4f0b-9f49-a540f14e0198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48c821d7f174268680f0e1197457126c88076eded93bbcddb5a0d60b1fa9fc80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4b9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gczfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:38Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.789175 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5sv79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a565d2f-43a1-41f5-b7a6-85d7d0aea0a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4eb26301ab8154925399887204867f95a36003c308ac49adb8a2867ae29ac39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dls8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5sv79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:38Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.817074 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee8e82f1-ac2b-4e32-ace8-c3e6edfb55b4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17efc7a160f4cb934a3927c68e65b9f1387ed13881b8c9ca676d30c0d241ace0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b89a8e0fd81a1ccb6675b46b9a7425fec5158a53cc7e4314068fa737d0becda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f63cd12ee6d399e1a6f77a55f05a532a2de3b4cd05a5e10a93f521364473a403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d78e43bc7b09a1599f98e9e72fd0f49db94102e952a391f687d50abed277ceda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de08872865ebda9c9a4e1642d9c4d4f6c709908332ffbd091f992fad7722eca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:38Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.831406 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:38Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.843746 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://443153c40f503abca43f8d62276e852dfc67a4ad50beb817bcc4e7c9f1893d4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-20T18:30:38Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.857079 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kjbfj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ddd5104-3112-413e-b908-2b7f336b41f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a5c1f8407893fc5542044f2a612cb402c4529818efa2a76e03ecc09dafd07d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b981d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b981d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53e0dfb1a04051b30484f98897d86ff5adb9ef680437d01e43cd37b4a2404bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53e0dfb1a04051b30484f98897d86ff5adb9ef680437d01e43cd37b4a2404bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42063e8c09492c5670e90e5ced6405536f594cb5ae490cdacb00320e735d157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42063e8c09492c5670e90e5ced6405536f594cb5ae490cdacb00320e735d157\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kjbfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:38Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.857411 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.857561 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.857592 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.857658 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.857702 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:38Z","lastTransitionTime":"2026-01-20T18:30:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.869426 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bccxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"061a607e-1868-4fcf-b3ea-d51157511d41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5980e4a9cdd846b335d853a6c7c4fa7c9862c37c4858418919eae03855d094f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\
",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwtw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bccxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:38Z 
is after 2025-08-24T17:21:41Z" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.883634 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de86ad0ce5469489a5f3eac4940913be3a897c3ec5f48526c7396b21bca92fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:38Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.898113 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965d3f61a7d29b4a851d13ea906f7d4d1809f89d314cd0e9b08ba897108625f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-id
entity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48a81307df70a49c4cf901ede80a07616e64babebf760b7fb3f51fc71743812a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:38Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.911433 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ddd934f-f012-4083-b5e6-b99711071621\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63daf0eeb8fcb1c4b47e205c8ea4ca5071153c573cce6d1f4638e0d44a96d47e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e8755cb7894ae2c1832f3b0b8d8a6e33d3d336d
00ce68a2afe004d82304dd24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sq4x7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:38Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.931591 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f354424d-7f22-42d6-8bd9-00e32e78c3d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a000cfa332372a402d4d81eafc6f06b988196a3a847236a94d8166f0d745b07e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d36ecbc289d1bc2ebcba42b6cfe938692dd9b7e2b4e1adac18ac86a273d6bf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeac2ae7ead2dd1144c00709ea7162dfe1e360dc304726b031970bc989f11467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c6fa189877752f110d0f9539d08a9693872f9ee0350c63384acae1dea6afbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7767620b251e4c63c9397573c7915bd85831b833817ac6b939fc47d6304dc2bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc96ff67e107830b7e2fe8c3c7460ca9c7b22dfadab07ea69968162e9a1b4455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c27d9920ca807f9dedb59eb78d9e2a7bd2ec92d8a5a61180ee528fdee3ef07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://704f7cbd7bbc550f70fb00a77c807764dfbc4b7be701efa1b8730900d51a8f85\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-20T18:30:36Z\\\",\\\"message\\\":\\\"Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0120 18:30:36.720069 6124 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0120 18:30:36.720003 6124 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0120 18:30:36.720435 6124 
reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0120 18:30:36.720630 6124 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0120 18:30:36.720978 6124 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0120 18:30:36.721099 6124 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0120 18:30:36.721115 6124 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0120 18:30:36.721130 6124 factory.go:656] Stopping watch factory\\\\nI0120 18:30:36.721173 6124 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0120 18:30:36.721183 6124 handler.go:208] Removed *v1.Namespace 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68d98d68d006df3d69e253299966bef755778e0cdab71a9ff91d2829ae761672\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c68391eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c68391eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qt89w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:38Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.948761 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfcadb32-181d-4e84-8078-53323164fc49\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72535f55bb65c332cd4acb4ad0bf8b94a28a2167da3bba993de6ffb1d6dca1cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5be06527f116faac5a7ce7b2df804239b382a394e11c56eee69fa24739faab3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff759d4098d334dfada2f6a14e8a5d363019e0da19879b5dca11ff48b6273b24\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29039ffb258d5056b02a3ef4f48c732b8573d9437d032a225b59f0ba95527796\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:38Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.960614 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.960645 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.960655 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.960689 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.960703 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:38Z","lastTransitionTime":"2026-01-20T18:30:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:38 crc kubenswrapper[4773]: I0120 18:30:38.966250 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:38Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.014033 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.032823 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bccxn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"061a607e-1868-4fcf-b3ea-d51157511d41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5980e4a9cdd846b335d853a6c7c4fa7c9862c37c4858418919eae03855d094f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwtw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bccxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:39Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.063748 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:39 crc 
kubenswrapper[4773]: I0120 18:30:39.063785 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.063794 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.063811 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.063821 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:39Z","lastTransitionTime":"2026-01-20T18:30:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.066096 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee8e82f1-ac2b-4e32-ace8-c3e6edfb55b4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17efc7a160f4cb934a3927c68e65b9f1387ed13881b8c9ca676d30c0d241ace0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b89a8e0fd81a1ccb6675b46b9a7425fec5158a53cc7e4314068fa737d0becda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f63cd12ee6d399e1a6f77a55f05a532a2de3b4cd05a5e10a93f521364473a403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d78e43bc7b09a1599f98e9e72fd0f49db94102e952a391f687d50abed277ceda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de08872865ebda9c9a4e1642d9c4d4f6c709908332ffbd091f992fad7722eca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:39Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.081861 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:39Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.095681 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://443153c40f503abca43f8d62276e852dfc67a4ad50beb817bcc4e7c9f1893d4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-20T18:30:39Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.114396 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kjbfj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ddd5104-3112-413e-b908-2b7f336b41f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a5c1f8407893fc5542044f2a612cb402c4529818efa2a76e03ecc09dafd07d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b981d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b981d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53e0dfb1a04051b30484f98897d86ff5adb9ef680437d01e43cd37b4a2404bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53e0dfb1a04051b30484f98897d86ff5adb9ef680437d01e43cd37b4a2404bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42063e8c09492c5670e90e5ced6405536f594cb5ae490cdacb00320e735d157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42063e8c09492c5670e90e5ced6405536f594cb5ae490cdacb00320e735d157\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kjbfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:39Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.136219 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de86ad0ce5469489a5f3eac4940913be3a897c3ec5f48526c7396b21bca92fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"s
tate\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:39Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.155667 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965d3f61a7d29b4a851d13ea906f7d4d1809f89d314cd0e9b08ba897108625f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48a81307df70a49c4cf901ede80a07616e64babebf760b7fb3f51fc71743812a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:39Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.168982 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.169049 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.169067 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.169094 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.169118 4773 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:39Z","lastTransitionTime":"2026-01-20T18:30:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.170895 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ddd934f-f012-4083-b5e6-b99711071621\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63daf0eeb8fcb1c4b47e205c8ea4ca5071153c573cce6d1f4638e0d44a96d47e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-
rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e8755cb7894ae2c1832f3b0b8d8a6e33d3d336d00ce68a2afe004d82304dd24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sq4x7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-01-20T18:30:39Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.190994 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f354424d-7f22-42d6-8bd9-00e32e78c3d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a000cfa332372a402d4d81eafc6f06b988196a3a847236a94d8166f0d745b07e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d36ecbc289d1bc2ebcba42b6cfe938692dd9b7e2b4e1adac18ac86a273d6bf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeac2ae7ead2dd1144c00709ea7162dfe1e360dc304726b031970bc989f11467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c6fa189877752f110d0f9539d08a9693872f9ee0350c63384acae1dea6afbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7767620b251e4c63c9397573c7915bd85831b833817ac6b939fc47d6304dc2bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc96ff67e107830b7e2fe8c3c7460ca9c7b22dfadab07ea69968162e9a1b4455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c27d9920ca807f9dedb59eb78d9e2a7bd2ec92d8a5a61180ee528fdee3ef07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://704f7cbd7bbc550f70fb00a77c807764dfbc4b7be701efa1b8730900d51a8f85\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-20T18:30:36Z\\\",\\\"message\\\":\\\"Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0120 18:30:36.720069 6124 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0120 18:30:36.720003 6124 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0120 18:30:36.720435 6124 
reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0120 18:30:36.720630 6124 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0120 18:30:36.720978 6124 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0120 18:30:36.721099 6124 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0120 18:30:36.721115 6124 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0120 18:30:36.721130 6124 factory.go:656] Stopping watch factory\\\\nI0120 18:30:36.721173 6124 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0120 18:30:36.721183 6124 handler.go:208] Removed *v1.Namespace 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68d98d68d006df3d69e253299966bef755778e0cdab71a9ff91d2829ae761672\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c68391eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c68391eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qt89w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:39Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.206349 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfcadb32-181d-4e84-8078-53323164fc49\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72535f55bb65c332cd4acb4ad0bf8b94a28a2167da3bba993de6ffb1d6dca1cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5be06527f116faac5a7ce7b2df804239b382a394e11c56eee69fa24739faab3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff759d4098d334dfada2f6a14e8a5d363019e0da19879b5dca11ff48b6273b24\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29039ffb258d5056b02a3ef4f48c732b8573d9437d032a225b59f0ba95527796\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:39Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.221666 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:39Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.244725 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9b329af-a6e2-4ba8-b70d-f1ad0cd67671\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13ef504628d3252d01c26801cf91a4fabdfbd5d8a78e60e9666beafa709816a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af183a195b31fc8b191dc75db28863447424fb2763e1f51a5f37ce78c4b046e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05f5db3e67f63a4ed226186eec7eebf95687a59c3d742a7d29b2be3f99543a0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a1628ccb969f4cdde3babedb802c79d7bba385260dcefc8140ed674fb423da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab6fab567cc35a379572c4ea5fb0f9472f3634fcede213202390e12c5cf6f2f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-20T18:30:26Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0120 18:30:20.840178 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0120 18:30:20.842359 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1141381009/tls.crt::/tmp/serving-cert-1141381009/tls.key\\\\\\\"\\\\nI0120 18:30:26.279650 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0120 18:30:26.285211 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0120 18:30:26.285271 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0120 18:30:26.285327 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0120 18:30:26.285335 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0120 18:30:26.291960 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0120 18:30:26.291983 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291990 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291998 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0120 18:30:26.292002 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0120 18:30:26.292006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0120 18:30:26.292010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0120 18:30:26.292142 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0120 18:30:26.296450 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06ac78bba5a5a697d2246518bb04b7503b37c2fafd2369e5826dd02b467715d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727
ee9876662924585010f5fa93ee1d554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:39Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.263974 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:39Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.272859 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.272902 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.272916 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 
18:30:39.272955 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.272972 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:39Z","lastTransitionTime":"2026-01-20T18:30:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.280809 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gczfj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"357ca347-8fa9-4f0b-9f49-a540f14e0198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48c821d7f174268680f0e1197457126c88076eded93bbcddb5a0d60b1fa9fc80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888c
f2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4b9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gczfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:39Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.294607 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5sv79" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a565d2f-43a1-41f5-b7a6-85d7d0aea0a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4eb26301ab8154925399887204867f95a36003c308ac49adb8a2867ae29ac39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dls8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5sv79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:39Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.374066 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-26 18:40:38.021838154 +0000 UTC Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.378917 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.378979 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.378989 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.379010 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.379025 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:39Z","lastTransitionTime":"2026-01-20T18:30:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.446331 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:30:39 crc kubenswrapper[4773]: E0120 18:30:39.446674 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.483404 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.483484 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.483508 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.483537 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.483558 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:39Z","lastTransitionTime":"2026-01-20T18:30:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.587103 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.587177 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.587202 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.587235 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.587259 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:39Z","lastTransitionTime":"2026-01-20T18:30:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.690972 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.691022 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.691033 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.691052 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.691067 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:39Z","lastTransitionTime":"2026-01-20T18:30:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.730281 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qt89w_f354424d-7f22-42d6-8bd9-00e32e78c3d3/ovnkube-controller/1.log" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.731580 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qt89w_f354424d-7f22-42d6-8bd9-00e32e78c3d3/ovnkube-controller/0.log" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.735822 4773 generic.go:334] "Generic (PLEG): container finished" podID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" containerID="04c27d9920ca807f9dedb59eb78d9e2a7bd2ec92d8a5a61180ee528fdee3ef07" exitCode=1 Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.735878 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" event={"ID":"f354424d-7f22-42d6-8bd9-00e32e78c3d3","Type":"ContainerDied","Data":"04c27d9920ca807f9dedb59eb78d9e2a7bd2ec92d8a5a61180ee528fdee3ef07"} Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.736008 4773 scope.go:117] "RemoveContainer" containerID="704f7cbd7bbc550f70fb00a77c807764dfbc4b7be701efa1b8730900d51a8f85" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.737473 4773 scope.go:117] "RemoveContainer" containerID="04c27d9920ca807f9dedb59eb78d9e2a7bd2ec92d8a5a61180ee528fdee3ef07" Jan 20 18:30:39 crc kubenswrapper[4773]: E0120 18:30:39.737846 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-qt89w_openshift-ovn-kubernetes(f354424d-7f22-42d6-8bd9-00e32e78c3d3)\"" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" podUID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.761305 4773 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:39Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.777210 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://443153c40f503abca43f8d62276e852dfc67a4ad50beb817bcc4e7c9f1893d4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-20T18:30:39Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.794830 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.794894 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.794911 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.794958 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.794974 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:39Z","lastTransitionTime":"2026-01-20T18:30:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.795877 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kjbfj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ddd5104-3112-413e-b908-2b7f336b41f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a5c1f8407893fc5542044f2a612cb402c4529818efa2a76e03ecc09dafd07d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b981d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b981d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53e0dfb1a04051b30484f98897d86ff5adb9ef680437d01e43cd37b4a2404bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53e0dfb1a04051b30484f98897d86ff5adb9ef680437d01e43cd37b4a2404bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42063e8c09492c5670e90e5ced6405536f594cb5ae490cdacb00320e735d157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42063e8c09492c5670e90e5ced6405536f594cb5ae490cdacb00320e735d157\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kjbfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:39Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.813645 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bccxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"061a607e-1868-4fcf-b3ea-d51157511d41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5980e4a9cdd846b335d853a6c7c4fa7c9862c37c4858418919eae03855d094f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7
eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwtw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\"
:\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bccxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:39Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.840795 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee8e82f1-ac2b-4e32-ace8-c3e6edfb55b4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17efc7a160f4cb934a3927c68e65b9f1387ed13881b8c9ca676d30c0d241ace0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b89a8e0fd81a1ccb6675b46b9a7425fec5158a53cc7e4314068fa737d0becda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f63cd12ee6d399e1a6f77a55f05a532a2de3b4cd05a5e10a93f521364473a403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},
\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d78e43bc7b09a1599f98e9e72fd0f49db94102e952a391f687d50abed277ceda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de08872865ebda9c9a4e1642d9c4d4f6c709908332ffbd091f992fad7722eca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\
\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:39Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.866631 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f354424d-7f22-42d6-8bd9-00e32e78c3d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a000cfa332372a402d4d81eafc6f06b988196a3a847236a94d8166f0d745b07e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d36ecbc289d1bc2ebcba42b6cfe938692dd9b7e2b4e1adac18ac86a273d6bf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeac2ae7ead2dd1144c00709ea7162dfe1e360dc304726b031970bc989f11467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c6fa189877752f110d0f9539d08a9693872f9ee0350c63384acae1dea6afbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7767620b251e4c63c9397573c7915bd85831b833817ac6b939fc47d6304dc2bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc96ff67e107830b7e2fe8c3c7460ca9c7b22dfadab07ea69968162e9a1b4455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c27d9920ca807f9dedb59eb78d9e2a7bd2ec92d8a5a61180ee528fdee3ef07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://704f7cbd7bbc550f70fb00a77c807764dfbc4b7be701efa1b8730900d51a8f85\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-20T18:30:36Z\\\",\\\"message\\\":\\\"Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0120 18:30:36.720069 6124 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0120 18:30:36.720003 6124 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0120 18:30:36.720435 6124 
reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0120 18:30:36.720630 6124 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0120 18:30:36.720978 6124 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0120 18:30:36.721099 6124 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0120 18:30:36.721115 6124 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0120 18:30:36.721130 6124 factory.go:656] Stopping watch factory\\\\nI0120 18:30:36.721173 6124 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0120 18:30:36.721183 6124 handler.go:208] Removed *v1.Namespace ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04c27d9920ca807f9dedb59eb78d9e2a7bd2ec92d8a5a61180ee528fdee3ef07\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-20T18:30:39Z\\\",\\\"message\\\":\\\"ved *v1.Node event handler 7\\\\nI0120 18:30:38.824577 6252 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0120 18:30:38.824586 6252 handler.go:208] Removed *v1.Node event handler 2\\\\nI0120 18:30:38.824768 6252 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0120 18:30:38.824989 6252 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0120 18:30:38.825056 6252 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0120 18:30:38.825067 6252 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0120 18:30:38.825072 6252 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0120 18:30:38.825090 6252 factory.go:656] Stopping watch factory\\\\nI0120 18:30:38.825107 6252 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0120 18:30:38.825115 6252 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0120 18:30:38.825243 6252 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0120 18:30:38.825272 6252 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-
cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68d98d68d006df3d69e253299966bef755778e0cdab71a9ff91d2829ae761672\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},
{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c68391eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c68391eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qt89w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:39Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.885305 4773 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de86ad0ce5469489a5f3eac4940913be3a897c3ec5f48526c7396b21bca92fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:39Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.898064 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.898168 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.898192 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.898253 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.898273 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:39Z","lastTransitionTime":"2026-01-20T18:30:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.908121 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965d3f61a7d29b4a851d13ea906f7d4d1809f89d314cd0e9b08ba897108625f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48a81307df70a49c4cf901ede80a07616e64babebf760b7fb3f51fc71743812a\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:39Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.922121 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ddd934f-f012-4083-b5e6-b99711071621\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63daf0eeb8fcb1c4b47e205c8ea4ca5071153c573cce6d1f4638e0d44a96d47e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e8755cb7894ae2c1832f3b0b8d8a6e33d3d336d
00ce68a2afe004d82304dd24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sq4x7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:39Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.934254 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfcadb32-181d-4e84-8078-53323164fc49\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72535f55bb65c332cd4acb4ad0bf8b94a28a2167da3bba993de6ffb1d6dca1cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5be06527f116faac5a7ce7b2df804239b382a394e11c56eee69fa24739faab3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff759d4098d334dfada2f6a14e8a5d363019e0da19879b5dca11ff48b6273b24\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29039ffb258d5056b02a3ef4f48c732b8573d9437d032a225b59f0ba95527796\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:39Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.945566 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:39Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.955620 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5sv79" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a565d2f-43a1-41f5-b7a6-85d7d0aea0a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4eb26301ab8154925399887204867f95a36003c308ac49adb8a2867ae29ac39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dls8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5sv79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:39Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.974818 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9b329af-a6e2-4ba8-b70d-f1ad0cd67671\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13ef504628d3252d01c26801cf91a4fabdfbd5d8a78e60e9666beafa709816a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af183a195b31fc8b191dc75db28863447424fb2763e1f51a5f37ce78c4b046e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05f5db3e67f63a4ed226186eec7eebf95687a59c3d742a7d29b2be3f99543a0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a1628ccb969f4cdde3babedb802c79d7bba385260dcefc8140ed674fb423da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab6fab567cc35a379572c4ea5fb0f9472f3634fcede213202390e12c5cf6f2f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0120 18:30:20.840178 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0120 18:30:20.842359 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1141381009/tls.crt::/tmp/serving-cert-1141381009/tls.key\\\\\\\"\\\\nI0120 18:30:26.279650 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0120 18:30:26.285211 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0120 18:30:26.285271 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0120 18:30:26.285327 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0120 18:30:26.285335 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0120 18:30:26.291960 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0120 18:30:26.291983 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291990 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291998 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0120 18:30:26.292002 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0120 18:30:26.292006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0120 18:30:26.292010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0120 18:30:26.292142 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0120 18:30:26.296450 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06ac78bba5a5a697d2246518bb04b7503b37c2fafd2369e5826dd02b467715d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T1
8:30:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:39Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:39 crc kubenswrapper[4773]: I0120 18:30:39.991079 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:39Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.001609 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.001659 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.001669 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.001683 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.001691 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:40Z","lastTransitionTime":"2026-01-20T18:30:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.003432 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gczfj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"357ca347-8fa9-4f0b-9f49-a540f14e0198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48c821d7f174268680f0e1197457126c88076eded93bbcddb5a0d60b1fa9fc80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4b9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gczfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:40Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.104761 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.104822 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.104844 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.104873 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.104897 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:40Z","lastTransitionTime":"2026-01-20T18:30:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.203082 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gbn6k"] Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.203764 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gbn6k" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.207064 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.207658 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.209437 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.210011 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.210086 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.210150 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.210177 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:40Z","lastTransitionTime":"2026-01-20T18:30:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.237702 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee8e82f1-ac2b-4e32-ace8-c3e6edfb55b4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17efc7a160f4cb934a3927c68e65b9f1387ed13881b8c9ca676d30c0d241ace0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b89a8e0fd81a1ccb6675b46b9a7425fec5158a53cc7e4314068fa737d0becda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f63cd12ee6d399e1a6f77a55f05a532a2de3b4cd05a5e10a93f521364473a403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d78e43bc7b09a1599f98e9e72fd0f49db94102e952a391f687d50abed277ceda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de08872865ebda9c9a4e1642d9c4d4f6c709908332ffbd091f992fad7722eca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:40Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.256151 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:40Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.271328 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://443153c40f503abca43f8d62276e852dfc67a4ad50beb817bcc4e7c9f1893d4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-20T18:30:40Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.289065 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kjbfj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ddd5104-3112-413e-b908-2b7f336b41f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a5c1f8407893fc5542044f2a612cb402c4529818efa2a76e03ecc09dafd07d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b981d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b981d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53e0dfb1a04051b30484f98897d86ff5adb9ef680437d01e43cd37b4a2404bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53e0dfb1a04051b30484f98897d86ff5adb9ef680437d01e43cd37b4a2404bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42063e8c09492c5670e90e5ced6405536f594cb5ae490cdacb00320e735d157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42063e8c09492c5670e90e5ced6405536f594cb5ae490cdacb00320e735d157\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kjbfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:40Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.307618 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bccxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"061a607e-1868-4fcf-b3ea-d51157511d41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5980e4a9cdd846b335d853a6c7c4fa7c9862c37c4858418919eae03855d094f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwtw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIP
s\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bccxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:40Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.312976 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.313056 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.313073 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.313132 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.313148 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:40Z","lastTransitionTime":"2026-01-20T18:30:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.320496 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gbn6k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7821f5e-4734-489f-bcf9-910b875a4848\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcp99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcp99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gbn6k\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:40Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.334148 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ddd934f-f012-4083-b5e6-b99711071621\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63daf0eeb8fcb1c4b47e205c8ea4ca5071153c573cce6d1f4638e0d44a96d47e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":
{\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e8755cb7894ae2c1832f3b0b8d8a6e33d3d336d00ce68a2afe004d82304dd24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sq4x7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:40Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 
18:30:40.362845 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f354424d-7f22-42d6-8bd9-00e32e78c3d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a000cfa332372a402d4d81eafc6f06b988196a3a847236a94d8166f0d745b07e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d36ecbc289d1bc2ebcba42b6cfe938692dd9b7e2b4e1adac18ac86a273d6bf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeac2ae7ead2dd1144c00709ea7162dfe1e360dc304726b031970bc989f11467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c6fa189877752f110d0f9539d08a9693872f9ee0350c63384acae1dea6afbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7767620b251e4c63c9397573c7915bd85831b833817ac6b939fc47d6304dc2bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc96ff67e107830b7e2fe8c3c7460ca9c7b22dfadab07ea69968162e9a1b4455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c27d9920ca807f9dedb59eb78d9e2a7bd2ec92d8a5a61180ee528fdee3ef07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://704f7cbd7bbc550f70fb00a77c807764dfbc4b7be701efa1b8730900d51a8f85\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-20T18:30:36Z\\\",\\\"message\\\":\\\"Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0120 18:30:36.720069 6124 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0120 18:30:36.720003 6124 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0120 18:30:36.720435 6124 
reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0120 18:30:36.720630 6124 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0120 18:30:36.720978 6124 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0120 18:30:36.721099 6124 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0120 18:30:36.721115 6124 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0120 18:30:36.721130 6124 factory.go:656] Stopping watch factory\\\\nI0120 18:30:36.721173 6124 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0120 18:30:36.721183 6124 handler.go:208] Removed *v1.Namespace ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:34Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04c27d9920ca807f9dedb59eb78d9e2a7bd2ec92d8a5a61180ee528fdee3ef07\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-20T18:30:39Z\\\",\\\"message\\\":\\\"ved *v1.Node event handler 7\\\\nI0120 18:30:38.824577 6252 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0120 18:30:38.824586 6252 handler.go:208] Removed *v1.Node event handler 2\\\\nI0120 18:30:38.824768 6252 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0120 18:30:38.824989 6252 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0120 18:30:38.825056 6252 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0120 18:30:38.825067 6252 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0120 18:30:38.825072 6252 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0120 18:30:38.825090 6252 factory.go:656] Stopping watch factory\\\\nI0120 18:30:38.825107 6252 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0120 18:30:38.825115 6252 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0120 18:30:38.825243 6252 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0120 18:30:38.825272 6252 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-
cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68d98d68d006df3d69e253299966bef755778e0cdab71a9ff91d2829ae761672\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},
{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c68391eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c68391eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qt89w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:40Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.363589 4773 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d7821f5e-4734-489f-bcf9-910b875a4848-env-overrides\") pod \"ovnkube-control-plane-749d76644c-gbn6k\" (UID: \"d7821f5e-4734-489f-bcf9-910b875a4848\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gbn6k" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.363657 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d7821f5e-4734-489f-bcf9-910b875a4848-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-gbn6k\" (UID: \"d7821f5e-4734-489f-bcf9-910b875a4848\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gbn6k" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.363706 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d7821f5e-4734-489f-bcf9-910b875a4848-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-gbn6k\" (UID: \"d7821f5e-4734-489f-bcf9-910b875a4848\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gbn6k" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.363751 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcp99\" (UniqueName: \"kubernetes.io/projected/d7821f5e-4734-489f-bcf9-910b875a4848-kube-api-access-lcp99\") pod \"ovnkube-control-plane-749d76644c-gbn6k\" (UID: \"d7821f5e-4734-489f-bcf9-910b875a4848\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gbn6k" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.374756 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-21 18:34:35.743154853 +0000 UTC Jan 20 
18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.377799 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de86ad0ce5469489a5f3eac4940913be3a897c3ec5f48526c7396b21bca92fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:40Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.391001 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965d3f61a7d29b4a851d13ea906f7d4d1809f89d314cd0e9b08ba897108625f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-id
entity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48a81307df70a49c4cf901ede80a07616e64babebf760b7fb3f51fc71743812a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:40Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.404786 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfcadb32-181d-4e84-8078-53323164fc49\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72535f55bb65c332cd4acb4ad0bf8b94a28a2167da3bba993de6ffb1d6dca1cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5be06527f116faac5a7ce7b2df804239b382a394e11c56eee69fa24739faab3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff759d4098d334dfada2f6a14e8a5d363019e0da19879b5dca11ff48b6273b24\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29039ffb258d5056b02a3ef4f48c732b8573d9437d032a225b59f0ba95527796\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:40Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.416028 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.416059 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.416071 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.416088 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.416099 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:40Z","lastTransitionTime":"2026-01-20T18:30:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.417242 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:40Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.429445 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gczfj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"357ca347-8fa9-4f0b-9f49-a540f14e0198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48c821d7f174268680f0e1197457126c88076eded93bbcddb5a0d60b1fa9fc80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4b9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gczfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:40Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.440839 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5sv79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a565d2f-43a1-41f5-b7a6-85d7d0aea0a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4eb26301ab8154925399887204867f95a36003c308ac49adb8a2867ae29ac39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dls8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5sv79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:40Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.446063 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.446075 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:30:40 crc kubenswrapper[4773]: E0120 18:30:40.446158 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 18:30:40 crc kubenswrapper[4773]: E0120 18:30:40.446393 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.457416 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9b329af-a6e2-4ba8-b70d-f1ad0cd67671\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13ef504628d3252d01c26801cf91a4fabdfbd5d8a78e60e9666beafa709816a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06
bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af183a195b31fc8b191dc75db28863447424fb2763e1f51a5f37ce78c4b046e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05f5db3e67f63a4ed226186eec7eebf95687a59c3d742a7d29b2be3f99543a0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":
\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a1628ccb969f4cdde3babedb802c79d7bba385260dcefc8140ed674fb423da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab6fab567cc35a379572c4ea5fb0f9472f3634fcede213202390e12c5cf6f2f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0120 18:30:20.840178 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0120 18:30:20.842359 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1141381009/tls.crt::/tmp/serving-cert-1141381009/tls.key\\\\\\\"\\\\nI0120 18:30:26.279650 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0120 18:30:26.285211 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0120 18:30:26.285271 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0120 18:30:26.285327 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0120 18:30:26.285335 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0120 18:30:26.291960 1 secure_serving.go:57] Forcing use of http/1.1 
only\\\\nW0120 18:30:26.291983 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291990 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291998 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0120 18:30:26.292002 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0120 18:30:26.292006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0120 18:30:26.292010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0120 18:30:26.292142 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0120 18:30:26.296450 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06ac78bba5a5a697d2246518bb04b7503b37c2fafd2369e5826dd02b467715d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"
state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:40Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.465303 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d7821f5e-4734-489f-bcf9-910b875a4848-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-gbn6k\" (UID: \"d7821f5e-4734-489f-bcf9-910b875a4848\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gbn6k" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.465418 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d7821f5e-4734-489f-bcf9-910b875a4848-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-gbn6k\" (UID: \"d7821f5e-4734-489f-bcf9-910b875a4848\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gbn6k" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.465486 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lcp99\" (UniqueName: \"kubernetes.io/projected/d7821f5e-4734-489f-bcf9-910b875a4848-kube-api-access-lcp99\") pod \"ovnkube-control-plane-749d76644c-gbn6k\" (UID: \"d7821f5e-4734-489f-bcf9-910b875a4848\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gbn6k" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.465595 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d7821f5e-4734-489f-bcf9-910b875a4848-env-overrides\") pod \"ovnkube-control-plane-749d76644c-gbn6k\" (UID: \"d7821f5e-4734-489f-bcf9-910b875a4848\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gbn6k" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.466353 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d7821f5e-4734-489f-bcf9-910b875a4848-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-gbn6k\" (UID: \"d7821f5e-4734-489f-bcf9-910b875a4848\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gbn6k" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.466673 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/d7821f5e-4734-489f-bcf9-910b875a4848-env-overrides\") pod \"ovnkube-control-plane-749d76644c-gbn6k\" (UID: \"d7821f5e-4734-489f-bcf9-910b875a4848\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gbn6k" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.469663 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:40Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.479474 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d7821f5e-4734-489f-bcf9-910b875a4848-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-gbn6k\" (UID: \"d7821f5e-4734-489f-bcf9-910b875a4848\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gbn6k" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.488375 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcp99\" (UniqueName: \"kubernetes.io/projected/d7821f5e-4734-489f-bcf9-910b875a4848-kube-api-access-lcp99\") pod \"ovnkube-control-plane-749d76644c-gbn6k\" (UID: \"d7821f5e-4734-489f-bcf9-910b875a4848\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gbn6k" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.519187 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 
18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.519273 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.519292 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.519321 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.519341 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:40Z","lastTransitionTime":"2026-01-20T18:30:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.526506 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gbn6k" Jan 20 18:30:40 crc kubenswrapper[4773]: W0120 18:30:40.544976 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7821f5e_4734_489f_bcf9_910b875a4848.slice/crio-204388184de819fe54a68d2183681a7223e09e2d5b5ac9fac5ba087adcb90d98 WatchSource:0}: Error finding container 204388184de819fe54a68d2183681a7223e09e2d5b5ac9fac5ba087adcb90d98: Status 404 returned error can't find the container with id 204388184de819fe54a68d2183681a7223e09e2d5b5ac9fac5ba087adcb90d98 Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.623267 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.623310 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.623322 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.623391 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.623405 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:40Z","lastTransitionTime":"2026-01-20T18:30:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.725250 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.725307 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.725325 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.725349 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.725368 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:40Z","lastTransitionTime":"2026-01-20T18:30:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.739694 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gbn6k" event={"ID":"d7821f5e-4734-489f-bcf9-910b875a4848","Type":"ContainerStarted","Data":"204388184de819fe54a68d2183681a7223e09e2d5b5ac9fac5ba087adcb90d98"} Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.743221 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qt89w_f354424d-7f22-42d6-8bd9-00e32e78c3d3/ovnkube-controller/1.log" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.746660 4773 scope.go:117] "RemoveContainer" containerID="04c27d9920ca807f9dedb59eb78d9e2a7bd2ec92d8a5a61180ee528fdee3ef07" Jan 20 18:30:40 crc kubenswrapper[4773]: E0120 18:30:40.746829 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-qt89w_openshift-ovn-kubernetes(f354424d-7f22-42d6-8bd9-00e32e78c3d3)\"" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" podUID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.758925 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfcadb32-181d-4e84-8078-53323164fc49\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72535f55bb65c332cd4acb4ad0bf8b94a28a2167da3bba993de6ffb1d6dca1cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5be06527f116faac5a7ce7b2df804239b382a394e11c56eee69fa24739faab3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff759d4098d334dfada2f6a14e8a5d363019e0da19879b5dca11ff48b6273b24\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29039ffb258d5056b02a3ef4f48c732b8573d9437d032a225b59f0ba95527796\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:40Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.770298 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:40Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.782490 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9b329af-a6e2-4ba8-b70d-f1ad0cd67671\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13ef504628d3252d01c26801cf91a4fabdfbd5d8a78e60e9666beafa709816a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af183a195b31fc8b191dc75db28863447424fb2763e1f51a5f37ce78c4b046e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05f5db3e67f63a4ed226186eec7eebf95687a59c3d742a7d29b2be3f99543a0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a1628ccb969f4cdde3babedb802c79d7bba385260dcefc8140ed674fb423da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab6fab567cc35a379572c4ea5fb0f9472f3634fcede213202390e12c5cf6f2f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-20T18:30:26Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0120 18:30:20.840178 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0120 18:30:20.842359 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1141381009/tls.crt::/tmp/serving-cert-1141381009/tls.key\\\\\\\"\\\\nI0120 18:30:26.279650 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0120 18:30:26.285211 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0120 18:30:26.285271 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0120 18:30:26.285327 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0120 18:30:26.285335 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0120 18:30:26.291960 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0120 18:30:26.291983 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291990 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291998 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0120 18:30:26.292002 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0120 18:30:26.292006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0120 18:30:26.292010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0120 18:30:26.292142 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0120 18:30:26.296450 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06ac78bba5a5a697d2246518bb04b7503b37c2fafd2369e5826dd02b467715d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727
ee9876662924585010f5fa93ee1d554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:40Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.794469 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:40Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.803217 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gczfj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"357ca347-8fa9-4f0b-9f49-a540f14e0198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48c821d7f174268680f0e1197457126c88076eded93bbcddb5a0d60b1fa9fc80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4b9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gczfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:40Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.812910 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5sv79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a565d2f-43a1-41f5-b7a6-85d7d0aea0a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4eb26301ab8154925399887204867f95a36003c308ac49adb8a2867ae29ac39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dls8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5sv79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:40Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.824104 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gbn6k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7821f5e-4734-489f-bcf9-910b875a4848\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcp99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcp99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gbn6k\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:40Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.827882 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.827916 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.827928 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.827957 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.827969 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:40Z","lastTransitionTime":"2026-01-20T18:30:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.846273 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee8e82f1-ac2b-4e32-ace8-c3e6edfb55b4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17efc7a160f4cb934a3927c68e65b9f1387ed13881b8c9ca676d30c0d241ace0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b89a8e0fd81a1ccb6675b46b9a7425fec5158a53cc7e4314068fa737d0becda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f63cd12ee6d399e1a6f77a55f05a532a2de3b4cd05a5e10a93f521364473a403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d78e43bc7b09a1599f98e9e72fd0f49db94102e952a391f687d50abed277ceda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de08872865ebda9c9a4e1642d9c4d4f6c709908332ffbd091f992fad7722eca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:40Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.861760 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:40Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.873445 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://443153c40f503abca43f8d62276e852dfc67a4ad50beb817bcc4e7c9f1893d4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-20T18:30:40Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.892189 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kjbfj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ddd5104-3112-413e-b908-2b7f336b41f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a5c1f8407893fc5542044f2a612cb402c4529818efa2a76e03ecc09dafd07d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b981d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b981d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53e0dfb1a04051b30484f98897d86ff5adb9ef680437d01e43cd37b4a2404bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53e0dfb1a04051b30484f98897d86ff5adb9ef680437d01e43cd37b4a2404bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42063e8c09492c5670e90e5ced6405536f594cb5ae490cdacb00320e735d157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42063e8c09492c5670e90e5ced6405536f594cb5ae490cdacb00320e735d157\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kjbfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:40Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.913174 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bccxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"061a607e-1868-4fcf-b3ea-d51157511d41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5980e4a9cdd846b335d853a6c7c4fa7c9862c37c4858418919eae03855d094f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwtw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIP
s\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bccxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:40Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.931815 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de86ad0ce5469489a5f3eac4940913be3a897c3ec5f48526c7396b21bca92fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\
":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:40Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.932888 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.932925 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.932937 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.932977 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.932993 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:40Z","lastTransitionTime":"2026-01-20T18:30:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.945173 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-4jpbd"] Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.945615 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:30:40 crc kubenswrapper[4773]: E0120 18:30:40.945678 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jpbd" podUID="3791c4b7-dcef-470d-a67e-a2c0bb004436" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.949851 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965d3f61a7d29b4a851d13ea906f7d4d1809f89d314cd0e9b08ba897108625f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48a81307df70a49c4cf901ede80a07616e64babebf760b7fb3f51fc71743812a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:40Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.962452 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ddd934f-f012-4083-b5e6-b99711071621\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63daf0eeb8fcb1c4b47e205c8ea4ca5071153c573cce6d1f4638e0d44a96d47e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e8755cb7894ae2c1832f3b0b8d8a6e33d3d336d
00ce68a2afe004d82304dd24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sq4x7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:40Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:40 crc kubenswrapper[4773]: I0120 18:30:40.992747 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f354424d-7f22-42d6-8bd9-00e32e78c3d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a000cfa332372a402d4d81eafc6f06b988196a3a847236a94d8166f0d745b07e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d36ecbc289d1bc2ebcba42b6cfe938692dd9b7e2b4e1adac18ac86a273d6bf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeac2ae7ead2dd1144c00709ea7162dfe1e360dc304726b031970bc989f11467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c6fa189877752f110d0f9539d08a9693872f9ee0350c63384acae1dea6afbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7767620b251e4c63c9397573c7915bd85831b833817ac6b939fc47d6304dc2bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc96ff67e107830b7e2fe8c3c7460ca9c7b22dfadab07ea69968162e9a1b4455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c27d9920ca807f9dedb59eb78d9e2a7bd2ec92d8a5a61180ee528fdee3ef07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04c27d9920ca807f9dedb59eb78d9e2a7bd2ec92d8a5a61180ee528fdee3ef07\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-20T18:30:39Z\\\",\\\"message\\\":\\\"ved *v1.Node event handler 7\\\\nI0120 18:30:38.824577 6252 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0120 18:30:38.824586 6252 handler.go:208] Removed *v1.Node event handler 2\\\\nI0120 18:30:38.824768 6252 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0120 
18:30:38.824989 6252 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0120 18:30:38.825056 6252 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0120 18:30:38.825067 6252 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0120 18:30:38.825072 6252 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0120 18:30:38.825090 6252 factory.go:656] Stopping watch factory\\\\nI0120 18:30:38.825107 6252 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0120 18:30:38.825115 6252 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0120 18:30:38.825243 6252 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0120 18:30:38.825272 6252 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qt89w_openshift-ovn-kubernetes(f354424d-7f22-42d6-8bd9-00e32e78c3d3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68d98d68d006df3d69e253299966bef755778e0cdab71a9ff91d2829ae761672\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c68391eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c683
91eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qt89w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:40Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.008958 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:41Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.026607 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://443153c40f503abca43f8d62276e852dfc67a4ad50beb817bcc4e7c9f1893d4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-20T18:30:41Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.035361 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.035402 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.035414 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.035432 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.035442 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:41Z","lastTransitionTime":"2026-01-20T18:30:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.045308 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kjbfj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ddd5104-3112-413e-b908-2b7f336b41f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a5c1f8407893fc5542044f2a612cb402c4529818efa2a76e03ecc09dafd07d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b981d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b981d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53e0dfb1a04051b30484f98897d86ff5adb9ef680437d01e43cd37b4a2404bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53e0dfb1a04051b30484f98897d86ff5adb9ef680437d01e43cd37b4a2404bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42063e8c09492c5670e90e5ced6405536f594cb5ae490cdacb00320e735d157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42063e8c09492c5670e90e5ced6405536f594cb5ae490cdacb00320e735d157\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kjbfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:41Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.065190 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bccxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"061a607e-1868-4fcf-b3ea-d51157511d41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5980e4a9cdd846b335d853a6c7c4fa7c9862c37c4858418919eae03855d094f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7
eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwtw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\"
:\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bccxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:41Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.072251 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3791c4b7-dcef-470d-a67e-a2c0bb004436-metrics-certs\") pod \"network-metrics-daemon-4jpbd\" (UID: \"3791c4b7-dcef-470d-a67e-a2c0bb004436\") " pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.072309 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66lgr\" (UniqueName: \"kubernetes.io/projected/3791c4b7-dcef-470d-a67e-a2c0bb004436-kube-api-access-66lgr\") pod \"network-metrics-daemon-4jpbd\" (UID: \"3791c4b7-dcef-470d-a67e-a2c0bb004436\") " pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.086521 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gbn6k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7821f5e-4734-489f-bcf9-910b875a4848\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcp99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcp99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gbn6k\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:41Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.119534 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee8e82f1-ac2b-4e32-ace8-c3e6edfb55b4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17efc7a160f4cb934a3927c68e65b9f1387ed13881b8c9ca676d30c0d241ace0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\
"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b89a8e0fd81a1ccb6675b46b9a7425fec5158a53cc7e4314068fa737d0becda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f63cd12ee6d399e1a6f77a55f05a532a2de3b4cd05a5e10a93f521364473a403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d78e43bc7b09a1599f98e9e72
fd0f49db94102e952a391f687d50abed277ceda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de08872865ebda9c9a4e1642d9c4d4f6c709908332ffbd091f992fad7722eca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b9
0092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"t
erminated\\\":{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:41Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.138706 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.138784 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.138801 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.138832 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.138849 4773 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:41Z","lastTransitionTime":"2026-01-20T18:30:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.142116 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f354424d-7f22-42d6-8bd9-00e32e78c3d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a000cfa332372a402d4d81eafc6f06b988196a3a847236a94d8166f0d745b07e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d36ecbc289d1bc2ebcba42b6cfe938692dd9b7e2b4e1adac18ac86a273d6bf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeac2ae7ead2dd1144c00709ea7162dfe1e360dc304726b031970bc989f11467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c6fa189877752f110d0f9539d08a9693872f9ee0350c63384acae1dea6afbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7767620b251e4c63c9397573c7915bd85831b833817ac6b939fc47d6304dc2bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc96ff67e107830b7e2fe8c3c7460ca9c7b22dfadab07ea69968162e9a1b4455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c27d9920ca807f9dedb59eb78d9e2a7bd2ec92d8a5a61180ee528fdee3ef07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04c27d9920ca807f9dedb59eb78d9e2a7bd2ec92d8a5a61180ee528fdee3ef07\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-20T18:30:39Z\\\",\\\"message\\\":\\\"ved *v1.Node event handler 7\\\\nI0120 18:30:38.824577 6252 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0120 18:30:38.824586 6252 handler.go:208] Removed *v1.Node event handler 2\\\\nI0120 18:30:38.824768 6252 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0120 
18:30:38.824989 6252 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0120 18:30:38.825056 6252 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0120 18:30:38.825067 6252 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0120 18:30:38.825072 6252 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0120 18:30:38.825090 6252 factory.go:656] Stopping watch factory\\\\nI0120 18:30:38.825107 6252 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0120 18:30:38.825115 6252 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0120 18:30:38.825243 6252 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0120 18:30:38.825272 6252 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qt89w_openshift-ovn-kubernetes(f354424d-7f22-42d6-8bd9-00e32e78c3d3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68d98d68d006df3d69e253299966bef755778e0cdab71a9ff91d2829ae761672\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c68391eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c683
91eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qt89w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:41Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.154647 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4jpbd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3791c4b7-dcef-470d-a67e-a2c0bb004436\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66lgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66lgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4jpbd\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:41Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.171051 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de86ad0ce5469489a5f3eac4940913be3a897c3ec5f48526c7396b21bca92fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/se
rving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:41Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.173778 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3791c4b7-dcef-470d-a67e-a2c0bb004436-metrics-certs\") pod \"network-metrics-daemon-4jpbd\" (UID: \"3791c4b7-dcef-470d-a67e-a2c0bb004436\") " pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.173989 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66lgr\" (UniqueName: \"kubernetes.io/projected/3791c4b7-dcef-470d-a67e-a2c0bb004436-kube-api-access-66lgr\") pod \"network-metrics-daemon-4jpbd\" (UID: \"3791c4b7-dcef-470d-a67e-a2c0bb004436\") " pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:30:41 crc kubenswrapper[4773]: E0120 18:30:41.174040 4773 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 20 18:30:41 crc kubenswrapper[4773]: E0120 18:30:41.174336 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3791c4b7-dcef-470d-a67e-a2c0bb004436-metrics-certs podName:3791c4b7-dcef-470d-a67e-a2c0bb004436 nodeName:}" failed. 
No retries permitted until 2026-01-20 18:30:41.674303343 +0000 UTC m=+34.596116377 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3791c4b7-dcef-470d-a67e-a2c0bb004436-metrics-certs") pod "network-metrics-daemon-4jpbd" (UID: "3791c4b7-dcef-470d-a67e-a2c0bb004436") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.190217 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965d3f61a7d29b4a851d13ea906f7d4d1809f89d314cd0e9b08ba897108625f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\
\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48a81307df70a49c4cf901ede80a07616e64babebf760b7fb3f51fc71743812a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:41Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.198477 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66lgr\" (UniqueName: \"kubernetes.io/projected/3791c4b7-dcef-470d-a67e-a2c0bb004436-kube-api-access-66lgr\") pod \"network-metrics-daemon-4jpbd\" 
(UID: \"3791c4b7-dcef-470d-a67e-a2c0bb004436\") " pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.210643 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ddd934f-f012-4083-b5e6-b99711071621\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63daf0eeb8fcb1c4b47e205c8ea4ca5071153c573cce6d1f4638e0d44a96d47e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/et
c/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e8755cb7894ae2c1832f3b0b8d8a6e33d3d336d00ce68a2afe004d82304dd24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sq4x7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:41Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.224664 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfcadb32-181d-4e84-8078-53323164fc49\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72535f55bb65c332cd4acb4ad0bf8b94a28a2167da3bba993de6ffb1d6dca1cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5be06527f116faac5a7ce7b2df804239b382a394e11c56eee69fa24739faab3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff759d4098d334dfada2f6a14e8a5d363019e0da19879b5dca11ff48b6273b24\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29039ffb258d5056b02a3ef4f48c732b8573d9437d032a225b59f0ba95527796\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:41Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.240858 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:41Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.242057 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.242134 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.242156 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:41 crc 
kubenswrapper[4773]: I0120 18:30:41.242193 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.242217 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:41Z","lastTransitionTime":"2026-01-20T18:30:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.255543 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5sv79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a565d2f-43a1-41f5-b7a6-85d7d0aea0a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4eb26301ab8154925399887204867f95a36003c308ac49adb8a2867ae29ac39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1
e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dls8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5sv79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:41Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.271893 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9b329af-a6e2-4ba8-b70d-f1ad0cd67671\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13ef504628d3252d01c26801cf91a4fabdfbd5d8a78e60e9666beafa709816a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af183a195b31fc8b191dc75db28863447424fb2763e1f51a5f37ce78c4b046e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05f5db3e67f63a4ed226186eec7eebf95687a59c3d742a7d29b2be3f99543a0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a1628ccb969f4cdde3babedb802c79d7bba385260dcefc8140ed674fb423da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab6fab567cc35a379572c4ea5fb0f9472f3634fcede213202390e12c5cf6f2f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-20T18:30:26Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0120 18:30:20.840178 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0120 18:30:20.842359 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1141381009/tls.crt::/tmp/serving-cert-1141381009/tls.key\\\\\\\"\\\\nI0120 18:30:26.279650 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0120 18:30:26.285211 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0120 18:30:26.285271 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0120 18:30:26.285327 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0120 18:30:26.285335 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0120 18:30:26.291960 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0120 18:30:26.291983 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291990 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291998 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0120 18:30:26.292002 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0120 18:30:26.292006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0120 18:30:26.292010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0120 18:30:26.292142 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0120 18:30:26.296450 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06ac78bba5a5a697d2246518bb04b7503b37c2fafd2369e5826dd02b467715d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727
ee9876662924585010f5fa93ee1d554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:41Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.291157 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:41Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.306855 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gczfj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"357ca347-8fa9-4f0b-9f49-a540f14e0198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48c821d7f174268680f0e1197457126c88076eded93bbcddb5a0d60b1fa9fc80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4b9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gczfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:41Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.346460 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.346513 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.346524 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.346542 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.346554 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:41Z","lastTransitionTime":"2026-01-20T18:30:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.374906 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-21 08:41:27.866268921 +0000 UTC Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.446634 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:30:41 crc kubenswrapper[4773]: E0120 18:30:41.446867 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.449609 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.449663 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.449675 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.449690 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.449702 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:41Z","lastTransitionTime":"2026-01-20T18:30:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.552629 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.552692 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.552710 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.552737 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.552756 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:41Z","lastTransitionTime":"2026-01-20T18:30:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.655061 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.655102 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.655115 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.655131 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.655145 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:41Z","lastTransitionTime":"2026-01-20T18:30:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.680914 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3791c4b7-dcef-470d-a67e-a2c0bb004436-metrics-certs\") pod \"network-metrics-daemon-4jpbd\" (UID: \"3791c4b7-dcef-470d-a67e-a2c0bb004436\") " pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:30:41 crc kubenswrapper[4773]: E0120 18:30:41.681145 4773 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 20 18:30:41 crc kubenswrapper[4773]: E0120 18:30:41.681211 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3791c4b7-dcef-470d-a67e-a2c0bb004436-metrics-certs podName:3791c4b7-dcef-470d-a67e-a2c0bb004436 nodeName:}" failed. No retries permitted until 2026-01-20 18:30:42.681195446 +0000 UTC m=+35.603008480 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3791c4b7-dcef-470d-a67e-a2c0bb004436-metrics-certs") pod "network-metrics-daemon-4jpbd" (UID: "3791c4b7-dcef-470d-a67e-a2c0bb004436") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.753574 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gbn6k" event={"ID":"d7821f5e-4734-489f-bcf9-910b875a4848","Type":"ContainerStarted","Data":"f830c4eb2fd242a6e650833aeafd5935d16e005b2faa950f6507f656293ca458"} Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.753869 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gbn6k" event={"ID":"d7821f5e-4734-489f-bcf9-910b875a4848","Type":"ContainerStarted","Data":"f76276c73de548ee85dd07ca424e1f6df2c7da35e7e57880473fc1175983fb37"} Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.757632 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.757678 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.757691 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.757707 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.757720 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:41Z","lastTransitionTime":"2026-01-20T18:30:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.770357 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfcadb32-181d-4e84-8078-53323164fc49\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72535f55bb65c332cd4acb4ad0bf8b94a28a2167da3bba993de6ffb1d6dca1cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPat
h\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5be06527f116faac5a7ce7b2df804239b382a394e11c56eee69fa24739faab3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff759d4098d334dfada2f6a14e8a5d363019e0da19879b5dca11ff48b6273b24\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29039ffb258d5056b02a3ef4f48c732b8573d9437d032a225b59f0ba95527796\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed0828
7faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:41Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.784331 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:41Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.800153 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9b329af-a6e2-4ba8-b70d-f1ad0cd67671\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13ef504628d3252d01c26801cf91a4fabdfbd5d8a78e60e9666beafa709816a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af183a195b31fc8b191dc75db28863447424fb2763e1f51a5f37ce78c4b046e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05f5db3e67f63a4ed226186eec7eebf95687a59c3d742a7d29b2be3f99543a0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a1628ccb969f4cdde3babedb802c79d7bba385260dcefc8140ed674fb423da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab6fab567cc35a379572c4ea5fb0f9472f3634fcede213202390e12c5cf6f2f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-20T18:30:26Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0120 18:30:20.840178 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0120 18:30:20.842359 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1141381009/tls.crt::/tmp/serving-cert-1141381009/tls.key\\\\\\\"\\\\nI0120 18:30:26.279650 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0120 18:30:26.285211 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0120 18:30:26.285271 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0120 18:30:26.285327 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0120 18:30:26.285335 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0120 18:30:26.291960 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0120 18:30:26.291983 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291990 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291998 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0120 18:30:26.292002 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0120 18:30:26.292006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0120 18:30:26.292010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0120 18:30:26.292142 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0120 18:30:26.296450 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06ac78bba5a5a697d2246518bb04b7503b37c2fafd2369e5826dd02b467715d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727
ee9876662924585010f5fa93ee1d554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:41Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.816713 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:41Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.831088 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gczfj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"357ca347-8fa9-4f0b-9f49-a540f14e0198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48c821d7f174268680f0e1197457126c88076eded93bbcddb5a0d60b1fa9fc80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4b9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gczfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:41Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.843132 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5sv79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a565d2f-43a1-41f5-b7a6-85d7d0aea0a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4eb26301ab8154925399887204867f95a36003c308ac49adb8a2867ae29ac39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dls8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5sv79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:41Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.860482 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.860558 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.860582 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.860613 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.860636 4773 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:41Z","lastTransitionTime":"2026-01-20T18:30:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.865156 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kjbfj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ddd5104-3112-413e-b908-2b7f336b41f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a5c1f8407893fc5542044f2a612cb402c4529818efa2a76e03ecc09dafd07d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-ad
ditional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df31
2ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Complet
ed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b981d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b981d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53e0dfb1a04051b30484f98897d86ff5adb9ef680437d01e43cd37b4a2404bce\\\",\\\"image\\\":\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53e0dfb1a04051b30484f98897d86ff5adb9ef680437d01e43cd37b4a2404bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42063e8c09492c5670e90e5ced6405536f594cb5ae490cdacb00320e735d157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42063e8c09492c5670e90e5ced6405536f594cb5ae490cdacb00320e735d157\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kjbfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:41Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.888077 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bccxn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"061a607e-1868-4fcf-b3ea-d51157511d41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5980e4a9cdd846b335d853a6c7c4fa7c9862c37c4858418919eae03855d094f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwtw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bccxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:41Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.901319 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gbn6k" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7821f5e-4734-489f-bcf9-910b875a4848\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f76276c73de548ee85dd07ca424e1f6df2c7da35e7e57880473fc1175983fb37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcp99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f830c4eb2
fd242a6e650833aeafd5935d16e005b2faa950f6507f656293ca458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcp99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gbn6k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:41Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.927573 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee8e82f1-ac2b-4e32-ace8-c3e6edfb55b4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17efc7a160f4cb934a3927c68e65b9f1387ed13881b8c9ca676d30c0d241ace0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b89a8e0fd81a1ccb6675b46b9a7425fec5158a53cc7e4314068fa737d0becda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f63cd12ee6d399e1a6f77a55f05a532a2de3b4cd05a5e10a93f521364473a403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d78e43bc7b09a1599f98e9e72fd0f49db94102e952a391f687d50abed277ceda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de08872865ebda9c9a4e1642d9c4d4f6c709908332ffbd091f992fad7722eca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:41Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.943043 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:41Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.957799 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://443153c40f503abca43f8d62276e852dfc67a4ad50beb817bcc4e7c9f1893d4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-20T18:30:41Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.963213 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.963269 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.963280 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.963302 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.963318 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:41Z","lastTransitionTime":"2026-01-20T18:30:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.972492 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de86ad0ce5469489a5f3eac4940913be3a897c3ec5f48526c7396b21bca92fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:41Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:41 crc kubenswrapper[4773]: I0120 18:30:41.989517 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965d3f61a7d29b4a851d13ea906f7d4d1809f89d314cd0e9b08ba897108625f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-id
entity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48a81307df70a49c4cf901ede80a07616e64babebf760b7fb3f51fc71743812a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:41Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.002501 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ddd934f-f012-4083-b5e6-b99711071621\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63daf0eeb8fcb1c4b47e205c8ea4ca5071153c573cce6d1f4638e0d44a96d47e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e8755cb7894ae2c1832f3b0b8d8a6e33d3d336d
00ce68a2afe004d82304dd24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sq4x7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:42Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.019448 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f354424d-7f22-42d6-8bd9-00e32e78c3d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a000cfa332372a402d4d81eafc6f06b988196a3a847236a94d8166f0d745b07e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d36ecbc289d1bc2ebcba42b6cfe938692dd9b7e2b4e1adac18ac86a273d6bf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeac2ae7ead2dd1144c00709ea7162dfe1e360dc304726b031970bc989f11467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c6fa189877752f110d0f9539d08a9693872f9ee0350c63384acae1dea6afbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7767620b251e4c63c9397573c7915bd85831b833817ac6b939fc47d6304dc2bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc96ff67e107830b7e2fe8c3c7460ca9c7b22dfadab07ea69968162e9a1b4455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c27d9920ca807f9dedb59eb78d9e2a7bd2ec92d8a5a61180ee528fdee3ef07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04c27d9920ca807f9dedb59eb78d9e2a7bd2ec92d8a5a61180ee528fdee3ef07\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-20T18:30:39Z\\\",\\\"message\\\":\\\"ved *v1.Node event handler 7\\\\nI0120 18:30:38.824577 6252 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0120 18:30:38.824586 6252 handler.go:208] Removed *v1.Node event handler 2\\\\nI0120 18:30:38.824768 6252 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0120 
18:30:38.824989 6252 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0120 18:30:38.825056 6252 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0120 18:30:38.825067 6252 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0120 18:30:38.825072 6252 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0120 18:30:38.825090 6252 factory.go:656] Stopping watch factory\\\\nI0120 18:30:38.825107 6252 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0120 18:30:38.825115 6252 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0120 18:30:38.825243 6252 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0120 18:30:38.825272 6252 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qt89w_openshift-ovn-kubernetes(f354424d-7f22-42d6-8bd9-00e32e78c3d3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68d98d68d006df3d69e253299966bef755778e0cdab71a9ff91d2829ae761672\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c68391eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c683
91eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qt89w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:42Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.030479 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4jpbd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3791c4b7-dcef-470d-a67e-a2c0bb004436\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66lgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66lgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4jpbd\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:42Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.066510 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.066634 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.066684 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.066705 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.066718 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:42Z","lastTransitionTime":"2026-01-20T18:30:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.086011 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.086235 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:30:42 crc kubenswrapper[4773]: E0120 18:30:42.086329 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 18:30:58.086269912 +0000 UTC m=+51.008082976 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:30:42 crc kubenswrapper[4773]: E0120 18:30:42.086442 4773 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 20 18:30:42 crc kubenswrapper[4773]: E0120 18:30:42.086647 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-20 18:30:58.08662947 +0000 UTC m=+51.008442534 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.182897 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.182969 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.182981 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.182997 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.183008 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:42Z","lastTransitionTime":"2026-01-20T18:30:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.187492 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.187544 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.187570 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:30:42 crc kubenswrapper[4773]: E0120 18:30:42.187709 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 20 18:30:42 crc kubenswrapper[4773]: E0120 18:30:42.187716 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 20 18:30:42 crc kubenswrapper[4773]: E0120 18:30:42.187733 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 20 18:30:42 crc kubenswrapper[4773]: E0120 18:30:42.187744 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 20 18:30:42 crc kubenswrapper[4773]: E0120 18:30:42.187752 4773 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 20 18:30:42 crc kubenswrapper[4773]: E0120 18:30:42.187758 4773 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 20 18:30:42 crc kubenswrapper[4773]: E0120 18:30:42.187805 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-20 18:30:58.187788932 +0000 UTC m=+51.109601976 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 20 18:30:42 crc kubenswrapper[4773]: E0120 18:30:42.187829 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-20 18:30:58.187817872 +0000 UTC m=+51.109630906 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 20 18:30:42 crc kubenswrapper[4773]: E0120 18:30:42.187832 4773 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 20 18:30:42 crc kubenswrapper[4773]: E0120 18:30:42.187879 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-20 18:30:58.187860103 +0000 UTC m=+51.109673137 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.286225 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.286278 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.286289 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.286308 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.286320 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:42Z","lastTransitionTime":"2026-01-20T18:30:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.376208 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 19:40:30.220960774 +0000 UTC Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.390450 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.390490 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.390499 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.390516 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.390527 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:42Z","lastTransitionTime":"2026-01-20T18:30:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.446959 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.447076 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:30:42 crc kubenswrapper[4773]: E0120 18:30:42.447182 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jpbd" podUID="3791c4b7-dcef-470d-a67e-a2c0bb004436" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.447076 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:30:42 crc kubenswrapper[4773]: E0120 18:30:42.447247 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 18:30:42 crc kubenswrapper[4773]: E0120 18:30:42.447374 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.494150 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.494218 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.494237 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.494263 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.494283 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:42Z","lastTransitionTime":"2026-01-20T18:30:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.598324 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.598411 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.598432 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.598460 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.598483 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:42Z","lastTransitionTime":"2026-01-20T18:30:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.695092 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3791c4b7-dcef-470d-a67e-a2c0bb004436-metrics-certs\") pod \"network-metrics-daemon-4jpbd\" (UID: \"3791c4b7-dcef-470d-a67e-a2c0bb004436\") " pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:30:42 crc kubenswrapper[4773]: E0120 18:30:42.695347 4773 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 20 18:30:42 crc kubenswrapper[4773]: E0120 18:30:42.695497 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3791c4b7-dcef-470d-a67e-a2c0bb004436-metrics-certs podName:3791c4b7-dcef-470d-a67e-a2c0bb004436 nodeName:}" failed. No retries permitted until 2026-01-20 18:30:44.695463053 +0000 UTC m=+37.617276107 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3791c4b7-dcef-470d-a67e-a2c0bb004436-metrics-certs") pod "network-metrics-daemon-4jpbd" (UID: "3791c4b7-dcef-470d-a67e-a2c0bb004436") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.703573 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.703636 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.703654 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.703682 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.703701 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:42Z","lastTransitionTime":"2026-01-20T18:30:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.808685 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.808758 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.808781 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.808810 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.808832 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:42Z","lastTransitionTime":"2026-01-20T18:30:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.912882 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.912978 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.913001 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.913027 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.913047 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:42Z","lastTransitionTime":"2026-01-20T18:30:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.935506 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.935558 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.935610 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.935637 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.935659 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:42Z","lastTransitionTime":"2026-01-20T18:30:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:42 crc kubenswrapper[4773]: E0120 18:30:42.957423 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ac96020-64a6-43b4-8bf4-975de5898510\\\",\\\"systemUUID\\\":\\\"3435f284-a40d-4f32-a1fa-55cd3339f30e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:42Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.963384 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.963452 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.963472 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.963508 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.963529 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:42Z","lastTransitionTime":"2026-01-20T18:30:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:42 crc kubenswrapper[4773]: E0120 18:30:42.989618 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ac96020-64a6-43b4-8bf4-975de5898510\\\",\\\"systemUUID\\\":\\\"3435f284-a40d-4f32-a1fa-55cd3339f30e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:42Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.994621 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.994693 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.994710 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.994733 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:42 crc kubenswrapper[4773]: I0120 18:30:42.994747 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:42Z","lastTransitionTime":"2026-01-20T18:30:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:43 crc kubenswrapper[4773]: E0120 18:30:43.011802 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ac96020-64a6-43b4-8bf4-975de5898510\\\",\\\"systemUUID\\\":\\\"3435f284-a40d-4f32-a1fa-55cd3339f30e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:43Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.017406 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.017489 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.017505 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.017524 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.017538 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:43Z","lastTransitionTime":"2026-01-20T18:30:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:43 crc kubenswrapper[4773]: E0120 18:30:43.036797 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ac96020-64a6-43b4-8bf4-975de5898510\\\",\\\"systemUUID\\\":\\\"3435f284-a40d-4f32-a1fa-55cd3339f30e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:43Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.042356 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.042445 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.042469 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.042504 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.042529 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:43Z","lastTransitionTime":"2026-01-20T18:30:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:43 crc kubenswrapper[4773]: E0120 18:30:43.064678 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:43Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:43Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:43Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:43Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ac96020-64a6-43b4-8bf4-975de5898510\\\",\\\"systemUUID\\\":\\\"3435f284-a40d-4f32-a1fa-55cd3339f30e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:43Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:43 crc kubenswrapper[4773]: E0120 18:30:43.064806 4773 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.066707 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.066755 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.066769 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.066791 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.066810 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:43Z","lastTransitionTime":"2026-01-20T18:30:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.170240 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.170287 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.170300 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.170315 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.170327 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:43Z","lastTransitionTime":"2026-01-20T18:30:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.273262 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.273317 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.273329 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.273345 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.273355 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:43Z","lastTransitionTime":"2026-01-20T18:30:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.376077 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.376139 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.376148 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.376167 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.376177 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:43Z","lastTransitionTime":"2026-01-20T18:30:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.376361 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 01:16:13.469981906 +0000 UTC Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.446167 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:30:43 crc kubenswrapper[4773]: E0120 18:30:43.446316 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.479719 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.479770 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.479780 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.479799 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.479808 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:43Z","lastTransitionTime":"2026-01-20T18:30:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.583558 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.583665 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.583686 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.583720 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.583741 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:43Z","lastTransitionTime":"2026-01-20T18:30:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.687465 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.687552 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.687577 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.687606 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.687625 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:43Z","lastTransitionTime":"2026-01-20T18:30:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.790832 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.790872 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.790879 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.790892 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.790902 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:43Z","lastTransitionTime":"2026-01-20T18:30:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.894408 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.894487 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.894507 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.894538 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.894562 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:43Z","lastTransitionTime":"2026-01-20T18:30:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.997703 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.997793 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.997817 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.997847 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:43 crc kubenswrapper[4773]: I0120 18:30:43.997867 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:43Z","lastTransitionTime":"2026-01-20T18:30:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:44 crc kubenswrapper[4773]: I0120 18:30:44.101255 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:44 crc kubenswrapper[4773]: I0120 18:30:44.101304 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:44 crc kubenswrapper[4773]: I0120 18:30:44.101316 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:44 crc kubenswrapper[4773]: I0120 18:30:44.101331 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:44 crc kubenswrapper[4773]: I0120 18:30:44.101342 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:44Z","lastTransitionTime":"2026-01-20T18:30:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:44 crc kubenswrapper[4773]: I0120 18:30:44.205169 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:44 crc kubenswrapper[4773]: I0120 18:30:44.205247 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:44 crc kubenswrapper[4773]: I0120 18:30:44.205273 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:44 crc kubenswrapper[4773]: I0120 18:30:44.205309 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:44 crc kubenswrapper[4773]: I0120 18:30:44.205332 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:44Z","lastTransitionTime":"2026-01-20T18:30:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:44 crc kubenswrapper[4773]: I0120 18:30:44.309972 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:44 crc kubenswrapper[4773]: I0120 18:30:44.310022 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:44 crc kubenswrapper[4773]: I0120 18:30:44.310040 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:44 crc kubenswrapper[4773]: I0120 18:30:44.310066 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:44 crc kubenswrapper[4773]: I0120 18:30:44.310087 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:44Z","lastTransitionTime":"2026-01-20T18:30:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:44 crc kubenswrapper[4773]: I0120 18:30:44.376977 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 12:52:24.742014989 +0000 UTC Jan 20 18:30:44 crc kubenswrapper[4773]: I0120 18:30:44.413487 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:44 crc kubenswrapper[4773]: I0120 18:30:44.413541 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:44 crc kubenswrapper[4773]: I0120 18:30:44.413553 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:44 crc kubenswrapper[4773]: I0120 18:30:44.413573 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:44 crc kubenswrapper[4773]: I0120 18:30:44.413591 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:44Z","lastTransitionTime":"2026-01-20T18:30:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:44 crc kubenswrapper[4773]: I0120 18:30:44.446277 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:30:44 crc kubenswrapper[4773]: I0120 18:30:44.446354 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:30:44 crc kubenswrapper[4773]: I0120 18:30:44.446316 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:30:44 crc kubenswrapper[4773]: E0120 18:30:44.446500 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 18:30:44 crc kubenswrapper[4773]: E0120 18:30:44.446609 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 18:30:44 crc kubenswrapper[4773]: E0120 18:30:44.446749 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4jpbd" podUID="3791c4b7-dcef-470d-a67e-a2c0bb004436" Jan 20 18:30:44 crc kubenswrapper[4773]: I0120 18:30:44.516812 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:44 crc kubenswrapper[4773]: I0120 18:30:44.516846 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:44 crc kubenswrapper[4773]: I0120 18:30:44.516855 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:44 crc kubenswrapper[4773]: I0120 18:30:44.516885 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:44 crc kubenswrapper[4773]: I0120 18:30:44.516896 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:44Z","lastTransitionTime":"2026-01-20T18:30:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:44 crc kubenswrapper[4773]: I0120 18:30:44.623983 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:44 crc kubenswrapper[4773]: I0120 18:30:44.624038 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:44 crc kubenswrapper[4773]: I0120 18:30:44.624048 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:44 crc kubenswrapper[4773]: I0120 18:30:44.624065 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:44 crc kubenswrapper[4773]: I0120 18:30:44.624076 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:44Z","lastTransitionTime":"2026-01-20T18:30:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:44 crc kubenswrapper[4773]: I0120 18:30:44.718305 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3791c4b7-dcef-470d-a67e-a2c0bb004436-metrics-certs\") pod \"network-metrics-daemon-4jpbd\" (UID: \"3791c4b7-dcef-470d-a67e-a2c0bb004436\") " pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:30:44 crc kubenswrapper[4773]: E0120 18:30:44.718630 4773 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 20 18:30:44 crc kubenswrapper[4773]: E0120 18:30:44.718768 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3791c4b7-dcef-470d-a67e-a2c0bb004436-metrics-certs podName:3791c4b7-dcef-470d-a67e-a2c0bb004436 nodeName:}" failed. No retries permitted until 2026-01-20 18:30:48.718733475 +0000 UTC m=+41.640546539 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3791c4b7-dcef-470d-a67e-a2c0bb004436-metrics-certs") pod "network-metrics-daemon-4jpbd" (UID: "3791c4b7-dcef-470d-a67e-a2c0bb004436") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 20 18:30:44 crc kubenswrapper[4773]: I0120 18:30:44.727548 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:44 crc kubenswrapper[4773]: I0120 18:30:44.727621 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:44 crc kubenswrapper[4773]: I0120 18:30:44.727643 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:44 crc kubenswrapper[4773]: I0120 18:30:44.727676 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:44 crc kubenswrapper[4773]: I0120 18:30:44.727697 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:44Z","lastTransitionTime":"2026-01-20T18:30:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:44 crc kubenswrapper[4773]: I0120 18:30:44.830803 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:44 crc kubenswrapper[4773]: I0120 18:30:44.830870 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:44 crc kubenswrapper[4773]: I0120 18:30:44.830889 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:44 crc kubenswrapper[4773]: I0120 18:30:44.830912 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:44 crc kubenswrapper[4773]: I0120 18:30:44.830938 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:44Z","lastTransitionTime":"2026-01-20T18:30:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:44 crc kubenswrapper[4773]: I0120 18:30:44.934192 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:44 crc kubenswrapper[4773]: I0120 18:30:44.934246 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:44 crc kubenswrapper[4773]: I0120 18:30:44.934258 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:44 crc kubenswrapper[4773]: I0120 18:30:44.934280 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:44 crc kubenswrapper[4773]: I0120 18:30:44.934295 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:44Z","lastTransitionTime":"2026-01-20T18:30:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:45 crc kubenswrapper[4773]: I0120 18:30:45.037488 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:45 crc kubenswrapper[4773]: I0120 18:30:45.037538 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:45 crc kubenswrapper[4773]: I0120 18:30:45.037550 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:45 crc kubenswrapper[4773]: I0120 18:30:45.037569 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:45 crc kubenswrapper[4773]: I0120 18:30:45.037582 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:45Z","lastTransitionTime":"2026-01-20T18:30:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:45 crc kubenswrapper[4773]: I0120 18:30:45.141186 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:45 crc kubenswrapper[4773]: I0120 18:30:45.141258 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:45 crc kubenswrapper[4773]: I0120 18:30:45.141279 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:45 crc kubenswrapper[4773]: I0120 18:30:45.141305 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:45 crc kubenswrapper[4773]: I0120 18:30:45.141326 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:45Z","lastTransitionTime":"2026-01-20T18:30:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:45 crc kubenswrapper[4773]: I0120 18:30:45.244542 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:45 crc kubenswrapper[4773]: I0120 18:30:45.244672 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:45 crc kubenswrapper[4773]: I0120 18:30:45.244786 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:45 crc kubenswrapper[4773]: I0120 18:30:45.244821 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:45 crc kubenswrapper[4773]: I0120 18:30:45.244843 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:45Z","lastTransitionTime":"2026-01-20T18:30:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:45 crc kubenswrapper[4773]: I0120 18:30:45.347459 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:45 crc kubenswrapper[4773]: I0120 18:30:45.347528 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:45 crc kubenswrapper[4773]: I0120 18:30:45.347543 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:45 crc kubenswrapper[4773]: I0120 18:30:45.347561 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:45 crc kubenswrapper[4773]: I0120 18:30:45.347600 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:45Z","lastTransitionTime":"2026-01-20T18:30:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:45 crc kubenswrapper[4773]: I0120 18:30:45.377156 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-10 22:41:43.738132095 +0000 UTC Jan 20 18:30:45 crc kubenswrapper[4773]: I0120 18:30:45.446225 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:30:45 crc kubenswrapper[4773]: E0120 18:30:45.446460 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 18:30:45 crc kubenswrapper[4773]: I0120 18:30:45.450426 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:45 crc kubenswrapper[4773]: I0120 18:30:45.450506 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:45 crc kubenswrapper[4773]: I0120 18:30:45.450531 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:45 crc kubenswrapper[4773]: I0120 18:30:45.450568 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:45 crc kubenswrapper[4773]: I0120 18:30:45.450592 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:45Z","lastTransitionTime":"2026-01-20T18:30:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:45 crc kubenswrapper[4773]: I0120 18:30:45.553802 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:45 crc kubenswrapper[4773]: I0120 18:30:45.553851 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:45 crc kubenswrapper[4773]: I0120 18:30:45.553867 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:45 crc kubenswrapper[4773]: I0120 18:30:45.553889 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:45 crc kubenswrapper[4773]: I0120 18:30:45.553905 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:45Z","lastTransitionTime":"2026-01-20T18:30:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:45 crc kubenswrapper[4773]: I0120 18:30:45.656998 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:45 crc kubenswrapper[4773]: I0120 18:30:45.657052 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:45 crc kubenswrapper[4773]: I0120 18:30:45.657067 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:45 crc kubenswrapper[4773]: I0120 18:30:45.657090 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:45 crc kubenswrapper[4773]: I0120 18:30:45.657106 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:45Z","lastTransitionTime":"2026-01-20T18:30:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:45 crc kubenswrapper[4773]: I0120 18:30:45.760243 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:45 crc kubenswrapper[4773]: I0120 18:30:45.760314 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:45 crc kubenswrapper[4773]: I0120 18:30:45.760328 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:45 crc kubenswrapper[4773]: I0120 18:30:45.760350 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:45 crc kubenswrapper[4773]: I0120 18:30:45.760365 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:45Z","lastTransitionTime":"2026-01-20T18:30:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:45 crc kubenswrapper[4773]: I0120 18:30:45.863510 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:45 crc kubenswrapper[4773]: I0120 18:30:45.863576 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:45 crc kubenswrapper[4773]: I0120 18:30:45.863592 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:45 crc kubenswrapper[4773]: I0120 18:30:45.863615 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:45 crc kubenswrapper[4773]: I0120 18:30:45.863632 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:45Z","lastTransitionTime":"2026-01-20T18:30:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:45 crc kubenswrapper[4773]: I0120 18:30:45.967233 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:45 crc kubenswrapper[4773]: I0120 18:30:45.967333 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:45 crc kubenswrapper[4773]: I0120 18:30:45.967359 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:45 crc kubenswrapper[4773]: I0120 18:30:45.967394 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:45 crc kubenswrapper[4773]: I0120 18:30:45.967432 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:45Z","lastTransitionTime":"2026-01-20T18:30:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:46 crc kubenswrapper[4773]: I0120 18:30:46.070999 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:46 crc kubenswrapper[4773]: I0120 18:30:46.071057 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:46 crc kubenswrapper[4773]: I0120 18:30:46.071069 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:46 crc kubenswrapper[4773]: I0120 18:30:46.071091 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:46 crc kubenswrapper[4773]: I0120 18:30:46.071105 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:46Z","lastTransitionTime":"2026-01-20T18:30:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:46 crc kubenswrapper[4773]: I0120 18:30:46.174432 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:46 crc kubenswrapper[4773]: I0120 18:30:46.174498 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:46 crc kubenswrapper[4773]: I0120 18:30:46.174509 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:46 crc kubenswrapper[4773]: I0120 18:30:46.174530 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:46 crc kubenswrapper[4773]: I0120 18:30:46.174543 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:46Z","lastTransitionTime":"2026-01-20T18:30:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:46 crc kubenswrapper[4773]: I0120 18:30:46.278306 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:46 crc kubenswrapper[4773]: I0120 18:30:46.278388 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:46 crc kubenswrapper[4773]: I0120 18:30:46.278412 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:46 crc kubenswrapper[4773]: I0120 18:30:46.278446 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:46 crc kubenswrapper[4773]: I0120 18:30:46.278473 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:46Z","lastTransitionTime":"2026-01-20T18:30:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:46 crc kubenswrapper[4773]: I0120 18:30:46.377844 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-09 11:48:07.417897546 +0000 UTC Jan 20 18:30:46 crc kubenswrapper[4773]: I0120 18:30:46.381414 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:46 crc kubenswrapper[4773]: I0120 18:30:46.381572 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:46 crc kubenswrapper[4773]: I0120 18:30:46.381591 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:46 crc kubenswrapper[4773]: I0120 18:30:46.381614 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:46 crc kubenswrapper[4773]: I0120 18:30:46.381630 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:46Z","lastTransitionTime":"2026-01-20T18:30:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:46 crc kubenswrapper[4773]: I0120 18:30:46.446635 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:30:46 crc kubenswrapper[4773]: I0120 18:30:46.446672 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:30:46 crc kubenswrapper[4773]: I0120 18:30:46.446770 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:30:46 crc kubenswrapper[4773]: E0120 18:30:46.447005 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jpbd" podUID="3791c4b7-dcef-470d-a67e-a2c0bb004436" Jan 20 18:30:46 crc kubenswrapper[4773]: E0120 18:30:46.447094 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 18:30:46 crc kubenswrapper[4773]: E0120 18:30:46.447184 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 18:30:46 crc kubenswrapper[4773]: I0120 18:30:46.485506 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:46 crc kubenswrapper[4773]: I0120 18:30:46.485578 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:46 crc kubenswrapper[4773]: I0120 18:30:46.485592 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:46 crc kubenswrapper[4773]: I0120 18:30:46.485638 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:46 crc kubenswrapper[4773]: I0120 18:30:46.485652 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:46Z","lastTransitionTime":"2026-01-20T18:30:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:46 crc kubenswrapper[4773]: I0120 18:30:46.588144 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:46 crc kubenswrapper[4773]: I0120 18:30:46.588198 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:46 crc kubenswrapper[4773]: I0120 18:30:46.588233 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:46 crc kubenswrapper[4773]: I0120 18:30:46.588254 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:46 crc kubenswrapper[4773]: I0120 18:30:46.588268 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:46Z","lastTransitionTime":"2026-01-20T18:30:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:46 crc kubenswrapper[4773]: I0120 18:30:46.691203 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:46 crc kubenswrapper[4773]: I0120 18:30:46.691280 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:46 crc kubenswrapper[4773]: I0120 18:30:46.691310 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:46 crc kubenswrapper[4773]: I0120 18:30:46.691339 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:46 crc kubenswrapper[4773]: I0120 18:30:46.691353 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:46Z","lastTransitionTime":"2026-01-20T18:30:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:46 crc kubenswrapper[4773]: I0120 18:30:46.795325 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:46 crc kubenswrapper[4773]: I0120 18:30:46.795374 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:46 crc kubenswrapper[4773]: I0120 18:30:46.795385 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:46 crc kubenswrapper[4773]: I0120 18:30:46.795402 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:46 crc kubenswrapper[4773]: I0120 18:30:46.795414 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:46Z","lastTransitionTime":"2026-01-20T18:30:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:46 crc kubenswrapper[4773]: I0120 18:30:46.898580 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:46 crc kubenswrapper[4773]: I0120 18:30:46.898668 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:46 crc kubenswrapper[4773]: I0120 18:30:46.898687 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:46 crc kubenswrapper[4773]: I0120 18:30:46.898714 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:46 crc kubenswrapper[4773]: I0120 18:30:46.898736 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:46Z","lastTransitionTime":"2026-01-20T18:30:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.001692 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.001762 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.001781 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.001800 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.001813 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:47Z","lastTransitionTime":"2026-01-20T18:30:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.104758 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.104812 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.104827 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.104845 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.104859 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:47Z","lastTransitionTime":"2026-01-20T18:30:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.208740 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.208802 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.208813 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.208833 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.208845 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:47Z","lastTransitionTime":"2026-01-20T18:30:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.311563 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.311658 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.311684 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.311722 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.311754 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:47Z","lastTransitionTime":"2026-01-20T18:30:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.378068 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 14:50:42.576655323 +0000 UTC Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.414230 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.414275 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.414283 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.414299 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.414309 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:47Z","lastTransitionTime":"2026-01-20T18:30:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.447040 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:30:47 crc kubenswrapper[4773]: E0120 18:30:47.447207 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.474442 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kjbfj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ddd5104-3112-413e-b908-2b7f336b41f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a5c1f8407893fc5542044f2a612cb402c4529818efa2a76e03ecc09dafd07d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"
,\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}
,{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b981d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b981d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\
":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53e0dfb1a04051b30484f98897d86ff5adb9ef680437d01e43cd37b4a2404bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53e0dfb1a04051b30484f98897d86ff5adb9ef680437d01e43cd37b4a2404bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42063e8c09492c5670e90e5ced6405536f594cb5ae490cdacb00320e735d157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42063e8c09492c5670e90e5ced6405536f594cb5ae490cdacb00320e735d157\\\"
,\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kjbfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:47Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.497884 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bccxn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"061a607e-1868-4fcf-b3ea-d51157511d41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5980e4a9cdd846b335d853a6c7c4fa7c9862c37c4858418919eae03855d094f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwtw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bccxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:47Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.517154 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gbn6k" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7821f5e-4734-489f-bcf9-910b875a4848\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f76276c73de548ee85dd07ca424e1f6df2c7da35e7e57880473fc1175983fb37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcp99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f830c4eb2
fd242a6e650833aeafd5935d16e005b2faa950f6507f656293ca458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcp99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gbn6k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:47Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.517971 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.518052 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.518076 4773 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.518105 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.518129 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:47Z","lastTransitionTime":"2026-01-20T18:30:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.554419 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee8e82f1-ac2b-4e32-ace8-c3e6edfb55b4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17efc7a160f4cb934a3927c68e65b9f1387ed13881b8c9ca676d30c0d241ace0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b89a8e0fd81a1ccb6675b46b9a7425fec5158a53cc7e4314068fa737d0becda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f63cd12ee6d399e1a6f77a55f05a532a2de3b4cd05a5e10a93f521364473a403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1
847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d78e43bc7b09a1599f98e9e72fd0f49db94102e952a391f687d50abed277ceda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de08872865ebda9c9a4e1642d9c4d4f6c709908332ffbd091f992fad7722eca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"
mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://68aab
352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:47Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.570881 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:47Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.588076 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://443153c40f503abca43f8d62276e852dfc67a4ad50beb817bcc4e7c9f1893d4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-20T18:30:47Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.605792 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de86ad0ce5469489a5f3eac4940913be3a897c3ec5f48526c7396b21bca92fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:47Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.620336 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965d3f61a7d29b4a851d13ea906f7d4d1809f89d314cd0e9b08ba897108625f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\
\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48a81307df70a49c4cf901ede80a07616e64babebf760b7fb3f51fc71743812a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:47Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.621491 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.621525 4773 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.621538 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.621556 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.621567 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:47Z","lastTransitionTime":"2026-01-20T18:30:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.631709 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ddd934f-f012-4083-b5e6-b99711071621\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63daf0eeb8fcb1c4b47e205c8ea4ca5071153c573cce6d1f4638e0d44a96d47e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e8755cb7894ae2c1832f3b0b8d8a6e33d3d336d
00ce68a2afe004d82304dd24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sq4x7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:47Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.653138 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f354424d-7f22-42d6-8bd9-00e32e78c3d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a000cfa332372a402d4d81eafc6f06b988196a3a847236a94d8166f0d745b07e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d36ecbc289d1bc2ebcba42b6cfe938692dd9b7e2b4e1adac18ac86a273d6bf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeac2ae7ead2dd1144c00709ea7162dfe1e360dc304726b031970bc989f11467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c6fa189877752f110d0f9539d08a9693872f9ee0350c63384acae1dea6afbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7767620b251e4c63c9397573c7915bd85831b833817ac6b939fc47d6304dc2bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc96ff67e107830b7e2fe8c3c7460ca9c7b22dfadab07ea69968162e9a1b4455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://04c27d9920ca807f9dedb59eb78d9e2a7bd2ec92d8a5a61180ee528fdee3ef07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04c27d9920ca807f9dedb59eb78d9e2a7bd2ec92d8a5a61180ee528fdee3ef07\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-20T18:30:39Z\\\",\\\"message\\\":\\\"ved *v1.Node event handler 7\\\\nI0120 18:30:38.824577 6252 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0120 18:30:38.824586 6252 handler.go:208] Removed *v1.Node event handler 2\\\\nI0120 18:30:38.824768 6252 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0120 
18:30:38.824989 6252 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0120 18:30:38.825056 6252 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0120 18:30:38.825067 6252 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0120 18:30:38.825072 6252 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0120 18:30:38.825090 6252 factory.go:656] Stopping watch factory\\\\nI0120 18:30:38.825107 6252 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0120 18:30:38.825115 6252 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0120 18:30:38.825243 6252 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0120 18:30:38.825272 6252 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qt89w_openshift-ovn-kubernetes(f354424d-7f22-42d6-8bd9-00e32e78c3d3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68d98d68d006df3d69e253299966bef755778e0cdab71a9ff91d2829ae761672\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c68391eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c683
91eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qt89w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:47Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.666628 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4jpbd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3791c4b7-dcef-470d-a67e-a2c0bb004436\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66lgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66lgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4jpbd\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:47Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.679307 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfcadb32-181d-4e84-8078-53323164fc49\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72535f55bb65c332cd4acb4ad0bf8b94a28a2167da3bba993de6ffb1d6dca1cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5be06527f116faac5a7ce7b2df804239b382a394e11c56eee69fa24739faab3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff759d4098d334dfada2f6a14e8a5d363019e0da19879b5dca11ff48b6273b24\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29039ffb258d5056b02a3ef4f48c732b8573d9437d032a225b59f0ba95527796\\\",
\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:47Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.697773 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:47Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.712548 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9b329af-a6e2-4ba8-b70d-f1ad0cd67671\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13ef504628d3252d01c26801cf91a4fabdfbd5d8a78e60e9666beafa709816a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af183a195b31fc8b191dc75db28863447424fb2763e1f51a5f37ce78c4b046e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05f5db3e67f63a4ed226186eec7eebf95687a59c3d742a7d29b2be3f99543a0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a1628ccb969f4cdde3babedb802c79d7bba385260dcefc8140ed674fb423da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab6fab567cc35a379572c4ea5fb0f9472f3634fcede213202390e12c5cf6f2f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-20T18:30:26Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0120 18:30:20.840178 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0120 18:30:20.842359 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1141381009/tls.crt::/tmp/serving-cert-1141381009/tls.key\\\\\\\"\\\\nI0120 18:30:26.279650 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0120 18:30:26.285211 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0120 18:30:26.285271 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0120 18:30:26.285327 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0120 18:30:26.285335 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0120 18:30:26.291960 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0120 18:30:26.291983 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291990 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291998 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0120 18:30:26.292002 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0120 18:30:26.292006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0120 18:30:26.292010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0120 18:30:26.292142 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0120 18:30:26.296450 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06ac78bba5a5a697d2246518bb04b7503b37c2fafd2369e5826dd02b467715d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727
ee9876662924585010f5fa93ee1d554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:47Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.724532 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.724725 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.724795 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.724870 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.724981 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:47Z","lastTransitionTime":"2026-01-20T18:30:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.729579 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:47Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.744195 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gczfj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"357ca347-8fa9-4f0b-9f49-a540f14e0198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48c821d7f174268680f0e1197457126c88076eded93bbcddb5a0d60b1fa9fc80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4b9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gczfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:47Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.754898 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5sv79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a565d2f-43a1-41f5-b7a6-85d7d0aea0a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4eb26301ab8154925399887204867f95a36003c308ac49adb8a2867ae29ac39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dls8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5sv79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:47Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.827666 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.827740 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.827760 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.827789 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.827809 4773 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:47Z","lastTransitionTime":"2026-01-20T18:30:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.931246 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.931314 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.931333 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.931360 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:47 crc kubenswrapper[4773]: I0120 18:30:47.931379 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:47Z","lastTransitionTime":"2026-01-20T18:30:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:48 crc kubenswrapper[4773]: I0120 18:30:48.034477 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:48 crc kubenswrapper[4773]: I0120 18:30:48.034522 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:48 crc kubenswrapper[4773]: I0120 18:30:48.034531 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:48 crc kubenswrapper[4773]: I0120 18:30:48.034546 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:48 crc kubenswrapper[4773]: I0120 18:30:48.034556 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:48Z","lastTransitionTime":"2026-01-20T18:30:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:48 crc kubenswrapper[4773]: I0120 18:30:48.137055 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:48 crc kubenswrapper[4773]: I0120 18:30:48.137101 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:48 crc kubenswrapper[4773]: I0120 18:30:48.137113 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:48 crc kubenswrapper[4773]: I0120 18:30:48.137129 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:48 crc kubenswrapper[4773]: I0120 18:30:48.137145 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:48Z","lastTransitionTime":"2026-01-20T18:30:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:48 crc kubenswrapper[4773]: I0120 18:30:48.240615 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:48 crc kubenswrapper[4773]: I0120 18:30:48.240669 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:48 crc kubenswrapper[4773]: I0120 18:30:48.240686 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:48 crc kubenswrapper[4773]: I0120 18:30:48.240708 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:48 crc kubenswrapper[4773]: I0120 18:30:48.240728 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:48Z","lastTransitionTime":"2026-01-20T18:30:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:48 crc kubenswrapper[4773]: I0120 18:30:48.344719 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:48 crc kubenswrapper[4773]: I0120 18:30:48.344754 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:48 crc kubenswrapper[4773]: I0120 18:30:48.344765 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:48 crc kubenswrapper[4773]: I0120 18:30:48.344782 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:48 crc kubenswrapper[4773]: I0120 18:30:48.344793 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:48Z","lastTransitionTime":"2026-01-20T18:30:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:48 crc kubenswrapper[4773]: I0120 18:30:48.378238 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-16 21:21:57.672306348 +0000 UTC Jan 20 18:30:48 crc kubenswrapper[4773]: I0120 18:30:48.446037 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:30:48 crc kubenswrapper[4773]: I0120 18:30:48.446076 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:30:48 crc kubenswrapper[4773]: I0120 18:30:48.446091 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:30:48 crc kubenswrapper[4773]: E0120 18:30:48.446288 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 18:30:48 crc kubenswrapper[4773]: E0120 18:30:48.446434 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jpbd" podUID="3791c4b7-dcef-470d-a67e-a2c0bb004436" Jan 20 18:30:48 crc kubenswrapper[4773]: E0120 18:30:48.446567 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 18:30:48 crc kubenswrapper[4773]: I0120 18:30:48.448274 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:48 crc kubenswrapper[4773]: I0120 18:30:48.448302 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:48 crc kubenswrapper[4773]: I0120 18:30:48.448313 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:48 crc kubenswrapper[4773]: I0120 18:30:48.448327 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:48 crc kubenswrapper[4773]: I0120 18:30:48.448340 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:48Z","lastTransitionTime":"2026-01-20T18:30:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:48 crc kubenswrapper[4773]: I0120 18:30:48.550916 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:48 crc kubenswrapper[4773]: I0120 18:30:48.551025 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:48 crc kubenswrapper[4773]: I0120 18:30:48.551051 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:48 crc kubenswrapper[4773]: I0120 18:30:48.551081 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:48 crc kubenswrapper[4773]: I0120 18:30:48.551104 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:48Z","lastTransitionTime":"2026-01-20T18:30:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:48 crc kubenswrapper[4773]: I0120 18:30:48.654524 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:48 crc kubenswrapper[4773]: I0120 18:30:48.654607 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:48 crc kubenswrapper[4773]: I0120 18:30:48.654637 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:48 crc kubenswrapper[4773]: I0120 18:30:48.654667 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:48 crc kubenswrapper[4773]: I0120 18:30:48.654687 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:48Z","lastTransitionTime":"2026-01-20T18:30:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:48 crc kubenswrapper[4773]: I0120 18:30:48.758618 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:48 crc kubenswrapper[4773]: I0120 18:30:48.758675 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:48 crc kubenswrapper[4773]: I0120 18:30:48.758693 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:48 crc kubenswrapper[4773]: I0120 18:30:48.758716 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:48 crc kubenswrapper[4773]: I0120 18:30:48.758732 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:48Z","lastTransitionTime":"2026-01-20T18:30:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:48 crc kubenswrapper[4773]: I0120 18:30:48.767032 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3791c4b7-dcef-470d-a67e-a2c0bb004436-metrics-certs\") pod \"network-metrics-daemon-4jpbd\" (UID: \"3791c4b7-dcef-470d-a67e-a2c0bb004436\") " pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:30:48 crc kubenswrapper[4773]: E0120 18:30:48.767175 4773 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 20 18:30:48 crc kubenswrapper[4773]: E0120 18:30:48.767261 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3791c4b7-dcef-470d-a67e-a2c0bb004436-metrics-certs podName:3791c4b7-dcef-470d-a67e-a2c0bb004436 nodeName:}" failed. No retries permitted until 2026-01-20 18:30:56.767236177 +0000 UTC m=+49.689049241 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3791c4b7-dcef-470d-a67e-a2c0bb004436-metrics-certs") pod "network-metrics-daemon-4jpbd" (UID: "3791c4b7-dcef-470d-a67e-a2c0bb004436") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 20 18:30:48 crc kubenswrapper[4773]: I0120 18:30:48.861794 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:48 crc kubenswrapper[4773]: I0120 18:30:48.861857 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:48 crc kubenswrapper[4773]: I0120 18:30:48.861874 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:48 crc kubenswrapper[4773]: I0120 18:30:48.861901 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:48 crc kubenswrapper[4773]: I0120 18:30:48.861918 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:48Z","lastTransitionTime":"2026-01-20T18:30:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:48 crc kubenswrapper[4773]: I0120 18:30:48.964634 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:48 crc kubenswrapper[4773]: I0120 18:30:48.964685 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:48 crc kubenswrapper[4773]: I0120 18:30:48.964697 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:48 crc kubenswrapper[4773]: I0120 18:30:48.964714 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:48 crc kubenswrapper[4773]: I0120 18:30:48.964727 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:48Z","lastTransitionTime":"2026-01-20T18:30:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:49 crc kubenswrapper[4773]: I0120 18:30:49.068279 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:49 crc kubenswrapper[4773]: I0120 18:30:49.068348 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:49 crc kubenswrapper[4773]: I0120 18:30:49.068365 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:49 crc kubenswrapper[4773]: I0120 18:30:49.068392 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:49 crc kubenswrapper[4773]: I0120 18:30:49.068412 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:49Z","lastTransitionTime":"2026-01-20T18:30:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:49 crc kubenswrapper[4773]: I0120 18:30:49.171064 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:49 crc kubenswrapper[4773]: I0120 18:30:49.171125 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:49 crc kubenswrapper[4773]: I0120 18:30:49.171143 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:49 crc kubenswrapper[4773]: I0120 18:30:49.171168 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:49 crc kubenswrapper[4773]: I0120 18:30:49.171184 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:49Z","lastTransitionTime":"2026-01-20T18:30:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:49 crc kubenswrapper[4773]: I0120 18:30:49.274155 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:49 crc kubenswrapper[4773]: I0120 18:30:49.274201 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:49 crc kubenswrapper[4773]: I0120 18:30:49.274213 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:49 crc kubenswrapper[4773]: I0120 18:30:49.274229 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:49 crc kubenswrapper[4773]: I0120 18:30:49.274243 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:49Z","lastTransitionTime":"2026-01-20T18:30:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:49 crc kubenswrapper[4773]: I0120 18:30:49.377381 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:49 crc kubenswrapper[4773]: I0120 18:30:49.377426 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:49 crc kubenswrapper[4773]: I0120 18:30:49.377435 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:49 crc kubenswrapper[4773]: I0120 18:30:49.377453 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:49 crc kubenswrapper[4773]: I0120 18:30:49.377468 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:49Z","lastTransitionTime":"2026-01-20T18:30:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:49 crc kubenswrapper[4773]: I0120 18:30:49.378725 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-28 04:46:06.925714397 +0000 UTC Jan 20 18:30:49 crc kubenswrapper[4773]: I0120 18:30:49.446338 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:30:49 crc kubenswrapper[4773]: E0120 18:30:49.446526 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 18:30:49 crc kubenswrapper[4773]: I0120 18:30:49.480600 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:49 crc kubenswrapper[4773]: I0120 18:30:49.480651 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:49 crc kubenswrapper[4773]: I0120 18:30:49.480726 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:49 crc kubenswrapper[4773]: I0120 18:30:49.480751 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:49 crc kubenswrapper[4773]: I0120 18:30:49.480772 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:49Z","lastTransitionTime":"2026-01-20T18:30:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:49 crc kubenswrapper[4773]: I0120 18:30:49.584247 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:49 crc kubenswrapper[4773]: I0120 18:30:49.584311 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:49 crc kubenswrapper[4773]: I0120 18:30:49.584328 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:49 crc kubenswrapper[4773]: I0120 18:30:49.584353 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:49 crc kubenswrapper[4773]: I0120 18:30:49.584372 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:49Z","lastTransitionTime":"2026-01-20T18:30:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:49 crc kubenswrapper[4773]: I0120 18:30:49.688221 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:49 crc kubenswrapper[4773]: I0120 18:30:49.688328 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:49 crc kubenswrapper[4773]: I0120 18:30:49.688349 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:49 crc kubenswrapper[4773]: I0120 18:30:49.688372 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:49 crc kubenswrapper[4773]: I0120 18:30:49.688390 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:49Z","lastTransitionTime":"2026-01-20T18:30:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:49 crc kubenswrapper[4773]: I0120 18:30:49.791441 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:49 crc kubenswrapper[4773]: I0120 18:30:49.791505 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:49 crc kubenswrapper[4773]: I0120 18:30:49.791528 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:49 crc kubenswrapper[4773]: I0120 18:30:49.791566 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:49 crc kubenswrapper[4773]: I0120 18:30:49.791590 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:49Z","lastTransitionTime":"2026-01-20T18:30:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:49 crc kubenswrapper[4773]: I0120 18:30:49.896109 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:49 crc kubenswrapper[4773]: I0120 18:30:49.896182 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:49 crc kubenswrapper[4773]: I0120 18:30:49.896205 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:49 crc kubenswrapper[4773]: I0120 18:30:49.896233 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:49 crc kubenswrapper[4773]: I0120 18:30:49.896254 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:49Z","lastTransitionTime":"2026-01-20T18:30:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:49 crc kubenswrapper[4773]: I0120 18:30:49.998908 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:49 crc kubenswrapper[4773]: I0120 18:30:49.998985 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:49 crc kubenswrapper[4773]: I0120 18:30:49.999003 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:49 crc kubenswrapper[4773]: I0120 18:30:49.999024 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:49 crc kubenswrapper[4773]: I0120 18:30:49.999041 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:49Z","lastTransitionTime":"2026-01-20T18:30:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.102641 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.102710 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.102732 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.102761 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.102782 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:50Z","lastTransitionTime":"2026-01-20T18:30:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.206376 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.206441 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.206461 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.206485 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.206501 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:50Z","lastTransitionTime":"2026-01-20T18:30:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.309272 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.309364 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.309381 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.309410 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.309428 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:50Z","lastTransitionTime":"2026-01-20T18:30:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.341606 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.342868 4773 scope.go:117] "RemoveContainer" containerID="04c27d9920ca807f9dedb59eb78d9e2a7bd2ec92d8a5a61180ee528fdee3ef07" Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.379158 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-14 09:09:54.658737128 +0000 UTC Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.414319 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.414381 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.414400 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.414424 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.414442 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:50Z","lastTransitionTime":"2026-01-20T18:30:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.446715 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.446753 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.446799 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:30:50 crc kubenswrapper[4773]: E0120 18:30:50.446997 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jpbd" podUID="3791c4b7-dcef-470d-a67e-a2c0bb004436" Jan 20 18:30:50 crc kubenswrapper[4773]: E0120 18:30:50.447185 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 18:30:50 crc kubenswrapper[4773]: E0120 18:30:50.447341 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.518141 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.518203 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.518215 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.518236 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.518248 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:50Z","lastTransitionTime":"2026-01-20T18:30:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.622127 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.622203 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.622222 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.622248 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.622265 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:50Z","lastTransitionTime":"2026-01-20T18:30:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.726022 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.726062 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.726078 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.726100 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.726116 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:50Z","lastTransitionTime":"2026-01-20T18:30:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.787911 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qt89w_f354424d-7f22-42d6-8bd9-00e32e78c3d3/ovnkube-controller/1.log" Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.790632 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" event={"ID":"f354424d-7f22-42d6-8bd9-00e32e78c3d3","Type":"ContainerStarted","Data":"9358a6bba6c6f6bb98b1a268d22d96575da32a1a73003a6097a7bf1a9db92c0d"} Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.791533 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.806605 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfcadb32-181d-4e84-8078-53323164fc49\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72535f55bb65c332cd4acb4ad0bf8b94a28a2167da3bba993de6ffb1d6dca1cf\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5be06527f116faac5a7ce7b2df804239b382a394e11c56eee69fa24739faab3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff759d4098d334dfada2f6a14e8a5d363019e0da19879b5dca11ff48b6273b24\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert
-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29039ffb258d5056b02a3ef4f48c732b8573d9437d032a225b59f0ba95527796\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:50Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.819619 4773 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:50Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.828778 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.828807 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.828817 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.828829 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.828838 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:50Z","lastTransitionTime":"2026-01-20T18:30:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.840123 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9b329af-a6e2-4ba8-b70d-f1ad0cd67671\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13ef504628d3252d01c26801cf91a4fabdfbd5d8a78e60e9666beafa709816a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\
":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af183a195b31fc8b191dc75db28863447424fb2763e1f51a5f37ce78c4b046e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05f5db3e67f63a4ed226186eec7eebf95687a59c3d742a7d29b2be3f99543a0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a1628ccb969f4cdde3babedb802c79d7bba385260dcefc8140ed674fb423da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/cr
cont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab6fab567cc35a379572c4ea5fb0f9472f3634fcede213202390e12c5cf6f2f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0120 18:30:20.840178 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0120 18:30:20.842359 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1141381009/tls.crt::/tmp/serving-cert-1141381009/tls.key\\\\\\\"\\\\nI0120 18:30:26.279650 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0120 18:30:26.285211 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0120 18:30:26.285271 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0120 18:30:26.285327 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0120 18:30:26.285335 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0120 18:30:26.291960 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0120 18:30:26.291983 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291990 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291998 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0120 18:30:26.292002 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0120 18:30:26.292006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0120 18:30:26.292010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0120 18:30:26.292142 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0120 18:30:26.296450 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06ac78bba5a5a697d2246518bb04b7503b37c2fafd2369e5826dd02b467715d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\
",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:50Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.854188 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:50Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.866700 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gczfj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"357ca347-8fa9-4f0b-9f49-a540f14e0198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48c821d7f174268680f0e1197457126c88076eded93bbcddb5a0d60b1fa9fc80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4b9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gczfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:50Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.878968 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5sv79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a565d2f-43a1-41f5-b7a6-85d7d0aea0a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4eb26301ab8154925399887204867f95a36003c308ac49adb8a2867ae29ac39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dls8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5sv79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:50Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.907976 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee8e82f1-ac2b-4e32-ace8-c3e6edfb55b4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17efc7a160f4cb934a3927c68e65b9f1387ed13881b8c9ca676d30c0d241ace0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b89a8e0fd81a1ccb6675b46b9a7425fec5158a53cc7e4314068fa737d0becda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f63cd12ee6d399e1a6f77a55f05a532a2de3b4cd05a5e10a93f521364473a403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d78e43bc7b09a1599f98e9e72fd0f49db94102e952a391f687d50abed277ceda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de08872865ebda9c9a4e1642d9c4d4f6c709908332ffbd091f992fad7722eca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:50Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.928232 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:50Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.930952 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.930986 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.930997 4773 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.931011 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.931020 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:50Z","lastTransitionTime":"2026-01-20T18:30:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.947767 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://443153c40f503abca43f8d62276e852dfc67a4ad50beb817bcc4e7c9f1893d4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f
799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:50Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.977164 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kjbfj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ddd5104-3112-413e-b908-2b7f336b41f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a5c1f8407893fc5542044f2a612cb402c4529818efa2a76e03ecc09dafd07d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b981
d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b981d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53e0dfb1a04051b30484f98897d86ff5adb9ef680437d01e43cd37b4a2404bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53e0dfb1a04051b30484f98897d86ff5adb9ef680437d01e43cd37b4a2404bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42063e8c09492c5670e90e5ced6405536f594cb5ae490cdacb00320e735d157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42063e8c09492c5670e90e5ced6405536f594cb5ae490cdacb00320e735d157\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kjbfj\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:50Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:50 crc kubenswrapper[4773]: I0120 18:30:50.989781 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bccxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"061a607e-1868-4fcf-b3ea-d51157511d41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5980e4a9cdd846b335d853a6c7c4fa7c9862c37c4858418919eae03855d094f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-
01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwtw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bccxn\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:50Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.001881 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gbn6k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7821f5e-4734-489f-bcf9-910b875a4848\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f76276c73de548ee85dd07ca424e1f6df2c7da35e7e57880473fc1175983fb37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcp99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f830c4eb2fd242a6e650833aeafd5935d16e005b2faa950f6507f656293ca458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcp99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gbn6k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-01-20T18:30:51Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.014118 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de86ad0ce5469489a5f3eac4940913be3a897c3ec5f48526c7396b21bca92fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for 
pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:51Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.028092 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965d3f61a7d29b4a851d13ea906f7d4d1809f89d314cd0e9b08ba897108625f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkub
e-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48a81307df70a49c4cf901ede80a07616e64babebf760b7fb3f51fc71743812a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:51Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.033075 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.033111 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.033120 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.033135 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.033145 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:51Z","lastTransitionTime":"2026-01-20T18:30:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.038262 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ddd934f-f012-4083-b5e6-b99711071621\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63daf0eeb8fcb1c4b47e205c8ea4ca5071153c573cce6d1f4638e0d44a96d47e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e8755cb7894ae2c1832f3b0b8d8a6e33d3d336d
00ce68a2afe004d82304dd24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sq4x7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:51Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.057097 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f354424d-7f22-42d6-8bd9-00e32e78c3d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a000cfa332372a402d4d81eafc6f06b988196a3a847236a94d8166f0d745b07e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d36ecbc289d1bc2ebcba42b6cfe938692dd9b7e2b4e1adac18ac86a273d6bf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeac2ae7ead2dd1144c00709ea7162dfe1e360dc304726b031970bc989f11467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c6fa189877752f110d0f9539d08a9693872f9ee0350c63384acae1dea6afbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7767620b251e4c63c9397573c7915bd85831b833817ac6b939fc47d6304dc2bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc96ff67e107830b7e2fe8c3c7460ca9c7b22dfadab07ea69968162e9a1b4455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9358a6bba6c6f6bb98b1a268d22d96575da32a1a73003a6097a7bf1a9db92c0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04c27d9920ca807f9dedb59eb78d9e2a7bd2ec92d8a5a61180ee528fdee3ef07\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-20T18:30:39Z\\\",\\\"message\\\":\\\"ved *v1.Node event handler 7\\\\nI0120 18:30:38.824577 6252 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0120 18:30:38.824586 6252 handler.go:208] Removed *v1.Node event handler 2\\\\nI0120 18:30:38.824768 6252 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0120 
18:30:38.824989 6252 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0120 18:30:38.825056 6252 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0120 18:30:38.825067 6252 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0120 18:30:38.825072 6252 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0120 18:30:38.825090 6252 factory.go:656] Stopping watch factory\\\\nI0120 18:30:38.825107 6252 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0120 18:30:38.825115 6252 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0120 18:30:38.825243 6252 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0120 18:30:38.825272 6252 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from 
k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnku
be-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68d98d68d006df3d69e253299966bef755778e0cdab71a9ff91d2829ae761672\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c68391eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
ecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c68391eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qt89w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:51Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.068336 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4jpbd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3791c4b7-dcef-470d-a67e-a2c0bb004436\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66lgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66lgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4jpbd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:51Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:51 crc 
kubenswrapper[4773]: I0120 18:30:51.135549 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.135613 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.135629 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.135656 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.135672 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:51Z","lastTransitionTime":"2026-01-20T18:30:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.238331 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.238365 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.238375 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.238392 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.238403 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:51Z","lastTransitionTime":"2026-01-20T18:30:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.341177 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.341223 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.341233 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.341250 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.341260 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:51Z","lastTransitionTime":"2026-01-20T18:30:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.379729 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-26 03:53:48.388164588 +0000 UTC Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.443897 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.443985 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.443998 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.444014 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.444024 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:51Z","lastTransitionTime":"2026-01-20T18:30:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.446432 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:30:51 crc kubenswrapper[4773]: E0120 18:30:51.446544 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.546873 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.546925 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.546970 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.546996 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.547027 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:51Z","lastTransitionTime":"2026-01-20T18:30:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.649281 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.649342 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.649355 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.649373 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.649390 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:51Z","lastTransitionTime":"2026-01-20T18:30:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.752680 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.752731 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.752746 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.752766 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.752782 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:51Z","lastTransitionTime":"2026-01-20T18:30:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.797113 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qt89w_f354424d-7f22-42d6-8bd9-00e32e78c3d3/ovnkube-controller/2.log" Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.798139 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qt89w_f354424d-7f22-42d6-8bd9-00e32e78c3d3/ovnkube-controller/1.log" Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.802130 4773 generic.go:334] "Generic (PLEG): container finished" podID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" containerID="9358a6bba6c6f6bb98b1a268d22d96575da32a1a73003a6097a7bf1a9db92c0d" exitCode=1 Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.802178 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" event={"ID":"f354424d-7f22-42d6-8bd9-00e32e78c3d3","Type":"ContainerDied","Data":"9358a6bba6c6f6bb98b1a268d22d96575da32a1a73003a6097a7bf1a9db92c0d"} Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.802217 4773 scope.go:117] "RemoveContainer" containerID="04c27d9920ca807f9dedb59eb78d9e2a7bd2ec92d8a5a61180ee528fdee3ef07" Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.803258 4773 scope.go:117] "RemoveContainer" containerID="9358a6bba6c6f6bb98b1a268d22d96575da32a1a73003a6097a7bf1a9db92c0d" Jan 20 18:30:51 crc kubenswrapper[4773]: E0120 18:30:51.803556 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-qt89w_openshift-ovn-kubernetes(f354424d-7f22-42d6-8bd9-00e32e78c3d3)\"" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" podUID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.818600 4773 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-dns/node-resolver-gczfj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"357ca347-8fa9-4f0b-9f49-a540f14e0198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48c821d7f174268680f0e1197457126c88076eded93bbcddb5a0d60b1fa9fc80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4b9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.
11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gczfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:51Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.838313 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5sv79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a565d2f-43a1-41f5-b7a6-85d7d0aea0a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4eb26301ab8154925399887204867f95a36003c308ac49adb8a2867ae29ac39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dls8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5sv79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:51Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.855494 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.855707 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.855824 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.855918 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:51 crc 
kubenswrapper[4773]: I0120 18:30:51.856025 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:51Z","lastTransitionTime":"2026-01-20T18:30:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.864179 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9b329af-a6e2-4ba8-b70d-f1ad0cd67671\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13ef504628d3252d01c26801cf91a4fabdfbd5d8a78e60e9666beafa709816a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
e-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af183a195b31fc8b191dc75db28863447424fb2763e1f51a5f37ce78c4b046e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05f5db3e67f63a4ed226186eec7eebf95687a59c3d742a7d29b2be3f99543a0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/
kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a1628ccb969f4cdde3babedb802c79d7bba385260dcefc8140ed674fb423da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab6fab567cc35a379572c4ea5fb0f9472f3634fcede213202390e12c5cf6f2f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0120 18:30:20.840178 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0120 18:30:20.842359 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1141381009/tls.crt::/tmp/serving-cert-1141381009/tls.key\\\\\\\"\\\\nI0120 18:30:26.279650 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0120 18:30:26.285211 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0120 18:30:26.285271 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0120 18:30:26.285327 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0120 18:30:26.285335 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0120 18:30:26.291960 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0120 18:30:26.291983 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291990 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291998 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0120 18:30:26.292002 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0120 18:30:26.292006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0120 18:30:26.292010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0120 18:30:26.292142 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0120 18:30:26.296450 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06ac78bba5a5a697d2246518bb04b7503b37c2fafd2369e5826dd02b467715d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}}}]
,\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:51Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.886161 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:51Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.914738 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee8e82f1-ac2b-4e32-ace8-c3e6edfb55b4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17efc7a160f4cb934a3927c68e65b9f1387ed13881b8c9ca676d30c0d241ace0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b89a8e0fd81a1ccb6675b46b9a7425fec5158a53cc7e4314068fa737d0becda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f63cd12ee6d399e1a6f77a55f05a532a2de3b4cd05a5e10a93f521364473a403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d78e43bc7b09a1599f98e9e72fd0f49db94102e952a391f687d50abed277ceda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de08872865ebda9c9a4e1642d9c4d4f6c709908332ffbd091f992fad7722eca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:51Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.932431 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:51Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.953063 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://443153c40f503abca43f8d62276e852dfc67a4ad50beb817bcc4e7c9f1893d4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-20T18:30:51Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.958830 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.958884 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.958896 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.958917 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.958948 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:51Z","lastTransitionTime":"2026-01-20T18:30:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:51 crc kubenswrapper[4773]: I0120 18:30:51.978692 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kjbfj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ddd5104-3112-413e-b908-2b7f336b41f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a5c1f8407893fc5542044f2a612cb402c4529818efa2a76e03ecc09dafd07d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b981d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b981d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53e0dfb1a04051b30484f98897d86ff5adb9ef680437d01e43cd37b4a2404bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53e0dfb1a04051b30484f98897d86ff5adb9ef680437d01e43cd37b4a2404bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42063e8c09492c5670e90e5ced6405536f594cb5ae490cdacb00320e735d157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42063e8c09492c5670e90e5ced6405536f594cb5ae490cdacb00320e735d157\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kjbfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:51Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.004221 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bccxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"061a607e-1868-4fcf-b3ea-d51157511d41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5980e4a9cdd846b335d853a6c7c4fa7c9862c37c4858418919eae03855d094f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7
eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwtw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\"
:\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bccxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:52Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.023532 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gbn6k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7821f5e-4734-489f-bcf9-910b875a4848\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f76276c73de548ee85dd07ca424e1f6df2c7da35e7e57880473fc1175983fb37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-de
v/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcp99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f830c4eb2fd242a6e650833aeafd5935d16e005b2faa950f6507f656293ca458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcp99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gbn6k\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:52Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.041364 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ddd934f-f012-4083-b5e6-b99711071621\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63daf0eeb8fcb1c4b47e205c8ea4ca5071153c573cce6d1f4638e0d44a96d47e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\
\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e8755cb7894ae2c1832f3b0b8d8a6e33d3d336d00ce68a2afe004d82304dd24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sq4x7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:52Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:52 crc 
kubenswrapper[4773]: I0120 18:30:52.062135 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.062198 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.062215 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.062239 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.062255 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:52Z","lastTransitionTime":"2026-01-20T18:30:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.068137 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f354424d-7f22-42d6-8bd9-00e32e78c3d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a000cfa332372a402d4d81eafc6f06b988196a3a847236a94d8166f0d745b07e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d36ecbc289d1bc2ebcba42b6cfe938692dd9b7e2b4e1adac18ac86a273d6bf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeac2ae7ead2dd1144c00709ea7162dfe1e360dc304726b031970bc989f11467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c6fa189877752f110d0f9539d08a9693872f9ee0350c63384acae1dea6afbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7767620b251e4c63c9397573c7915bd85831b833817ac6b939fc47d6304dc2bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc96ff67e107830b7e2fe8c3c7460ca9c7b22dfadab07ea69968162e9a1b4455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9358a6bba6c6f6bb98b1a268d22d96575da32a1a73003a6097a7bf1a9db92c0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://04c27d9920ca807f9dedb59eb78d9e2a7bd2ec92d8a5a61180ee528fdee3ef07\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-20T18:30:39Z\\\",\\\"message\\\":\\\"ved *v1.Node event handler 7\\\\nI0120 18:30:38.824577 6252 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0120 18:30:38.824586 6252 handler.go:208] Removed *v1.Node event handler 2\\\\nI0120 18:30:38.824768 6252 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0120 
18:30:38.824989 6252 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0120 18:30:38.825056 6252 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0120 18:30:38.825067 6252 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0120 18:30:38.825072 6252 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0120 18:30:38.825090 6252 factory.go:656] Stopping watch factory\\\\nI0120 18:30:38.825107 6252 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0120 18:30:38.825115 6252 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0120 18:30:38.825243 6252 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0120 18:30:38.825272 6252 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/f\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9358a6bba6c6f6bb98b1a268d22d96575da32a1a73003a6097a7bf1a9db92c0d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-20T18:30:51Z\\\",\\\"message\\\":\\\"mns:[] Mutations:[{Column:policies Mutator:insert Value:{GoSet:[{GoUUID:a5a72d02-1a0f-4f7f-a8c5-6923a1c4274a}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0120 18:30:51.352647 6450 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI0120 18:30:51.352764 6450 address_set.go:302] 
New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI0120 18:30:51.352852 6450 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI0120 18:30:51.353085 6450 factory.go:1336] Added *v1.Node event handler 7\\\\nI0120 18:30:51.353196 6450 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0120 18:30:51.353707 6450 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0120 18:30:51.353849 6450 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0120 18:30:51.353898 6450 ovnkube.go:599] Stopped ovnkube\\\\nI0120 18:30:51.353933 6450 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0120 18:30:51.354047 6450 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net
.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68d98d68d006df3d69e253299966bef755778e0cdab71a9ff91d2829ae761672\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernet
es.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c68391eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c68391eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qt89w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:52Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.083973 4773 status_manager.go:875] "Failed to update status 
for pod" pod="openshift-multus/network-metrics-daemon-4jpbd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3791c4b7-dcef-470d-a67e-a2c0bb004436\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66lgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66lgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4jpbd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:52Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:52 crc 
kubenswrapper[4773]: I0120 18:30:52.102271 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de86ad0ce5469489a5f3eac4940913be3a897c3ec5f48526c7396b21bca92fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:52Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.117564 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965d3f61a7d29b4a851d13ea906f7d4d1809f89d314cd0e9b08ba897108625f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mount
Path\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48a81307df70a49c4cf901ede80a07616e64babebf760b7fb3f51fc71743812a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:52Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.133267 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfcadb32-181d-4e84-8078-53323164fc49\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72535f55bb65c332cd4acb4ad0bf8b94a28a2167da3bba993de6ffb1d6dca1cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5be06527f116faac5a7ce7b2df804239b382a394e11c56eee69fa24739faab3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff759d4098d334dfada2f6a14e8a5d363019e0da19879b5dca11ff48b6273b24\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29039ffb258d5056b02a3ef4f48c732b8573d9437d032a225b59f0ba95527796\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:52Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.150201 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:52Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.165070 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.165116 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.165127 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:52 crc 
kubenswrapper[4773]: I0120 18:30:52.165152 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.165163 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:52Z","lastTransitionTime":"2026-01-20T18:30:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.268197 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.268271 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.268296 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.268331 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.268356 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:52Z","lastTransitionTime":"2026-01-20T18:30:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.371890 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.371956 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.371968 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.371985 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.371997 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:52Z","lastTransitionTime":"2026-01-20T18:30:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.380123 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-13 13:23:42.791905919 +0000 UTC Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.446538 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:30:52 crc kubenswrapper[4773]: E0120 18:30:52.446661 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.446559 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:30:52 crc kubenswrapper[4773]: E0120 18:30:52.446729 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jpbd" podUID="3791c4b7-dcef-470d-a67e-a2c0bb004436" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.446538 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:30:52 crc kubenswrapper[4773]: E0120 18:30:52.447080 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.475104 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.475176 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.475210 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.475239 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.475258 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:52Z","lastTransitionTime":"2026-01-20T18:30:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.578194 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.578233 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.578243 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.578260 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.578272 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:52Z","lastTransitionTime":"2026-01-20T18:30:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.680628 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.680667 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.680677 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.680691 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.680701 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:52Z","lastTransitionTime":"2026-01-20T18:30:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.783216 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.783256 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.783267 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.783287 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.783302 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:52Z","lastTransitionTime":"2026-01-20T18:30:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.807763 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qt89w_f354424d-7f22-42d6-8bd9-00e32e78c3d3/ovnkube-controller/2.log" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.811661 4773 scope.go:117] "RemoveContainer" containerID="9358a6bba6c6f6bb98b1a268d22d96575da32a1a73003a6097a7bf1a9db92c0d" Jan 20 18:30:52 crc kubenswrapper[4773]: E0120 18:30:52.811811 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-qt89w_openshift-ovn-kubernetes(f354424d-7f22-42d6-8bd9-00e32e78c3d3)\"" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" podUID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.841730 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee8e82f1-ac2b-4e32-ace8-c3e6edfb55b4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17efc7a160f4cb934a3927c68e65b9f1387ed13881b8c9ca676d30c0d241ace0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b89a8e0fd81a1ccb6675b46b9a7425fec5158a53cc7e4314068fa737d0becda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f63cd12ee6d399e1a6f77a55f05a532a2de3b4cd05a5e10a93f521364473a403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d78e43bc7b09a1599f98e9e72fd0f49db94102e952a391f687d50abed277ceda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de08872865ebda9c9a4e1642d9c4d4f6c709908332ffbd091f992fad7722eca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:52Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.858960 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:52Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.874562 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://443153c40f503abca43f8d62276e852dfc67a4ad50beb817bcc4e7c9f1893d4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-20T18:30:52Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.886301 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.886368 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.886391 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.886420 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.886442 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:52Z","lastTransitionTime":"2026-01-20T18:30:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.898696 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kjbfj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ddd5104-3112-413e-b908-2b7f336b41f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a5c1f8407893fc5542044f2a612cb402c4529818efa2a76e03ecc09dafd07d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b981d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b981d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53e0dfb1a04051b30484f98897d86ff5adb9ef680437d01e43cd37b4a2404bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53e0dfb1a04051b30484f98897d86ff5adb9ef680437d01e43cd37b4a2404bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42063e8c09492c5670e90e5ced6405536f594cb5ae490cdacb00320e735d157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42063e8c09492c5670e90e5ced6405536f594cb5ae490cdacb00320e735d157\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kjbfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:52Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.922617 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bccxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"061a607e-1868-4fcf-b3ea-d51157511d41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5980e4a9cdd846b335d853a6c7c4fa7c9862c37c4858418919eae03855d094f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7
eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwtw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\"
:\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bccxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:52Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.935659 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gbn6k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7821f5e-4734-489f-bcf9-910b875a4848\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f76276c73de548ee85dd07ca424e1f6df2c7da35e7e57880473fc1175983fb37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-de
v/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcp99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f830c4eb2fd242a6e650833aeafd5935d16e005b2faa950f6507f656293ca458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcp99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gbn6k\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:52Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.953102 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965d3f61a7d29b4a851d13ea906f7d4d1809f89d314cd0e9b08ba897108625f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/sec
rets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48a81307df70a49c4cf901ede80a07616e64babebf760b7fb3f51fc71743812a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:52Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.967875 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ddd934f-f012-4083-b5e6-b99711071621\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63daf0eeb8fcb1c4b47e205c8ea4ca5071153c573cce6d1f4638e0d44a96d47e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e8755cb7894ae2c1832f3b0b8d8a6e33d3d336d
00ce68a2afe004d82304dd24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sq4x7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:52Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.988671 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.988724 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.988735 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:52 crc 
kubenswrapper[4773]: I0120 18:30:52.988754 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.988768 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:52Z","lastTransitionTime":"2026-01-20T18:30:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:52 crc kubenswrapper[4773]: I0120 18:30:52.998284 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f354424d-7f22-42d6-8bd9-00e32e78c3d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a000cfa332372a402d4d81eafc6f06b988196a3a847236a94d8166f0d745b07e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d36ecbc289d1bc2ebcba42b6cfe938692dd9b7e2b4e1adac18ac86a273d6bf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeac2ae7ead2dd1144c00709ea7162dfe1e360dc304726b031970bc989f11467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c6fa189877752f110d0f9539d08a9693872f9ee0350c63384acae1dea6afbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7767620b251e4c63c9397573c7915bd85831b833817ac6b939fc47d6304dc2bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc96ff67e107830b7e2fe8c3c7460ca9c7b22dfadab07ea69968162e9a1b4455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9358a6bba6c6f6bb98b1a268d22d96575da32a1a73003a6097a7bf1a9db92c0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9358a6bba6c6f6bb98b1a268d22d96575da32a1a73003a6097a7bf1a9db92c0d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-20T18:30:51Z\\\",\\\"message\\\":\\\"mns:[] Mutations:[{Column:policies Mutator:insert Value:{GoSet:[{GoUUID:a5a72d02-1a0f-4f7f-a8c5-6923a1c4274a}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0120 
18:30:51.352647 6450 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI0120 18:30:51.352764 6450 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI0120 18:30:51.352852 6450 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI0120 18:30:51.353085 6450 factory.go:1336] Added *v1.Node event handler 7\\\\nI0120 18:30:51.353196 6450 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0120 18:30:51.353707 6450 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0120 18:30:51.353849 6450 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0120 18:30:51.353898 6450 ovnkube.go:599] Stopped ovnkube\\\\nI0120 18:30:51.353933 6450 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0120 18:30:51.354047 6450 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:50Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qt89w_openshift-ovn-kubernetes(f354424d-7f22-42d6-8bd9-00e32e78c3d3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68d98d68d006df3d69e253299966bef755778e0cdab71a9ff91d2829ae761672\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c68391eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c683
91eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qt89w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:52Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.016839 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4jpbd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3791c4b7-dcef-470d-a67e-a2c0bb004436\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66lgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66lgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4jpbd\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:53Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.037734 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de86ad0ce5469489a5f3eac4940913be3a897c3ec5f48526c7396b21bca92fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/se
rving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:53Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.059045 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:53Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.077583 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfcadb32-181d-4e84-8078-53323164fc49\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72535f55bb65c332cd4acb4ad0bf8b94a28a2167da3bba993de6ffb1d6dca1cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5be06527f116faac5a7ce7b2df804239b382a394e11c56eee69fa24739faab3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff759d4098d334dfada2f6a14e8a5d363019e0da19879b5dca11ff48b6273b24\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29039ffb258d5056b02a3ef4f48c732b8573d9437d032a225b59f0ba95527796\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:53Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.090838 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.090886 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.090897 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.090917 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.090960 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:53Z","lastTransitionTime":"2026-01-20T18:30:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.098170 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:53Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.109134 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gczfj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"357ca347-8fa9-4f0b-9f49-a540f14e0198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48c821d7f174268680f0e1197457126c88076eded93bbcddb5a0d60b1fa9fc80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4b9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gczfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:53Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.120112 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5sv79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a565d2f-43a1-41f5-b7a6-85d7d0aea0a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4eb26301ab8154925399887204867f95a36003c308ac49adb8a2867ae29ac39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dls8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5sv79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:53Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.135889 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9b329af-a6e2-4ba8-b70d-f1ad0cd67671\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13ef504628d3252d01c26801cf91a4fabdfbd5d8a78e60e9666beafa709816a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af183a195b31fc8b191dc75db28863447424fb2763e1f51a5f37ce78c4b046e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05f5db3e67f63a4ed226186eec7eebf95687a59c3d742a7d29b2be3f99543a0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a1628ccb969f4cdde3babedb802c79d7bba385260dcefc8140ed674fb423da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab6fab567cc35a379572c4ea5fb0f9472f3634fcede213202390e12c5cf6f2f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-20T18:30:26Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0120 18:30:20.840178 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0120 18:30:20.842359 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1141381009/tls.crt::/tmp/serving-cert-1141381009/tls.key\\\\\\\"\\\\nI0120 18:30:26.279650 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0120 18:30:26.285211 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0120 18:30:26.285271 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0120 18:30:26.285327 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0120 18:30:26.285335 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0120 18:30:26.291960 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0120 18:30:26.291983 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291990 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291998 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0120 18:30:26.292002 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0120 18:30:26.292006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0120 18:30:26.292010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0120 18:30:26.292142 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0120 18:30:26.296450 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06ac78bba5a5a697d2246518bb04b7503b37c2fafd2369e5826dd02b467715d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727
ee9876662924585010f5fa93ee1d554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:53Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.193590 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.193630 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.193641 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.193675 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.193686 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:53Z","lastTransitionTime":"2026-01-20T18:30:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.292064 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.292138 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.292153 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.292180 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.292199 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:53Z","lastTransitionTime":"2026-01-20T18:30:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:53 crc kubenswrapper[4773]: E0120 18:30:53.307807 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ac96020-64a6-43b4-8bf4-975de5898510\\\",\\\"systemUUID\\\":\\\"3435f284-a40d-4f32-a1fa-55cd3339f30e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:53Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.311064 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.311102 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.311114 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.311131 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.311144 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:53Z","lastTransitionTime":"2026-01-20T18:30:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:53 crc kubenswrapper[4773]: E0120 18:30:53.323716 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ac96020-64a6-43b4-8bf4-975de5898510\\\",\\\"systemUUID\\\":\\\"3435f284-a40d-4f32-a1fa-55cd3339f30e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:53Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.327633 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.327679 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.327695 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.327716 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.327732 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:53Z","lastTransitionTime":"2026-01-20T18:30:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:53 crc kubenswrapper[4773]: E0120 18:30:53.340859 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ac96020-64a6-43b4-8bf4-975de5898510\\\",\\\"systemUUID\\\":\\\"3435f284-a40d-4f32-a1fa-55cd3339f30e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:53Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.343625 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.343658 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.343669 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.343683 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.343694 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:53Z","lastTransitionTime":"2026-01-20T18:30:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:53 crc kubenswrapper[4773]: E0120 18:30:53.354304 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ac96020-64a6-43b4-8bf4-975de5898510\\\",\\\"systemUUID\\\":\\\"3435f284-a40d-4f32-a1fa-55cd3339f30e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:53Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.357294 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.357331 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.357342 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.357358 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.357369 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:53Z","lastTransitionTime":"2026-01-20T18:30:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:53 crc kubenswrapper[4773]: E0120 18:30:53.371147 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:53Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:53Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:30:53Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:53Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ac96020-64a6-43b4-8bf4-975de5898510\\\",\\\"systemUUID\\\":\\\"3435f284-a40d-4f32-a1fa-55cd3339f30e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:53Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:53 crc kubenswrapper[4773]: E0120 18:30:53.371268 4773 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.372741 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.372777 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.372787 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.372801 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.372811 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:53Z","lastTransitionTime":"2026-01-20T18:30:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.380980 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-24 06:01:13.589640059 +0000 UTC Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.446603 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:30:53 crc kubenswrapper[4773]: E0120 18:30:53.446823 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.475394 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.475433 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.475444 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.475459 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.475470 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:53Z","lastTransitionTime":"2026-01-20T18:30:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.577672 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.577717 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.577730 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.577744 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.577758 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:53Z","lastTransitionTime":"2026-01-20T18:30:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.679707 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.679752 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.679763 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.679777 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.679786 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:53Z","lastTransitionTime":"2026-01-20T18:30:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.781670 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.781727 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.781739 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.781756 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.781768 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:53Z","lastTransitionTime":"2026-01-20T18:30:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.884775 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.884817 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.884826 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.884841 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.884851 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:53Z","lastTransitionTime":"2026-01-20T18:30:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.987033 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.987069 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.987080 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.987095 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:53 crc kubenswrapper[4773]: I0120 18:30:53.987105 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:53Z","lastTransitionTime":"2026-01-20T18:30:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:54 crc kubenswrapper[4773]: I0120 18:30:54.090127 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:54 crc kubenswrapper[4773]: I0120 18:30:54.090171 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:54 crc kubenswrapper[4773]: I0120 18:30:54.090184 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:54 crc kubenswrapper[4773]: I0120 18:30:54.090201 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:54 crc kubenswrapper[4773]: I0120 18:30:54.090213 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:54Z","lastTransitionTime":"2026-01-20T18:30:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:54 crc kubenswrapper[4773]: I0120 18:30:54.192761 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:54 crc kubenswrapper[4773]: I0120 18:30:54.192796 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:54 crc kubenswrapper[4773]: I0120 18:30:54.192808 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:54 crc kubenswrapper[4773]: I0120 18:30:54.192831 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:54 crc kubenswrapper[4773]: I0120 18:30:54.192845 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:54Z","lastTransitionTime":"2026-01-20T18:30:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:54 crc kubenswrapper[4773]: I0120 18:30:54.296035 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:54 crc kubenswrapper[4773]: I0120 18:30:54.296064 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:54 crc kubenswrapper[4773]: I0120 18:30:54.296071 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:54 crc kubenswrapper[4773]: I0120 18:30:54.296084 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:54 crc kubenswrapper[4773]: I0120 18:30:54.296091 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:54Z","lastTransitionTime":"2026-01-20T18:30:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:54 crc kubenswrapper[4773]: I0120 18:30:54.382021 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 20:54:49.631810337 +0000 UTC Jan 20 18:30:54 crc kubenswrapper[4773]: I0120 18:30:54.398881 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:54 crc kubenswrapper[4773]: I0120 18:30:54.398904 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:54 crc kubenswrapper[4773]: I0120 18:30:54.398912 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:54 crc kubenswrapper[4773]: I0120 18:30:54.398923 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:54 crc kubenswrapper[4773]: I0120 18:30:54.398956 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:54Z","lastTransitionTime":"2026-01-20T18:30:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:54 crc kubenswrapper[4773]: I0120 18:30:54.446553 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:30:54 crc kubenswrapper[4773]: E0120 18:30:54.446684 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jpbd" podUID="3791c4b7-dcef-470d-a67e-a2c0bb004436" Jan 20 18:30:54 crc kubenswrapper[4773]: I0120 18:30:54.447063 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:30:54 crc kubenswrapper[4773]: E0120 18:30:54.447110 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 18:30:54 crc kubenswrapper[4773]: I0120 18:30:54.446504 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:30:54 crc kubenswrapper[4773]: E0120 18:30:54.447227 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 18:30:54 crc kubenswrapper[4773]: I0120 18:30:54.501048 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:54 crc kubenswrapper[4773]: I0120 18:30:54.501106 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:54 crc kubenswrapper[4773]: I0120 18:30:54.501118 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:54 crc kubenswrapper[4773]: I0120 18:30:54.501136 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:54 crc kubenswrapper[4773]: I0120 18:30:54.501151 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:54Z","lastTransitionTime":"2026-01-20T18:30:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:54 crc kubenswrapper[4773]: I0120 18:30:54.603584 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:54 crc kubenswrapper[4773]: I0120 18:30:54.603621 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:54 crc kubenswrapper[4773]: I0120 18:30:54.603629 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:54 crc kubenswrapper[4773]: I0120 18:30:54.603643 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:54 crc kubenswrapper[4773]: I0120 18:30:54.603654 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:54Z","lastTransitionTime":"2026-01-20T18:30:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:54 crc kubenswrapper[4773]: I0120 18:30:54.706732 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:54 crc kubenswrapper[4773]: I0120 18:30:54.706777 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:54 crc kubenswrapper[4773]: I0120 18:30:54.706789 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:54 crc kubenswrapper[4773]: I0120 18:30:54.706805 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:54 crc kubenswrapper[4773]: I0120 18:30:54.706818 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:54Z","lastTransitionTime":"2026-01-20T18:30:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:54 crc kubenswrapper[4773]: I0120 18:30:54.810187 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:54 crc kubenswrapper[4773]: I0120 18:30:54.810230 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:54 crc kubenswrapper[4773]: I0120 18:30:54.810243 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:54 crc kubenswrapper[4773]: I0120 18:30:54.810262 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:54 crc kubenswrapper[4773]: I0120 18:30:54.810275 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:54Z","lastTransitionTime":"2026-01-20T18:30:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:54 crc kubenswrapper[4773]: I0120 18:30:54.913746 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:54 crc kubenswrapper[4773]: I0120 18:30:54.913806 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:54 crc kubenswrapper[4773]: I0120 18:30:54.913819 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:54 crc kubenswrapper[4773]: I0120 18:30:54.913834 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:54 crc kubenswrapper[4773]: I0120 18:30:54.913845 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:54Z","lastTransitionTime":"2026-01-20T18:30:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:55 crc kubenswrapper[4773]: I0120 18:30:55.016337 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:55 crc kubenswrapper[4773]: I0120 18:30:55.016371 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:55 crc kubenswrapper[4773]: I0120 18:30:55.016379 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:55 crc kubenswrapper[4773]: I0120 18:30:55.016391 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:55 crc kubenswrapper[4773]: I0120 18:30:55.016399 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:55Z","lastTransitionTime":"2026-01-20T18:30:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:55 crc kubenswrapper[4773]: I0120 18:30:55.118814 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:55 crc kubenswrapper[4773]: I0120 18:30:55.118862 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:55 crc kubenswrapper[4773]: I0120 18:30:55.118874 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:55 crc kubenswrapper[4773]: I0120 18:30:55.118893 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:55 crc kubenswrapper[4773]: I0120 18:30:55.118907 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:55Z","lastTransitionTime":"2026-01-20T18:30:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:55 crc kubenswrapper[4773]: I0120 18:30:55.221758 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:55 crc kubenswrapper[4773]: I0120 18:30:55.221812 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:55 crc kubenswrapper[4773]: I0120 18:30:55.221824 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:55 crc kubenswrapper[4773]: I0120 18:30:55.221841 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:55 crc kubenswrapper[4773]: I0120 18:30:55.221851 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:55Z","lastTransitionTime":"2026-01-20T18:30:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:55 crc kubenswrapper[4773]: I0120 18:30:55.325459 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:55 crc kubenswrapper[4773]: I0120 18:30:55.325525 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:55 crc kubenswrapper[4773]: I0120 18:30:55.325537 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:55 crc kubenswrapper[4773]: I0120 18:30:55.325552 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:55 crc kubenswrapper[4773]: I0120 18:30:55.325564 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:55Z","lastTransitionTime":"2026-01-20T18:30:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:55 crc kubenswrapper[4773]: I0120 18:30:55.382821 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-22 22:09:05.717255496 +0000 UTC Jan 20 18:30:55 crc kubenswrapper[4773]: I0120 18:30:55.427978 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:55 crc kubenswrapper[4773]: I0120 18:30:55.428018 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:55 crc kubenswrapper[4773]: I0120 18:30:55.428027 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:55 crc kubenswrapper[4773]: I0120 18:30:55.428040 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:55 crc kubenswrapper[4773]: I0120 18:30:55.428049 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:55Z","lastTransitionTime":"2026-01-20T18:30:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:55 crc kubenswrapper[4773]: I0120 18:30:55.446947 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:30:55 crc kubenswrapper[4773]: E0120 18:30:55.447083 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 18:30:55 crc kubenswrapper[4773]: I0120 18:30:55.530782 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:55 crc kubenswrapper[4773]: I0120 18:30:55.530847 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:55 crc kubenswrapper[4773]: I0120 18:30:55.530866 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:55 crc kubenswrapper[4773]: I0120 18:30:55.530891 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:55 crc kubenswrapper[4773]: I0120 18:30:55.530914 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:55Z","lastTransitionTime":"2026-01-20T18:30:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:55 crc kubenswrapper[4773]: I0120 18:30:55.634295 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:55 crc kubenswrapper[4773]: I0120 18:30:55.634361 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:55 crc kubenswrapper[4773]: I0120 18:30:55.634374 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:55 crc kubenswrapper[4773]: I0120 18:30:55.634392 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:55 crc kubenswrapper[4773]: I0120 18:30:55.634403 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:55Z","lastTransitionTime":"2026-01-20T18:30:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:55 crc kubenswrapper[4773]: I0120 18:30:55.736825 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:55 crc kubenswrapper[4773]: I0120 18:30:55.736917 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:55 crc kubenswrapper[4773]: I0120 18:30:55.736973 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:55 crc kubenswrapper[4773]: I0120 18:30:55.737010 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:55 crc kubenswrapper[4773]: I0120 18:30:55.737034 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:55Z","lastTransitionTime":"2026-01-20T18:30:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:55 crc kubenswrapper[4773]: I0120 18:30:55.839725 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:55 crc kubenswrapper[4773]: I0120 18:30:55.839845 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:55 crc kubenswrapper[4773]: I0120 18:30:55.839873 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:55 crc kubenswrapper[4773]: I0120 18:30:55.839902 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:55 crc kubenswrapper[4773]: I0120 18:30:55.839928 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:55Z","lastTransitionTime":"2026-01-20T18:30:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:55 crc kubenswrapper[4773]: I0120 18:30:55.942432 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:55 crc kubenswrapper[4773]: I0120 18:30:55.942503 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:55 crc kubenswrapper[4773]: I0120 18:30:55.942520 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:55 crc kubenswrapper[4773]: I0120 18:30:55.942544 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:55 crc kubenswrapper[4773]: I0120 18:30:55.942561 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:55Z","lastTransitionTime":"2026-01-20T18:30:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:56 crc kubenswrapper[4773]: I0120 18:30:56.046815 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:56 crc kubenswrapper[4773]: I0120 18:30:56.046868 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:56 crc kubenswrapper[4773]: I0120 18:30:56.046888 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:56 crc kubenswrapper[4773]: I0120 18:30:56.046914 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:56 crc kubenswrapper[4773]: I0120 18:30:56.047063 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:56Z","lastTransitionTime":"2026-01-20T18:30:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:56 crc kubenswrapper[4773]: I0120 18:30:56.150833 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:56 crc kubenswrapper[4773]: I0120 18:30:56.150900 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:56 crc kubenswrapper[4773]: I0120 18:30:56.150922 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:56 crc kubenswrapper[4773]: I0120 18:30:56.150997 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:56 crc kubenswrapper[4773]: I0120 18:30:56.151017 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:56Z","lastTransitionTime":"2026-01-20T18:30:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:56 crc kubenswrapper[4773]: I0120 18:30:56.254563 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:56 crc kubenswrapper[4773]: I0120 18:30:56.254640 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:56 crc kubenswrapper[4773]: I0120 18:30:56.254661 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:56 crc kubenswrapper[4773]: I0120 18:30:56.254693 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:56 crc kubenswrapper[4773]: I0120 18:30:56.254718 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:56Z","lastTransitionTime":"2026-01-20T18:30:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:56 crc kubenswrapper[4773]: I0120 18:30:56.357628 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:56 crc kubenswrapper[4773]: I0120 18:30:56.357675 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:56 crc kubenswrapper[4773]: I0120 18:30:56.357694 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:56 crc kubenswrapper[4773]: I0120 18:30:56.357720 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:56 crc kubenswrapper[4773]: I0120 18:30:56.357744 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:56Z","lastTransitionTime":"2026-01-20T18:30:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:56 crc kubenswrapper[4773]: I0120 18:30:56.383259 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-10 17:38:19.362707641 +0000 UTC Jan 20 18:30:56 crc kubenswrapper[4773]: I0120 18:30:56.446972 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:30:56 crc kubenswrapper[4773]: I0120 18:30:56.447013 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:30:56 crc kubenswrapper[4773]: E0120 18:30:56.447207 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 18:30:56 crc kubenswrapper[4773]: I0120 18:30:56.447325 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:30:56 crc kubenswrapper[4773]: E0120 18:30:56.447613 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jpbd" podUID="3791c4b7-dcef-470d-a67e-a2c0bb004436" Jan 20 18:30:56 crc kubenswrapper[4773]: E0120 18:30:56.447841 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 18:30:56 crc kubenswrapper[4773]: I0120 18:30:56.460663 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:56 crc kubenswrapper[4773]: I0120 18:30:56.460712 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:56 crc kubenswrapper[4773]: I0120 18:30:56.460722 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:56 crc kubenswrapper[4773]: I0120 18:30:56.460736 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:56 crc kubenswrapper[4773]: I0120 18:30:56.460747 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:56Z","lastTransitionTime":"2026-01-20T18:30:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:56 crc kubenswrapper[4773]: I0120 18:30:56.563079 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:56 crc kubenswrapper[4773]: I0120 18:30:56.563145 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:56 crc kubenswrapper[4773]: I0120 18:30:56.563158 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:56 crc kubenswrapper[4773]: I0120 18:30:56.563180 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:56 crc kubenswrapper[4773]: I0120 18:30:56.563199 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:56Z","lastTransitionTime":"2026-01-20T18:30:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:56 crc kubenswrapper[4773]: I0120 18:30:56.667884 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:56 crc kubenswrapper[4773]: I0120 18:30:56.667941 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:56 crc kubenswrapper[4773]: I0120 18:30:56.667954 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:56 crc kubenswrapper[4773]: I0120 18:30:56.667982 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:56 crc kubenswrapper[4773]: I0120 18:30:56.667994 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:56Z","lastTransitionTime":"2026-01-20T18:30:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:56 crc kubenswrapper[4773]: I0120 18:30:56.770795 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:56 crc kubenswrapper[4773]: I0120 18:30:56.770867 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:56 crc kubenswrapper[4773]: I0120 18:30:56.770890 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:56 crc kubenswrapper[4773]: I0120 18:30:56.770919 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:56 crc kubenswrapper[4773]: I0120 18:30:56.770985 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:56Z","lastTransitionTime":"2026-01-20T18:30:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:56 crc kubenswrapper[4773]: I0120 18:30:56.855398 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3791c4b7-dcef-470d-a67e-a2c0bb004436-metrics-certs\") pod \"network-metrics-daemon-4jpbd\" (UID: \"3791c4b7-dcef-470d-a67e-a2c0bb004436\") " pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:30:56 crc kubenswrapper[4773]: E0120 18:30:56.855585 4773 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 20 18:30:56 crc kubenswrapper[4773]: E0120 18:30:56.855638 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3791c4b7-dcef-470d-a67e-a2c0bb004436-metrics-certs podName:3791c4b7-dcef-470d-a67e-a2c0bb004436 nodeName:}" failed. No retries permitted until 2026-01-20 18:31:12.855625778 +0000 UTC m=+65.777438802 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3791c4b7-dcef-470d-a67e-a2c0bb004436-metrics-certs") pod "network-metrics-daemon-4jpbd" (UID: "3791c4b7-dcef-470d-a67e-a2c0bb004436") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 20 18:30:56 crc kubenswrapper[4773]: I0120 18:30:56.873762 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:56 crc kubenswrapper[4773]: I0120 18:30:56.873827 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:56 crc kubenswrapper[4773]: I0120 18:30:56.873849 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:56 crc kubenswrapper[4773]: I0120 18:30:56.873877 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:56 crc kubenswrapper[4773]: I0120 18:30:56.873896 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:56Z","lastTransitionTime":"2026-01-20T18:30:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:56 crc kubenswrapper[4773]: I0120 18:30:56.977111 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:56 crc kubenswrapper[4773]: I0120 18:30:56.977166 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:56 crc kubenswrapper[4773]: I0120 18:30:56.977179 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:56 crc kubenswrapper[4773]: I0120 18:30:56.977200 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:56 crc kubenswrapper[4773]: I0120 18:30:56.977214 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:56Z","lastTransitionTime":"2026-01-20T18:30:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.080077 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.080164 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.080188 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.080226 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.080267 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:57Z","lastTransitionTime":"2026-01-20T18:30:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.184143 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.184197 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.184211 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.184233 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.184245 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:57Z","lastTransitionTime":"2026-01-20T18:30:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.286983 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.287023 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.287032 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.287049 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.287061 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:57Z","lastTransitionTime":"2026-01-20T18:30:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.383947 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-29 04:22:37.333042535 +0000 UTC Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.390598 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.390660 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.390678 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.390704 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.390725 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:57Z","lastTransitionTime":"2026-01-20T18:30:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.447124 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:30:57 crc kubenswrapper[4773]: E0120 18:30:57.447316 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.465312 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965d3f61a7d29b4a851d13ea906f7d4d1809f89d314cd0e9b08ba897108625f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\
"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48a81307df70a49c4cf901ede80a07616e64babebf760b7fb3f51fc71743812a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:57Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.480038 4773 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ddd934f-f012-4083-b5e6-b99711071621\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63daf0eeb8fcb1c4b47e205c8ea4ca5071153c573cce6d1f4638e0d44a96d47e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\
\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e8755cb7894ae2c1832f3b0b8d8a6e33d3d336d00ce68a2afe004d82304dd24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sq4x7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:57Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.493808 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.493852 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.493866 4773 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.493890 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.493905 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:57Z","lastTransitionTime":"2026-01-20T18:30:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.509443 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f354424d-7f22-42d6-8bd9-00e32e78c3d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a000cfa332372a402d4d81eafc6f06b988196a3a847236a94d8166f0d745b07e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d36ecbc289d1bc2ebcba42b6cfe938692dd9b7e2b4e1adac18ac86a273d6bf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeac2ae7ead2dd1144c00709ea7162dfe1e360dc304726b031970bc989f11467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c6fa189877752f110d0f9539d08a9693872f9ee0350c63384acae1dea6afbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7767620b251e4c63c9397573c7915bd85831b833817ac6b939fc47d6304dc2bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc96ff67e107830b7e2fe8c3c7460ca9c7b22dfadab07ea69968162e9a1b4455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9358a6bba6c6f6bb98b1a268d22d96575da32a1a73003a6097a7bf1a9db92c0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9358a6bba6c6f6bb98b1a268d22d96575da32a1a73003a6097a7bf1a9db92c0d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-20T18:30:51Z\\\",\\\"message\\\":\\\"mns:[] Mutations:[{Column:policies Mutator:insert Value:{GoSet:[{GoUUID:a5a72d02-1a0f-4f7f-a8c5-6923a1c4274a}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0120 
18:30:51.352647 6450 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI0120 18:30:51.352764 6450 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI0120 18:30:51.352852 6450 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI0120 18:30:51.353085 6450 factory.go:1336] Added *v1.Node event handler 7\\\\nI0120 18:30:51.353196 6450 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0120 18:30:51.353707 6450 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0120 18:30:51.353849 6450 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0120 18:30:51.353898 6450 ovnkube.go:599] Stopped ovnkube\\\\nI0120 18:30:51.353933 6450 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0120 18:30:51.354047 6450 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:50Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qt89w_openshift-ovn-kubernetes(f354424d-7f22-42d6-8bd9-00e32e78c3d3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68d98d68d006df3d69e253299966bef755778e0cdab71a9ff91d2829ae761672\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c68391eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c683
91eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qt89w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:57Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.528623 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4jpbd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3791c4b7-dcef-470d-a67e-a2c0bb004436\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66lgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66lgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4jpbd\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:57Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.550735 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de86ad0ce5469489a5f3eac4940913be3a897c3ec5f48526c7396b21bca92fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/se
rving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:57Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.571287 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:57Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.594172 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfcadb32-181d-4e84-8078-53323164fc49\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72535f55bb65c332cd4acb4ad0bf8b94a28a2167da3bba993de6ffb1d6dca1cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5be06527f116faac5a7ce7b2df804239b382a394e11c56eee69fa24739faab3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff759d4098d334dfada2f6a14e8a5d363019e0da19879b5dca11ff48b6273b24\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29039ffb258d5056b02a3ef4f48c732b8573d9437d032a225b59f0ba95527796\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:57Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.597013 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.597069 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.597084 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.597102 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.597113 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:57Z","lastTransitionTime":"2026-01-20T18:30:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.613396 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:57Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.630077 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gczfj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"357ca347-8fa9-4f0b-9f49-a540f14e0198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48c821d7f174268680f0e1197457126c88076eded93bbcddb5a0d60b1fa9fc80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4b9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gczfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:57Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.663493 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5sv79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a565d2f-43a1-41f5-b7a6-85d7d0aea0a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4eb26301ab8154925399887204867f95a36003c308ac49adb8a2867ae29ac39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dls8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5sv79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:57Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.681964 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9b329af-a6e2-4ba8-b70d-f1ad0cd67671\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13ef504628d3252d01c26801cf91a4fabdfbd5d8a78e60e9666beafa709816a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af183a195b31fc8b191dc75db28863447424fb2763e1f51a5f37ce78c4b046e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05f5db3e67f63a4ed226186eec7eebf95687a59c3d742a7d29b2be3f99543a0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a1628ccb969f4cdde3babedb802c79d7bba385260dcefc8140ed674fb423da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab6fab567cc35a379572c4ea5fb0f9472f3634fcede213202390e12c5cf6f2f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-20T18:30:26Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0120 18:30:20.840178 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0120 18:30:20.842359 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1141381009/tls.crt::/tmp/serving-cert-1141381009/tls.key\\\\\\\"\\\\nI0120 18:30:26.279650 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0120 18:30:26.285211 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0120 18:30:26.285271 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0120 18:30:26.285327 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0120 18:30:26.285335 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0120 18:30:26.291960 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0120 18:30:26.291983 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291990 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291998 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0120 18:30:26.292002 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0120 18:30:26.292006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0120 18:30:26.292010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0120 18:30:26.292142 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0120 18:30:26.296450 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06ac78bba5a5a697d2246518bb04b7503b37c2fafd2369e5826dd02b467715d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727
ee9876662924585010f5fa93ee1d554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:57Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.700429 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.700495 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.700517 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.700546 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.700567 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:57Z","lastTransitionTime":"2026-01-20T18:30:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.705881 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee8e82f1-ac2b-4e32-ace8-c3e6edfb55b4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17efc7a160f4cb934a3927c68e65b9f1387ed13881b8c9ca676d30c0d241ace0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b89a8e0fd81a1ccb6675b46b9a7425fec5158a53cc7e4314068fa737d0becda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f63cd12ee6d399e1a6f77a55f05a532a2de3b4cd05a5e10a93f521364473a403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d78e43bc7b09a1599f98e9e72fd0f49db94102e952a391f687d50abed277ceda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de08872865ebda9c9a4e1642d9c4d4f6c709908332ffbd091f992fad7722eca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:57Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.723208 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:57Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.740912 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://443153c40f503abca43f8d62276e852dfc67a4ad50beb817bcc4e7c9f1893d4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-20T18:30:57Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.765884 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kjbfj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ddd5104-3112-413e-b908-2b7f336b41f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a5c1f8407893fc5542044f2a612cb402c4529818efa2a76e03ecc09dafd07d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b981d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b981d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53e0dfb1a04051b30484f98897d86ff5adb9ef680437d01e43cd37b4a2404bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53e0dfb1a04051b30484f98897d86ff5adb9ef680437d01e43cd37b4a2404bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42063e8c09492c5670e90e5ced6405536f594cb5ae490cdacb00320e735d157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42063e8c09492c5670e90e5ced6405536f594cb5ae490cdacb00320e735d157\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kjbfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:57Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.796776 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bccxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"061a607e-1868-4fcf-b3ea-d51157511d41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5980e4a9cdd846b335d853a6c7c4fa7c9862c37c4858418919eae03855d094f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwtw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIP
s\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bccxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:57Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.803512 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.803577 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.803599 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.803632 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.803662 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:57Z","lastTransitionTime":"2026-01-20T18:30:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.818989 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gbn6k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7821f5e-4734-489f-bcf9-910b875a4848\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f76276c73de548ee85dd07ca424e1f6df2c7da35e7e57880473fc1175983fb37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcp99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f830c4eb2fd242a6e650833aeafd5935d16e005b2faa950f6507f656293ca458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcp99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gbn6k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:57Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.906151 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:57 crc 
kubenswrapper[4773]: I0120 18:30:57.906215 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.906234 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.906261 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:57 crc kubenswrapper[4773]: I0120 18:30:57.906280 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:57Z","lastTransitionTime":"2026-01-20T18:30:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.009834 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.009893 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.009910 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.009966 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.009988 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:58Z","lastTransitionTime":"2026-01-20T18:30:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.112653 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.112718 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.112737 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.112766 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.112788 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:58Z","lastTransitionTime":"2026-01-20T18:30:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.170694 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.170885 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:30:58 crc kubenswrapper[4773]: E0120 18:30:58.171171 4773 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 20 18:30:58 crc kubenswrapper[4773]: E0120 18:30:58.171316 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-20 18:31:30.171279992 +0000 UTC m=+83.093093046 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 20 18:30:58 crc kubenswrapper[4773]: E0120 18:30:58.171810 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 18:31:30.171784123 +0000 UTC m=+83.093597207 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.216819 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.216872 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.216883 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.216901 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.216911 4773 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:58Z","lastTransitionTime":"2026-01-20T18:30:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.272664 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.272755 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.272797 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:30:58 crc kubenswrapper[4773]: E0120 18:30:58.273109 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 20 18:30:58 crc kubenswrapper[4773]: E0120 18:30:58.273140 4773 configmap.go:193] 
Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 20 18:30:58 crc kubenswrapper[4773]: E0120 18:30:58.273172 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 20 18:30:58 crc kubenswrapper[4773]: E0120 18:30:58.273211 4773 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 20 18:30:58 crc kubenswrapper[4773]: E0120 18:30:58.273275 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-20 18:31:30.273237742 +0000 UTC m=+83.195050806 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 20 18:30:58 crc kubenswrapper[4773]: E0120 18:30:58.273131 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 20 18:30:58 crc kubenswrapper[4773]: E0120 18:30:58.273306 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2026-01-20 18:31:30.273292853 +0000 UTC m=+83.195105917 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 20 18:30:58 crc kubenswrapper[4773]: E0120 18:30:58.273323 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 20 18:30:58 crc kubenswrapper[4773]: E0120 18:30:58.273347 4773 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 20 18:30:58 crc kubenswrapper[4773]: E0120 18:30:58.273431 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-20 18:31:30.273403696 +0000 UTC m=+83.195216770 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.320147 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.320450 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.320684 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.321160 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.321433 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:58Z","lastTransitionTime":"2026-01-20T18:30:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.384624 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-18 01:24:52.751688167 +0000 UTC Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.424177 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.424612 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.424849 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.425116 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.425340 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:58Z","lastTransitionTime":"2026-01-20T18:30:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.446630 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:30:58 crc kubenswrapper[4773]: E0120 18:30:58.446757 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.446856 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:30:58 crc kubenswrapper[4773]: E0120 18:30:58.447018 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jpbd" podUID="3791c4b7-dcef-470d-a67e-a2c0bb004436" Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.447194 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:30:58 crc kubenswrapper[4773]: E0120 18:30:58.447446 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.528189 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.528284 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.528307 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.528344 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.528374 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:58Z","lastTransitionTime":"2026-01-20T18:30:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.632207 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.632298 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.632324 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.632358 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.632392 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:58Z","lastTransitionTime":"2026-01-20T18:30:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.736547 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.736598 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.736607 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.736625 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.736638 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:58Z","lastTransitionTime":"2026-01-20T18:30:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.840518 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.840579 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.840596 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.840623 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.840644 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:58Z","lastTransitionTime":"2026-01-20T18:30:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.944281 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.944664 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.944921 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.945155 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:58 crc kubenswrapper[4773]: I0120 18:30:58.945282 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:58Z","lastTransitionTime":"2026-01-20T18:30:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.058244 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.058336 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.058366 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.058401 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.058426 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:59Z","lastTransitionTime":"2026-01-20T18:30:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.162079 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.162130 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.162147 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.162170 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.162187 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:59Z","lastTransitionTime":"2026-01-20T18:30:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.266120 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.266191 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.266210 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.266239 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.266261 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:59Z","lastTransitionTime":"2026-01-20T18:30:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.370607 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.370663 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.370679 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.370703 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.370720 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:59Z","lastTransitionTime":"2026-01-20T18:30:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.386142 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 23:53:13.519491254 +0000 UTC Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.446364 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:30:59 crc kubenswrapper[4773]: E0120 18:30:59.446499 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.474224 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.474279 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.474298 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.474323 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.474343 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:59Z","lastTransitionTime":"2026-01-20T18:30:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.577879 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.577955 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.577968 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.577986 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.578000 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:59Z","lastTransitionTime":"2026-01-20T18:30:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.681365 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.681392 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.681401 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.681413 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.681423 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:59Z","lastTransitionTime":"2026-01-20T18:30:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.784717 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.785223 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.785375 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.785508 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.785625 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:59Z","lastTransitionTime":"2026-01-20T18:30:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.818879 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.838729 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.847787 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9b329af-a6e2-4ba8-b70d-f1ad0cd67671\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13ef504628d3252d01c26801cf91a4fabdfbd5d8a78e60e9666beafa709816a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":
true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af183a195b31fc8b191dc75db28863447424fb2763e1f51a5f37ce78c4b046e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05f5db3e67f63a4ed226186eec7eebf95687a59c3d742a7d29b2be3f99543a0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"cont
ainerID\\\":\\\"cri-o://c1a1628ccb969f4cdde3babedb802c79d7bba385260dcefc8140ed674fb423da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab6fab567cc35a379572c4ea5fb0f9472f3634fcede213202390e12c5cf6f2f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0120 18:30:20.840178 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0120 18:30:20.842359 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1141381009/tls.crt::/tmp/serving-cert-1141381009/tls.key\\\\\\\"\\\\nI0120 18:30:26.279650 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0120 18:30:26.285211 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0120 18:30:26.285271 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0120 18:30:26.285327 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0120 18:30:26.285335 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0120 18:30:26.291960 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0120 18:30:26.291983 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291990 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291998 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0120 18:30:26.292002 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0120 18:30:26.292006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0120 18:30:26.292010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0120 18:30:26.292142 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0120 18:30:26.296450 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06ac78bba5a5a697d2246518bb04b7503b37c2fafd2369e5826dd02b467715d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContaine
rStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:59Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.871150 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:59Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.883675 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gczfj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"357ca347-8fa9-4f0b-9f49-a540f14e0198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48c821d7f174268680f0e1197457126c88076eded93bbcddb5a0d60b1fa9fc80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4b9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gczfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:59Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.888721 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.888770 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.888783 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.888804 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.888818 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:59Z","lastTransitionTime":"2026-01-20T18:30:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.897903 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5sv79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a565d2f-43a1-41f5-b7a6-85d7d0aea0a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4eb26301ab8154925399887204867f95a36003c308ac49adb8a2867ae29ac39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dls8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5sv79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:59Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.920913 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kjbfj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ddd5104-3112-413e-b908-2b7f336b41f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a5c1f8407893fc5542
044f2a612cb402c4529818efa2a76e03ecc09dafd07d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64
d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b981d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b981d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53e0dfb1a04051b30484f98897d86ff5adb9ef680437d01e43cd37b4a2404bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53e0dfb1a04051b30484f98897d86ff5adb9ef680437d01e43cd37b4a2404bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42063e8c09492c5670e90e5ced6405536f594cb5ae490cdacb00320e735d157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\
\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42063e8c09492c5670e90e5ced6405536f594cb5ae490cdacb00320e735d157\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kjbfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:59Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.937393 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bccxn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"061a607e-1868-4fcf-b3ea-d51157511d41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5980e4a9cdd846b335d853a6c7c4fa7c9862c37c4858418919eae03855d094f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwtw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bccxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:59Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.949672 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gbn6k" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7821f5e-4734-489f-bcf9-910b875a4848\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f76276c73de548ee85dd07ca424e1f6df2c7da35e7e57880473fc1175983fb37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcp99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f830c4eb2
fd242a6e650833aeafd5935d16e005b2faa950f6507f656293ca458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcp99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gbn6k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:59Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.971637 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee8e82f1-ac2b-4e32-ace8-c3e6edfb55b4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17efc7a160f4cb934a3927c68e65b9f1387ed13881b8c9ca676d30c0d241ace0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b89a8e0fd81a1ccb6675b46b9a7425fec5158a53cc7e4314068fa737d0becda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f63cd12ee6d399e1a6f77a55f05a532a2de3b4cd05a5e10a93f521364473a403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d78e43bc7b09a1599f98e9e72fd0f49db94102e952a391f687d50abed277ceda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de08872865ebda9c9a4e1642d9c4d4f6c709908332ffbd091f992fad7722eca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:59Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.982792 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:59Z is after 2025-08-24T17:21:41Z" Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.990987 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.991024 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.991037 4773 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.991056 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.991068 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:30:59Z","lastTransitionTime":"2026-01-20T18:30:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:30:59 crc kubenswrapper[4773]: I0120 18:30:59.993102 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://443153c40f503abca43f8d62276e852dfc67a4ad50beb817bcc4e7c9f1893d4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f
799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:30:59Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:00 crc kubenswrapper[4773]: I0120 18:31:00.004207 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de86ad0ce5469489a5f3eac4940913be3a897c3ec5f48526c7396b21bca92fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:00Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:00 crc kubenswrapper[4773]: I0120 18:31:00.016023 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965d3f61a7d29b4a851d13ea906f7d4d1809f89d314cd0e9b08ba897108625f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://48a81307df70a49c4cf901ede80a07616e64babebf760b7fb3f51fc71743812a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:00Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:00 crc kubenswrapper[4773]: I0120 18:31:00.025879 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ddd934f-f012-4083-b5e6-b99711071621\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63daf0eeb8fcb1c4b47e205c8ea4ca5071153c573cce6d1f4638e0d44a96d47e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e8755cb7894ae2c1832f3b0b8d8a6e33d3d336d
00ce68a2afe004d82304dd24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sq4x7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:00Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:00 crc kubenswrapper[4773]: I0120 18:31:00.043578 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f354424d-7f22-42d6-8bd9-00e32e78c3d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a000cfa332372a402d4d81eafc6f06b988196a3a847236a94d8166f0d745b07e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d36ecbc289d1bc2ebcba42b6cfe938692dd9b7e2b4e1adac18ac86a273d6bf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeac2ae7ead2dd1144c00709ea7162dfe1e360dc304726b031970bc989f11467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c6fa189877752f110d0f9539d08a9693872f9ee0350c63384acae1dea6afbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7767620b251e4c63c9397573c7915bd85831b833817ac6b939fc47d6304dc2bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc96ff67e107830b7e2fe8c3c7460ca9c7b22dfadab07ea69968162e9a1b4455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9358a6bba6c6f6bb98b1a268d22d96575da32a1a73003a6097a7bf1a9db92c0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9358a6bba6c6f6bb98b1a268d22d96575da32a1a73003a6097a7bf1a9db92c0d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-20T18:30:51Z\\\",\\\"message\\\":\\\"mns:[] Mutations:[{Column:policies Mutator:insert Value:{GoSet:[{GoUUID:a5a72d02-1a0f-4f7f-a8c5-6923a1c4274a}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0120 
18:30:51.352647 6450 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI0120 18:30:51.352764 6450 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI0120 18:30:51.352852 6450 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI0120 18:30:51.353085 6450 factory.go:1336] Added *v1.Node event handler 7\\\\nI0120 18:30:51.353196 6450 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0120 18:30:51.353707 6450 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0120 18:30:51.353849 6450 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0120 18:30:51.353898 6450 ovnkube.go:599] Stopped ovnkube\\\\nI0120 18:30:51.353933 6450 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0120 18:30:51.354047 6450 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:50Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qt89w_openshift-ovn-kubernetes(f354424d-7f22-42d6-8bd9-00e32e78c3d3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68d98d68d006df3d69e253299966bef755778e0cdab71a9ff91d2829ae761672\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c68391eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c683
91eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qt89w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:00Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:00 crc kubenswrapper[4773]: I0120 18:31:00.053359 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4jpbd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3791c4b7-dcef-470d-a67e-a2c0bb004436\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66lgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66lgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4jpbd\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:00Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:00 crc kubenswrapper[4773]: I0120 18:31:00.067055 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfcadb32-181d-4e84-8078-53323164fc49\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72535f55bb65c332cd4acb4ad0bf8b94a28a2167da3bba993de6ffb1d6dca1cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5be06527f116faac5a7ce7b2df804239b382a394e11c56eee69fa24739faab3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff759d4098d334dfada2f6a14e8a5d363019e0da19879b5dca11ff48b6273b24\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29039ffb258d5056b02a3ef4f48c732b8573d9437d032a225b59f0ba95527796\\\",
\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:00Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:00 crc kubenswrapper[4773]: I0120 18:31:00.079679 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:00Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:00 crc kubenswrapper[4773]: I0120 18:31:00.094517 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:00 crc kubenswrapper[4773]: I0120 18:31:00.094584 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:00 crc kubenswrapper[4773]: I0120 18:31:00.094606 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:00 crc kubenswrapper[4773]: I0120 18:31:00.094634 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:00 crc kubenswrapper[4773]: I0120 18:31:00.094655 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:00Z","lastTransitionTime":"2026-01-20T18:31:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:00 crc kubenswrapper[4773]: I0120 18:31:00.198739 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:00 crc kubenswrapper[4773]: I0120 18:31:00.198801 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:00 crc kubenswrapper[4773]: I0120 18:31:00.198818 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:00 crc kubenswrapper[4773]: I0120 18:31:00.198841 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:00 crc kubenswrapper[4773]: I0120 18:31:00.198857 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:00Z","lastTransitionTime":"2026-01-20T18:31:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:00 crc kubenswrapper[4773]: I0120 18:31:00.301525 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:00 crc kubenswrapper[4773]: I0120 18:31:00.301860 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:00 crc kubenswrapper[4773]: I0120 18:31:00.301974 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:00 crc kubenswrapper[4773]: I0120 18:31:00.302076 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:00 crc kubenswrapper[4773]: I0120 18:31:00.302170 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:00Z","lastTransitionTime":"2026-01-20T18:31:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:00 crc kubenswrapper[4773]: I0120 18:31:00.386995 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-23 20:46:12.428713536 +0000 UTC Jan 20 18:31:00 crc kubenswrapper[4773]: I0120 18:31:00.404778 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:00 crc kubenswrapper[4773]: I0120 18:31:00.404827 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:00 crc kubenswrapper[4773]: I0120 18:31:00.404843 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:00 crc kubenswrapper[4773]: I0120 18:31:00.404866 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:00 crc kubenswrapper[4773]: I0120 18:31:00.404883 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:00Z","lastTransitionTime":"2026-01-20T18:31:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:00 crc kubenswrapper[4773]: I0120 18:31:00.446915 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:31:00 crc kubenswrapper[4773]: I0120 18:31:00.446993 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:31:00 crc kubenswrapper[4773]: I0120 18:31:00.447089 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:31:00 crc kubenswrapper[4773]: E0120 18:31:00.447858 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jpbd" podUID="3791c4b7-dcef-470d-a67e-a2c0bb004436" Jan 20 18:31:00 crc kubenswrapper[4773]: E0120 18:31:00.447615 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 18:31:00 crc kubenswrapper[4773]: E0120 18:31:00.448032 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 18:31:00 crc kubenswrapper[4773]: I0120 18:31:00.507558 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:00 crc kubenswrapper[4773]: I0120 18:31:00.508009 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:00 crc kubenswrapper[4773]: I0120 18:31:00.508169 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:00 crc kubenswrapper[4773]: I0120 18:31:00.508334 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:00 crc kubenswrapper[4773]: I0120 18:31:00.508502 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:00Z","lastTransitionTime":"2026-01-20T18:31:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:00 crc kubenswrapper[4773]: I0120 18:31:00.612241 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:00 crc kubenswrapper[4773]: I0120 18:31:00.612310 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:00 crc kubenswrapper[4773]: I0120 18:31:00.612328 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:00 crc kubenswrapper[4773]: I0120 18:31:00.612353 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:00 crc kubenswrapper[4773]: I0120 18:31:00.612371 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:00Z","lastTransitionTime":"2026-01-20T18:31:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:00 crc kubenswrapper[4773]: I0120 18:31:00.715590 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:00 crc kubenswrapper[4773]: I0120 18:31:00.715674 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:00 crc kubenswrapper[4773]: I0120 18:31:00.715718 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:00 crc kubenswrapper[4773]: I0120 18:31:00.715756 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:00 crc kubenswrapper[4773]: I0120 18:31:00.715780 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:00Z","lastTransitionTime":"2026-01-20T18:31:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:00 crc kubenswrapper[4773]: I0120 18:31:00.819687 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:00 crc kubenswrapper[4773]: I0120 18:31:00.819763 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:00 crc kubenswrapper[4773]: I0120 18:31:00.819782 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:00 crc kubenswrapper[4773]: I0120 18:31:00.819812 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:00 crc kubenswrapper[4773]: I0120 18:31:00.819835 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:00Z","lastTransitionTime":"2026-01-20T18:31:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:00 crc kubenswrapper[4773]: I0120 18:31:00.923764 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:00 crc kubenswrapper[4773]: I0120 18:31:00.924543 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:00 crc kubenswrapper[4773]: I0120 18:31:00.924755 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:00 crc kubenswrapper[4773]: I0120 18:31:00.924909 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:00 crc kubenswrapper[4773]: I0120 18:31:00.925127 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:00Z","lastTransitionTime":"2026-01-20T18:31:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:01 crc kubenswrapper[4773]: I0120 18:31:01.028073 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:01 crc kubenswrapper[4773]: I0120 18:31:01.028143 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:01 crc kubenswrapper[4773]: I0120 18:31:01.028161 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:01 crc kubenswrapper[4773]: I0120 18:31:01.028186 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:01 crc kubenswrapper[4773]: I0120 18:31:01.028203 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:01Z","lastTransitionTime":"2026-01-20T18:31:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:01 crc kubenswrapper[4773]: I0120 18:31:01.132142 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:01 crc kubenswrapper[4773]: I0120 18:31:01.132205 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:01 crc kubenswrapper[4773]: I0120 18:31:01.132222 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:01 crc kubenswrapper[4773]: I0120 18:31:01.132255 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:01 crc kubenswrapper[4773]: I0120 18:31:01.132277 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:01Z","lastTransitionTime":"2026-01-20T18:31:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:01 crc kubenswrapper[4773]: I0120 18:31:01.236355 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:01 crc kubenswrapper[4773]: I0120 18:31:01.237054 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:01 crc kubenswrapper[4773]: I0120 18:31:01.237100 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:01 crc kubenswrapper[4773]: I0120 18:31:01.237123 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:01 crc kubenswrapper[4773]: I0120 18:31:01.237141 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:01Z","lastTransitionTime":"2026-01-20T18:31:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:01 crc kubenswrapper[4773]: I0120 18:31:01.340089 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:01 crc kubenswrapper[4773]: I0120 18:31:01.340144 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:01 crc kubenswrapper[4773]: I0120 18:31:01.340158 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:01 crc kubenswrapper[4773]: I0120 18:31:01.340180 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:01 crc kubenswrapper[4773]: I0120 18:31:01.340196 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:01Z","lastTransitionTime":"2026-01-20T18:31:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:01 crc kubenswrapper[4773]: I0120 18:31:01.387697 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-24 09:09:02.480745908 +0000 UTC Jan 20 18:31:01 crc kubenswrapper[4773]: I0120 18:31:01.443446 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:01 crc kubenswrapper[4773]: I0120 18:31:01.443488 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:01 crc kubenswrapper[4773]: I0120 18:31:01.443500 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:01 crc kubenswrapper[4773]: I0120 18:31:01.443516 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:01 crc kubenswrapper[4773]: I0120 18:31:01.443528 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:01Z","lastTransitionTime":"2026-01-20T18:31:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:01 crc kubenswrapper[4773]: I0120 18:31:01.445951 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:31:01 crc kubenswrapper[4773]: E0120 18:31:01.446082 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 18:31:01 crc kubenswrapper[4773]: I0120 18:31:01.546220 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:01 crc kubenswrapper[4773]: I0120 18:31:01.546274 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:01 crc kubenswrapper[4773]: I0120 18:31:01.546292 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:01 crc kubenswrapper[4773]: I0120 18:31:01.546314 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:01 crc kubenswrapper[4773]: I0120 18:31:01.546330 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:01Z","lastTransitionTime":"2026-01-20T18:31:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:01 crc kubenswrapper[4773]: I0120 18:31:01.649646 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:01 crc kubenswrapper[4773]: I0120 18:31:01.649689 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:01 crc kubenswrapper[4773]: I0120 18:31:01.649700 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:01 crc kubenswrapper[4773]: I0120 18:31:01.649716 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:01 crc kubenswrapper[4773]: I0120 18:31:01.649728 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:01Z","lastTransitionTime":"2026-01-20T18:31:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:01 crc kubenswrapper[4773]: I0120 18:31:01.753337 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:01 crc kubenswrapper[4773]: I0120 18:31:01.753394 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:01 crc kubenswrapper[4773]: I0120 18:31:01.753406 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:01 crc kubenswrapper[4773]: I0120 18:31:01.753425 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:01 crc kubenswrapper[4773]: I0120 18:31:01.753441 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:01Z","lastTransitionTime":"2026-01-20T18:31:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:01 crc kubenswrapper[4773]: I0120 18:31:01.856911 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:01 crc kubenswrapper[4773]: I0120 18:31:01.857472 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:01 crc kubenswrapper[4773]: I0120 18:31:01.857659 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:01 crc kubenswrapper[4773]: I0120 18:31:01.857814 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:01 crc kubenswrapper[4773]: I0120 18:31:01.857982 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:01Z","lastTransitionTime":"2026-01-20T18:31:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:01 crc kubenswrapper[4773]: I0120 18:31:01.961854 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:01 crc kubenswrapper[4773]: I0120 18:31:01.961976 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:01 crc kubenswrapper[4773]: I0120 18:31:01.961999 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:01 crc kubenswrapper[4773]: I0120 18:31:01.962028 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:01 crc kubenswrapper[4773]: I0120 18:31:01.962047 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:01Z","lastTransitionTime":"2026-01-20T18:31:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:02 crc kubenswrapper[4773]: I0120 18:31:02.065069 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:02 crc kubenswrapper[4773]: I0120 18:31:02.065109 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:02 crc kubenswrapper[4773]: I0120 18:31:02.065120 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:02 crc kubenswrapper[4773]: I0120 18:31:02.065160 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:02 crc kubenswrapper[4773]: I0120 18:31:02.065173 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:02Z","lastTransitionTime":"2026-01-20T18:31:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:02 crc kubenswrapper[4773]: I0120 18:31:02.169057 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:02 crc kubenswrapper[4773]: I0120 18:31:02.169126 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:02 crc kubenswrapper[4773]: I0120 18:31:02.169142 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:02 crc kubenswrapper[4773]: I0120 18:31:02.169167 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:02 crc kubenswrapper[4773]: I0120 18:31:02.169189 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:02Z","lastTransitionTime":"2026-01-20T18:31:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:02 crc kubenswrapper[4773]: I0120 18:31:02.274155 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:02 crc kubenswrapper[4773]: I0120 18:31:02.274239 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:02 crc kubenswrapper[4773]: I0120 18:31:02.274268 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:02 crc kubenswrapper[4773]: I0120 18:31:02.274304 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:02 crc kubenswrapper[4773]: I0120 18:31:02.274329 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:02Z","lastTransitionTime":"2026-01-20T18:31:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:02 crc kubenswrapper[4773]: I0120 18:31:02.378253 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:02 crc kubenswrapper[4773]: I0120 18:31:02.378314 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:02 crc kubenswrapper[4773]: I0120 18:31:02.378327 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:02 crc kubenswrapper[4773]: I0120 18:31:02.378347 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:02 crc kubenswrapper[4773]: I0120 18:31:02.378362 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:02Z","lastTransitionTime":"2026-01-20T18:31:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:02 crc kubenswrapper[4773]: I0120 18:31:02.388150 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-13 01:21:41.316565099 +0000 UTC Jan 20 18:31:02 crc kubenswrapper[4773]: I0120 18:31:02.447340 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:31:02 crc kubenswrapper[4773]: I0120 18:31:02.447409 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:31:02 crc kubenswrapper[4773]: E0120 18:31:02.447501 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jpbd" podUID="3791c4b7-dcef-470d-a67e-a2c0bb004436" Jan 20 18:31:02 crc kubenswrapper[4773]: I0120 18:31:02.447335 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:31:02 crc kubenswrapper[4773]: E0120 18:31:02.447686 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 18:31:02 crc kubenswrapper[4773]: E0120 18:31:02.447754 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 18:31:02 crc kubenswrapper[4773]: I0120 18:31:02.482148 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:02 crc kubenswrapper[4773]: I0120 18:31:02.482226 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:02 crc kubenswrapper[4773]: I0120 18:31:02.482248 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:02 crc kubenswrapper[4773]: I0120 18:31:02.482276 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:02 crc kubenswrapper[4773]: I0120 18:31:02.482298 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:02Z","lastTransitionTime":"2026-01-20T18:31:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:02 crc kubenswrapper[4773]: I0120 18:31:02.586236 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:02 crc kubenswrapper[4773]: I0120 18:31:02.586320 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:02 crc kubenswrapper[4773]: I0120 18:31:02.586334 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:02 crc kubenswrapper[4773]: I0120 18:31:02.586354 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:02 crc kubenswrapper[4773]: I0120 18:31:02.586368 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:02Z","lastTransitionTime":"2026-01-20T18:31:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:02 crc kubenswrapper[4773]: I0120 18:31:02.689840 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:02 crc kubenswrapper[4773]: I0120 18:31:02.689901 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:02 crc kubenswrapper[4773]: I0120 18:31:02.689985 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:02 crc kubenswrapper[4773]: I0120 18:31:02.690025 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:02 crc kubenswrapper[4773]: I0120 18:31:02.690048 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:02Z","lastTransitionTime":"2026-01-20T18:31:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:02 crc kubenswrapper[4773]: I0120 18:31:02.793112 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:02 crc kubenswrapper[4773]: I0120 18:31:02.793168 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:02 crc kubenswrapper[4773]: I0120 18:31:02.793178 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:02 crc kubenswrapper[4773]: I0120 18:31:02.793193 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:02 crc kubenswrapper[4773]: I0120 18:31:02.793204 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:02Z","lastTransitionTime":"2026-01-20T18:31:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:02 crc kubenswrapper[4773]: I0120 18:31:02.896455 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:02 crc kubenswrapper[4773]: I0120 18:31:02.896510 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:02 crc kubenswrapper[4773]: I0120 18:31:02.896529 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:02 crc kubenswrapper[4773]: I0120 18:31:02.896556 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:02 crc kubenswrapper[4773]: I0120 18:31:02.896577 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:02Z","lastTransitionTime":"2026-01-20T18:31:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.000227 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.000286 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.000304 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.000325 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.000339 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:03Z","lastTransitionTime":"2026-01-20T18:31:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.103605 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.103643 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.103672 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.103686 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.103696 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:03Z","lastTransitionTime":"2026-01-20T18:31:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.206711 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.206767 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.206810 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.206834 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.206851 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:03Z","lastTransitionTime":"2026-01-20T18:31:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.309626 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.309671 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.309682 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.309703 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.309716 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:03Z","lastTransitionTime":"2026-01-20T18:31:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.388495 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-11 14:20:50.957193147 +0000 UTC Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.412986 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.413057 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.413076 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.413105 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.413126 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:03Z","lastTransitionTime":"2026-01-20T18:31:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.446873 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:31:03 crc kubenswrapper[4773]: E0120 18:31:03.447183 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.516700 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.516805 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.516830 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.516863 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.516887 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:03Z","lastTransitionTime":"2026-01-20T18:31:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.620414 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.620453 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.620463 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.620478 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.620488 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:03Z","lastTransitionTime":"2026-01-20T18:31:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.661357 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.661450 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.661607 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.661657 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.661680 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:03Z","lastTransitionTime":"2026-01-20T18:31:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:03 crc kubenswrapper[4773]: E0120 18:31:03.680293 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ac96020-64a6-43b4-8bf4-975de5898510\\\",\\\"systemUUID\\\":\\\"3435f284-a40d-4f32-a1fa-55cd3339f30e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:03Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.685589 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.685639 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.685651 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.685676 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.685692 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:03Z","lastTransitionTime":"2026-01-20T18:31:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:03 crc kubenswrapper[4773]: E0120 18:31:03.702044 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ac96020-64a6-43b4-8bf4-975de5898510\\\",\\\"systemUUID\\\":\\\"3435f284-a40d-4f32-a1fa-55cd3339f30e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:03Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.706451 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.706732 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.706865 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.707034 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.707170 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:03Z","lastTransitionTime":"2026-01-20T18:31:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:03 crc kubenswrapper[4773]: E0120 18:31:03.728596 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ac96020-64a6-43b4-8bf4-975de5898510\\\",\\\"systemUUID\\\":\\\"3435f284-a40d-4f32-a1fa-55cd3339f30e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:03Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.734191 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.734450 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.734612 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.734789 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.734992 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:03Z","lastTransitionTime":"2026-01-20T18:31:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:03 crc kubenswrapper[4773]: E0120 18:31:03.756916 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ac96020-64a6-43b4-8bf4-975de5898510\\\",\\\"systemUUID\\\":\\\"3435f284-a40d-4f32-a1fa-55cd3339f30e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:03Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.762991 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.763189 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.763333 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.763468 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.763592 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:03Z","lastTransitionTime":"2026-01-20T18:31:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:03 crc kubenswrapper[4773]: E0120 18:31:03.786440 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:03Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:03Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:03Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:03Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ac96020-64a6-43b4-8bf4-975de5898510\\\",\\\"systemUUID\\\":\\\"3435f284-a40d-4f32-a1fa-55cd3339f30e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:03Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:03 crc kubenswrapper[4773]: E0120 18:31:03.787279 4773 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.789877 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.789970 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.789986 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.790007 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.790022 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:03Z","lastTransitionTime":"2026-01-20T18:31:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.893632 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.893704 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.893715 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.893750 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.893766 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:03Z","lastTransitionTime":"2026-01-20T18:31:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.996773 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.996844 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.996863 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.996892 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:03 crc kubenswrapper[4773]: I0120 18:31:03.996912 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:03Z","lastTransitionTime":"2026-01-20T18:31:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:04 crc kubenswrapper[4773]: I0120 18:31:04.099493 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:04 crc kubenswrapper[4773]: I0120 18:31:04.099543 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:04 crc kubenswrapper[4773]: I0120 18:31:04.099551 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:04 crc kubenswrapper[4773]: I0120 18:31:04.099566 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:04 crc kubenswrapper[4773]: I0120 18:31:04.099575 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:04Z","lastTransitionTime":"2026-01-20T18:31:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:04 crc kubenswrapper[4773]: I0120 18:31:04.203401 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:04 crc kubenswrapper[4773]: I0120 18:31:04.203474 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:04 crc kubenswrapper[4773]: I0120 18:31:04.203493 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:04 crc kubenswrapper[4773]: I0120 18:31:04.203521 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:04 crc kubenswrapper[4773]: I0120 18:31:04.203540 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:04Z","lastTransitionTime":"2026-01-20T18:31:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:04 crc kubenswrapper[4773]: I0120 18:31:04.307500 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:04 crc kubenswrapper[4773]: I0120 18:31:04.307566 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:04 crc kubenswrapper[4773]: I0120 18:31:04.307585 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:04 crc kubenswrapper[4773]: I0120 18:31:04.307611 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:04 crc kubenswrapper[4773]: I0120 18:31:04.307630 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:04Z","lastTransitionTime":"2026-01-20T18:31:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:04 crc kubenswrapper[4773]: I0120 18:31:04.389569 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-12 06:19:07.282423754 +0000 UTC Jan 20 18:31:04 crc kubenswrapper[4773]: I0120 18:31:04.412074 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:04 crc kubenswrapper[4773]: I0120 18:31:04.412380 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:04 crc kubenswrapper[4773]: I0120 18:31:04.412579 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:04 crc kubenswrapper[4773]: I0120 18:31:04.412776 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:04 crc kubenswrapper[4773]: I0120 18:31:04.412917 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:04Z","lastTransitionTime":"2026-01-20T18:31:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:04 crc kubenswrapper[4773]: I0120 18:31:04.446264 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:31:04 crc kubenswrapper[4773]: I0120 18:31:04.446596 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:31:04 crc kubenswrapper[4773]: E0120 18:31:04.446762 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jpbd" podUID="3791c4b7-dcef-470d-a67e-a2c0bb004436" Jan 20 18:31:04 crc kubenswrapper[4773]: I0120 18:31:04.446887 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:31:04 crc kubenswrapper[4773]: E0120 18:31:04.447207 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 18:31:04 crc kubenswrapper[4773]: E0120 18:31:04.446975 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 18:31:04 crc kubenswrapper[4773]: I0120 18:31:04.517107 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:04 crc kubenswrapper[4773]: I0120 18:31:04.517198 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:04 crc kubenswrapper[4773]: I0120 18:31:04.517245 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:04 crc kubenswrapper[4773]: I0120 18:31:04.517274 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:04 crc kubenswrapper[4773]: I0120 18:31:04.517328 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:04Z","lastTransitionTime":"2026-01-20T18:31:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:04 crc kubenswrapper[4773]: I0120 18:31:04.620484 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:04 crc kubenswrapper[4773]: I0120 18:31:04.621086 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:04 crc kubenswrapper[4773]: I0120 18:31:04.621241 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:04 crc kubenswrapper[4773]: I0120 18:31:04.621383 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:04 crc kubenswrapper[4773]: I0120 18:31:04.621732 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:04Z","lastTransitionTime":"2026-01-20T18:31:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:04 crc kubenswrapper[4773]: I0120 18:31:04.724015 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:04 crc kubenswrapper[4773]: I0120 18:31:04.724100 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:04 crc kubenswrapper[4773]: I0120 18:31:04.724168 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:04 crc kubenswrapper[4773]: I0120 18:31:04.724220 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:04 crc kubenswrapper[4773]: I0120 18:31:04.724235 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:04Z","lastTransitionTime":"2026-01-20T18:31:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:04 crc kubenswrapper[4773]: I0120 18:31:04.828009 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:04 crc kubenswrapper[4773]: I0120 18:31:04.828136 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:04 crc kubenswrapper[4773]: I0120 18:31:04.828156 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:04 crc kubenswrapper[4773]: I0120 18:31:04.828182 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:04 crc kubenswrapper[4773]: I0120 18:31:04.828199 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:04Z","lastTransitionTime":"2026-01-20T18:31:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:04 crc kubenswrapper[4773]: I0120 18:31:04.931054 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:04 crc kubenswrapper[4773]: I0120 18:31:04.931118 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:04 crc kubenswrapper[4773]: I0120 18:31:04.931132 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:04 crc kubenswrapper[4773]: I0120 18:31:04.931155 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:04 crc kubenswrapper[4773]: I0120 18:31:04.931170 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:04Z","lastTransitionTime":"2026-01-20T18:31:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:05 crc kubenswrapper[4773]: I0120 18:31:05.034123 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:05 crc kubenswrapper[4773]: I0120 18:31:05.034194 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:05 crc kubenswrapper[4773]: I0120 18:31:05.034215 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:05 crc kubenswrapper[4773]: I0120 18:31:05.034276 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:05 crc kubenswrapper[4773]: I0120 18:31:05.034296 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:05Z","lastTransitionTime":"2026-01-20T18:31:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:05 crc kubenswrapper[4773]: I0120 18:31:05.138039 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:05 crc kubenswrapper[4773]: I0120 18:31:05.138464 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:05 crc kubenswrapper[4773]: I0120 18:31:05.138538 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:05 crc kubenswrapper[4773]: I0120 18:31:05.138665 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:05 crc kubenswrapper[4773]: I0120 18:31:05.138777 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:05Z","lastTransitionTime":"2026-01-20T18:31:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:05 crc kubenswrapper[4773]: I0120 18:31:05.241592 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:05 crc kubenswrapper[4773]: I0120 18:31:05.241687 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:05 crc kubenswrapper[4773]: I0120 18:31:05.241709 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:05 crc kubenswrapper[4773]: I0120 18:31:05.241739 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:05 crc kubenswrapper[4773]: I0120 18:31:05.241759 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:05Z","lastTransitionTime":"2026-01-20T18:31:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:05 crc kubenswrapper[4773]: I0120 18:31:05.345230 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:05 crc kubenswrapper[4773]: I0120 18:31:05.345324 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:05 crc kubenswrapper[4773]: I0120 18:31:05.345353 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:05 crc kubenswrapper[4773]: I0120 18:31:05.345389 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:05 crc kubenswrapper[4773]: I0120 18:31:05.345413 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:05Z","lastTransitionTime":"2026-01-20T18:31:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:05 crc kubenswrapper[4773]: I0120 18:31:05.390550 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 12:28:58.994119857 +0000 UTC Jan 20 18:31:05 crc kubenswrapper[4773]: I0120 18:31:05.446369 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:31:05 crc kubenswrapper[4773]: E0120 18:31:05.446784 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 18:31:05 crc kubenswrapper[4773]: I0120 18:31:05.448160 4773 scope.go:117] "RemoveContainer" containerID="9358a6bba6c6f6bb98b1a268d22d96575da32a1a73003a6097a7bf1a9db92c0d" Jan 20 18:31:05 crc kubenswrapper[4773]: E0120 18:31:05.448625 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-qt89w_openshift-ovn-kubernetes(f354424d-7f22-42d6-8bd9-00e32e78c3d3)\"" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" podUID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" Jan 20 18:31:05 crc kubenswrapper[4773]: I0120 18:31:05.450550 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:05 crc kubenswrapper[4773]: I0120 18:31:05.450611 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:05 crc kubenswrapper[4773]: I0120 18:31:05.450637 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:05 crc kubenswrapper[4773]: I0120 18:31:05.450666 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:05 crc kubenswrapper[4773]: I0120 18:31:05.450688 4773 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:05Z","lastTransitionTime":"2026-01-20T18:31:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:05 crc kubenswrapper[4773]: I0120 18:31:05.554811 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:05 crc kubenswrapper[4773]: I0120 18:31:05.554902 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:05 crc kubenswrapper[4773]: I0120 18:31:05.554972 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:05 crc kubenswrapper[4773]: I0120 18:31:05.555052 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:05 crc kubenswrapper[4773]: I0120 18:31:05.555082 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:05Z","lastTransitionTime":"2026-01-20T18:31:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:05 crc kubenswrapper[4773]: I0120 18:31:05.659490 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:05 crc kubenswrapper[4773]: I0120 18:31:05.659583 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:05 crc kubenswrapper[4773]: I0120 18:31:05.659610 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:05 crc kubenswrapper[4773]: I0120 18:31:05.659646 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:05 crc kubenswrapper[4773]: I0120 18:31:05.659673 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:05Z","lastTransitionTime":"2026-01-20T18:31:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:05 crc kubenswrapper[4773]: I0120 18:31:05.763828 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:05 crc kubenswrapper[4773]: I0120 18:31:05.763903 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:05 crc kubenswrapper[4773]: I0120 18:31:05.763922 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:05 crc kubenswrapper[4773]: I0120 18:31:05.763985 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:05 crc kubenswrapper[4773]: I0120 18:31:05.764007 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:05Z","lastTransitionTime":"2026-01-20T18:31:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:05 crc kubenswrapper[4773]: I0120 18:31:05.867871 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:05 crc kubenswrapper[4773]: I0120 18:31:05.867982 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:05 crc kubenswrapper[4773]: I0120 18:31:05.868000 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:05 crc kubenswrapper[4773]: I0120 18:31:05.868032 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:05 crc kubenswrapper[4773]: I0120 18:31:05.868049 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:05Z","lastTransitionTime":"2026-01-20T18:31:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:05 crc kubenswrapper[4773]: I0120 18:31:05.972599 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:05 crc kubenswrapper[4773]: I0120 18:31:05.972664 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:05 crc kubenswrapper[4773]: I0120 18:31:05.972682 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:05 crc kubenswrapper[4773]: I0120 18:31:05.972711 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:05 crc kubenswrapper[4773]: I0120 18:31:05.972732 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:05Z","lastTransitionTime":"2026-01-20T18:31:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:06 crc kubenswrapper[4773]: I0120 18:31:06.075504 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:06 crc kubenswrapper[4773]: I0120 18:31:06.075571 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:06 crc kubenswrapper[4773]: I0120 18:31:06.075593 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:06 crc kubenswrapper[4773]: I0120 18:31:06.075615 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:06 crc kubenswrapper[4773]: I0120 18:31:06.075633 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:06Z","lastTransitionTime":"2026-01-20T18:31:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:06 crc kubenswrapper[4773]: I0120 18:31:06.179389 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:06 crc kubenswrapper[4773]: I0120 18:31:06.179439 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:06 crc kubenswrapper[4773]: I0120 18:31:06.179453 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:06 crc kubenswrapper[4773]: I0120 18:31:06.179481 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:06 crc kubenswrapper[4773]: I0120 18:31:06.179499 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:06Z","lastTransitionTime":"2026-01-20T18:31:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:06 crc kubenswrapper[4773]: I0120 18:31:06.283009 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:06 crc kubenswrapper[4773]: I0120 18:31:06.283343 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:06 crc kubenswrapper[4773]: I0120 18:31:06.283429 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:06 crc kubenswrapper[4773]: I0120 18:31:06.283512 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:06 crc kubenswrapper[4773]: I0120 18:31:06.283574 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:06Z","lastTransitionTime":"2026-01-20T18:31:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:06 crc kubenswrapper[4773]: I0120 18:31:06.386384 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:06 crc kubenswrapper[4773]: I0120 18:31:06.387016 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:06 crc kubenswrapper[4773]: I0120 18:31:06.387247 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:06 crc kubenswrapper[4773]: I0120 18:31:06.387478 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:06 crc kubenswrapper[4773]: I0120 18:31:06.387680 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:06Z","lastTransitionTime":"2026-01-20T18:31:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:06 crc kubenswrapper[4773]: I0120 18:31:06.391533 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-14 03:17:33.01024901 +0000 UTC Jan 20 18:31:06 crc kubenswrapper[4773]: I0120 18:31:06.446552 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:31:06 crc kubenswrapper[4773]: I0120 18:31:06.446552 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:31:06 crc kubenswrapper[4773]: I0120 18:31:06.446590 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:31:06 crc kubenswrapper[4773]: E0120 18:31:06.447469 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 18:31:06 crc kubenswrapper[4773]: E0120 18:31:06.447631 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 18:31:06 crc kubenswrapper[4773]: E0120 18:31:06.447926 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4jpbd" podUID="3791c4b7-dcef-470d-a67e-a2c0bb004436" Jan 20 18:31:06 crc kubenswrapper[4773]: I0120 18:31:06.492101 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:06 crc kubenswrapper[4773]: I0120 18:31:06.492200 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:06 crc kubenswrapper[4773]: I0120 18:31:06.492218 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:06 crc kubenswrapper[4773]: I0120 18:31:06.492247 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:06 crc kubenswrapper[4773]: I0120 18:31:06.492266 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:06Z","lastTransitionTime":"2026-01-20T18:31:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:06 crc kubenswrapper[4773]: I0120 18:31:06.597185 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:06 crc kubenswrapper[4773]: I0120 18:31:06.597263 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:06 crc kubenswrapper[4773]: I0120 18:31:06.597286 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:06 crc kubenswrapper[4773]: I0120 18:31:06.597318 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:06 crc kubenswrapper[4773]: I0120 18:31:06.597337 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:06Z","lastTransitionTime":"2026-01-20T18:31:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:06 crc kubenswrapper[4773]: I0120 18:31:06.701382 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:06 crc kubenswrapper[4773]: I0120 18:31:06.701763 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:06 crc kubenswrapper[4773]: I0120 18:31:06.701834 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:06 crc kubenswrapper[4773]: I0120 18:31:06.701946 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:06 crc kubenswrapper[4773]: I0120 18:31:06.702023 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:06Z","lastTransitionTime":"2026-01-20T18:31:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:06 crc kubenswrapper[4773]: I0120 18:31:06.805715 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:06 crc kubenswrapper[4773]: I0120 18:31:06.806126 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:06 crc kubenswrapper[4773]: I0120 18:31:06.806215 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:06 crc kubenswrapper[4773]: I0120 18:31:06.806309 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:06 crc kubenswrapper[4773]: I0120 18:31:06.806381 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:06Z","lastTransitionTime":"2026-01-20T18:31:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:06 crc kubenswrapper[4773]: I0120 18:31:06.909632 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:06 crc kubenswrapper[4773]: I0120 18:31:06.909705 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:06 crc kubenswrapper[4773]: I0120 18:31:06.909723 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:06 crc kubenswrapper[4773]: I0120 18:31:06.909750 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:06 crc kubenswrapper[4773]: I0120 18:31:06.909804 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:06Z","lastTransitionTime":"2026-01-20T18:31:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.013326 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.013415 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.013439 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.013472 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.013497 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:07Z","lastTransitionTime":"2026-01-20T18:31:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.117405 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.117471 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.117490 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.117523 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.117547 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:07Z","lastTransitionTime":"2026-01-20T18:31:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.221603 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.221692 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.221719 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.221755 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.221781 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:07Z","lastTransitionTime":"2026-01-20T18:31:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.325340 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.325457 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.325477 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.325509 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.325530 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:07Z","lastTransitionTime":"2026-01-20T18:31:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.391652 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-15 05:05:58.576393588 +0000 UTC Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.428539 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.428987 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.429245 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.429432 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.429652 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:07Z","lastTransitionTime":"2026-01-20T18:31:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.446283 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:31:07 crc kubenswrapper[4773]: E0120 18:31:07.446539 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.472155 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de86ad0ce5469489a5f3eac4940913be3a897c3ec5f48526c7396b21bca92fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state
\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:07Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.494682 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965d3f61a7d29b4a851d13ea906f7d4d1809f89d314cd0e9b08ba897108625f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48a81307df70a49c4cf901ede80a07616e64babebf760b7fb3f51fc71743812a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:07Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.511706 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ddd934f-f012-4083-b5e6-b99711071621\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63daf0eeb8fcb1c4b47e205c8ea4ca5071153c573cce6d1f4638e0d44a96d47e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e8755cb7894ae2c1832f3b0b8d8a6e33d3d336d
00ce68a2afe004d82304dd24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sq4x7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:07Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.533995 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.534058 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.534077 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:07 crc 
kubenswrapper[4773]: I0120 18:31:07.534110 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.534132 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:07Z","lastTransitionTime":"2026-01-20T18:31:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.534714 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f354424d-7f22-42d6-8bd9-00e32e78c3d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a000cfa332372a402d4d81eafc6f06b988196a3a847236a94d8166f0d745b07e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d36ecbc289d1bc2ebcba42b6cfe938692dd9b7e2b4e1adac18ac86a273d6bf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeac2ae7ead2dd1144c00709ea7162dfe1e360dc304726b031970bc989f11467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c6fa189877752f110d0f9539d08a9693872f9ee0350c63384acae1dea6afbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7767620b251e4c63c9397573c7915bd85831b833817ac6b939fc47d6304dc2bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc96ff67e107830b7e2fe8c3c7460ca9c7b22dfadab07ea69968162e9a1b4455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9358a6bba6c6f6bb98b1a268d22d96575da32a1a73003a6097a7bf1a9db92c0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9358a6bba6c6f6bb98b1a268d22d96575da32a1a73003a6097a7bf1a9db92c0d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-20T18:30:51Z\\\",\\\"message\\\":\\\"mns:[] Mutations:[{Column:policies Mutator:insert Value:{GoSet:[{GoUUID:a5a72d02-1a0f-4f7f-a8c5-6923a1c4274a}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0120 
18:30:51.352647 6450 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI0120 18:30:51.352764 6450 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI0120 18:30:51.352852 6450 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI0120 18:30:51.353085 6450 factory.go:1336] Added *v1.Node event handler 7\\\\nI0120 18:30:51.353196 6450 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0120 18:30:51.353707 6450 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0120 18:30:51.353849 6450 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0120 18:30:51.353898 6450 ovnkube.go:599] Stopped ovnkube\\\\nI0120 18:30:51.353933 6450 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0120 18:30:51.354047 6450 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:50Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qt89w_openshift-ovn-kubernetes(f354424d-7f22-42d6-8bd9-00e32e78c3d3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68d98d68d006df3d69e253299966bef755778e0cdab71a9ff91d2829ae761672\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c68391eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c683
91eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qt89w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:07Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.550542 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4jpbd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3791c4b7-dcef-470d-a67e-a2c0bb004436\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66lgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66lgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4jpbd\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:07Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.567965 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfcadb32-181d-4e84-8078-53323164fc49\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72535f55bb65c332cd4acb4ad0bf8b94a28a2167da3bba993de6ffb1d6dca1cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5be06527f116faac5a7ce7b2df804239b382a394e11c56eee69fa24739faab3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff759d4098d334dfada2f6a14e8a5d363019e0da19879b5dca11ff48b6273b24\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29039ffb258d5056b02a3ef4f48c732b8573d9437d032a225b59f0ba95527796\\\",
\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:07Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.584521 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:07Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.601332 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9b329af-a6e2-4ba8-b70d-f1ad0cd67671\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13ef504628d3252d01c26801cf91a4fabdfbd5d8a78e60e9666beafa709816a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af183a195b31fc8b191dc75db28863447424fb2763e1f51a5f37ce78c4b046e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05f5db3e67f63a4ed226186eec7eebf95687a59c3d742a7d29b2be3f99543a0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a1628ccb969f4cdde3babedb802c79d7bba385260dcefc8140ed674fb423da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab6fab567cc35a379572c4ea5fb0f9472f3634fcede213202390e12c5cf6f2f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-20T18:30:26Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0120 18:30:20.840178 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0120 18:30:20.842359 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1141381009/tls.crt::/tmp/serving-cert-1141381009/tls.key\\\\\\\"\\\\nI0120 18:30:26.279650 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0120 18:30:26.285211 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0120 18:30:26.285271 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0120 18:30:26.285327 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0120 18:30:26.285335 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0120 18:30:26.291960 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0120 18:30:26.291983 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291990 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291998 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0120 18:30:26.292002 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0120 18:30:26.292006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0120 18:30:26.292010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0120 18:30:26.292142 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0120 18:30:26.296450 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06ac78bba5a5a697d2246518bb04b7503b37c2fafd2369e5826dd02b467715d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727
ee9876662924585010f5fa93ee1d554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:07Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.617790 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:07Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.639010 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gczfj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"357ca347-8fa9-4f0b-9f49-a540f14e0198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48c821d7f174268680f0e1197457126c88076eded93bbcddb5a0d60b1fa9fc80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4b9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gczfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:07Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.645593 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.645665 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.645683 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.645712 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.645731 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:07Z","lastTransitionTime":"2026-01-20T18:31:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.659740 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5sv79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a565d2f-43a1-41f5-b7a6-85d7d0aea0a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4eb26301ab8154925399887204867f95a36003c308ac49adb8a2867ae29ac39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dls8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5sv79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:07Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.682161 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gbn6k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7821f5e-4734-489f-bcf9-910b875a4848\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f76276c73
de548ee85dd07ca424e1f6df2c7da35e7e57880473fc1175983fb37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcp99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f830c4eb2fd242a6e650833aeafd5935d16e005b2faa950f6507f656293ca458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcp99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gbn6k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:07Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.696878 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"76ff7d92-1dd2-45e7-9abb-7dd442f7b958\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b665d0df790466d0543796901ca4d72cfb93cbaa3c6f751cd7283280636e2a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b8695b8d17ea9a4d8e29018b9be5f70748b76e671e9ce50ce1e4f100f5e370f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://344d1e5f1fbcdf841c83e49f3932e95f086a05522139238c2743cf27c78bb77e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\
\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cadc50138b9abc5f9f038da3a5ba2f689019c713027f9cd589b28cee9d3dcfe4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cadc50138b9abc5f9f038da3a5ba2f689019c713027f9cd589b28cee9d3dcfe4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:07Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.718619 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee8e82f1-ac2b-4e32-ace8-c3e6edfb55b4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17efc7a160f4cb934a3927c68e65b9f1387ed13881b8c9ca676d30c0d241ace0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b89a8e0fd81a1ccb6675b46b9a7425fec5158a53cc7e4314068fa737d0becda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f63cd12ee6d399e1a6f77a55f05a532a2de3b4cd05a5e10a93f521364473a403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d78e43bc7b09a1599f98e9e72fd0f49db94102e952a391f687d50abed277ceda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de08872865ebda9c9a4e1642d9c4d4f6c709908332ffbd091f992fad7722eca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:07Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.734367 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:07Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.748898 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.749375 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.749388 4773 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.749406 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.749420 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:07Z","lastTransitionTime":"2026-01-20T18:31:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.750149 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://443153c40f503abca43f8d62276e852dfc67a4ad50beb817bcc4e7c9f1893d4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f
799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:07Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.765759 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kjbfj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ddd5104-3112-413e-b908-2b7f336b41f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a5c1f8407893fc5542044f2a612cb402c4529818efa2a76e03ecc09dafd07d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b981
d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b981d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53e0dfb1a04051b30484f98897d86ff5adb9ef680437d01e43cd37b4a2404bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53e0dfb1a04051b30484f98897d86ff5adb9ef680437d01e43cd37b4a2404bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42063e8c09492c5670e90e5ced6405536f594cb5ae490cdacb00320e735d157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42063e8c09492c5670e90e5ced6405536f594cb5ae490cdacb00320e735d157\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kjbfj\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:07Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.777960 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bccxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"061a607e-1868-4fcf-b3ea-d51157511d41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5980e4a9cdd846b335d853a6c7c4fa7c9862c37c4858418919eae03855d094f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-
01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwtw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bccxn\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:07Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.851619 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.851661 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.851671 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.851685 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.851694 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:07Z","lastTransitionTime":"2026-01-20T18:31:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.953821 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.953859 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.953867 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.953878 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:07 crc kubenswrapper[4773]: I0120 18:31:07.953887 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:07Z","lastTransitionTime":"2026-01-20T18:31:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:08 crc kubenswrapper[4773]: I0120 18:31:08.056509 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:08 crc kubenswrapper[4773]: I0120 18:31:08.056556 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:08 crc kubenswrapper[4773]: I0120 18:31:08.056568 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:08 crc kubenswrapper[4773]: I0120 18:31:08.056583 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:08 crc kubenswrapper[4773]: I0120 18:31:08.056593 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:08Z","lastTransitionTime":"2026-01-20T18:31:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:08 crc kubenswrapper[4773]: I0120 18:31:08.159382 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:08 crc kubenswrapper[4773]: I0120 18:31:08.159457 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:08 crc kubenswrapper[4773]: I0120 18:31:08.159485 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:08 crc kubenswrapper[4773]: I0120 18:31:08.159519 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:08 crc kubenswrapper[4773]: I0120 18:31:08.159547 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:08Z","lastTransitionTime":"2026-01-20T18:31:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:08 crc kubenswrapper[4773]: I0120 18:31:08.262192 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:08 crc kubenswrapper[4773]: I0120 18:31:08.262292 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:08 crc kubenswrapper[4773]: I0120 18:31:08.262312 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:08 crc kubenswrapper[4773]: I0120 18:31:08.262342 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:08 crc kubenswrapper[4773]: I0120 18:31:08.262362 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:08Z","lastTransitionTime":"2026-01-20T18:31:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:08 crc kubenswrapper[4773]: I0120 18:31:08.364833 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:08 crc kubenswrapper[4773]: I0120 18:31:08.364899 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:08 crc kubenswrapper[4773]: I0120 18:31:08.364910 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:08 crc kubenswrapper[4773]: I0120 18:31:08.364951 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:08 crc kubenswrapper[4773]: I0120 18:31:08.364970 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:08Z","lastTransitionTime":"2026-01-20T18:31:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:08 crc kubenswrapper[4773]: I0120 18:31:08.392209 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-27 15:31:14.940068406 +0000 UTC Jan 20 18:31:08 crc kubenswrapper[4773]: I0120 18:31:08.446263 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:31:08 crc kubenswrapper[4773]: I0120 18:31:08.446322 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:31:08 crc kubenswrapper[4773]: E0120 18:31:08.446430 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jpbd" podUID="3791c4b7-dcef-470d-a67e-a2c0bb004436" Jan 20 18:31:08 crc kubenswrapper[4773]: I0120 18:31:08.446455 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:31:08 crc kubenswrapper[4773]: E0120 18:31:08.446600 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 18:31:08 crc kubenswrapper[4773]: E0120 18:31:08.446730 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 18:31:08 crc kubenswrapper[4773]: I0120 18:31:08.468082 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:08 crc kubenswrapper[4773]: I0120 18:31:08.468155 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:08 crc kubenswrapper[4773]: I0120 18:31:08.468165 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:08 crc kubenswrapper[4773]: I0120 18:31:08.468183 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:08 crc kubenswrapper[4773]: I0120 18:31:08.468193 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:08Z","lastTransitionTime":"2026-01-20T18:31:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:08 crc kubenswrapper[4773]: I0120 18:31:08.571176 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:08 crc kubenswrapper[4773]: I0120 18:31:08.571210 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:08 crc kubenswrapper[4773]: I0120 18:31:08.571219 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:08 crc kubenswrapper[4773]: I0120 18:31:08.571237 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:08 crc kubenswrapper[4773]: I0120 18:31:08.571249 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:08Z","lastTransitionTime":"2026-01-20T18:31:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:08 crc kubenswrapper[4773]: I0120 18:31:08.674213 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:08 crc kubenswrapper[4773]: I0120 18:31:08.674267 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:08 crc kubenswrapper[4773]: I0120 18:31:08.674283 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:08 crc kubenswrapper[4773]: I0120 18:31:08.674301 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:08 crc kubenswrapper[4773]: I0120 18:31:08.674315 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:08Z","lastTransitionTime":"2026-01-20T18:31:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:08 crc kubenswrapper[4773]: I0120 18:31:08.777503 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:08 crc kubenswrapper[4773]: I0120 18:31:08.777553 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:08 crc kubenswrapper[4773]: I0120 18:31:08.777563 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:08 crc kubenswrapper[4773]: I0120 18:31:08.777580 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:08 crc kubenswrapper[4773]: I0120 18:31:08.777591 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:08Z","lastTransitionTime":"2026-01-20T18:31:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:08 crc kubenswrapper[4773]: I0120 18:31:08.880427 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:08 crc kubenswrapper[4773]: I0120 18:31:08.880492 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:08 crc kubenswrapper[4773]: I0120 18:31:08.880505 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:08 crc kubenswrapper[4773]: I0120 18:31:08.880529 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:08 crc kubenswrapper[4773]: I0120 18:31:08.880547 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:08Z","lastTransitionTime":"2026-01-20T18:31:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:08 crc kubenswrapper[4773]: I0120 18:31:08.983921 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:08 crc kubenswrapper[4773]: I0120 18:31:08.983979 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:08 crc kubenswrapper[4773]: I0120 18:31:08.983990 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:08 crc kubenswrapper[4773]: I0120 18:31:08.984005 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:08 crc kubenswrapper[4773]: I0120 18:31:08.984016 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:08Z","lastTransitionTime":"2026-01-20T18:31:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:09 crc kubenswrapper[4773]: I0120 18:31:09.087475 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:09 crc kubenswrapper[4773]: I0120 18:31:09.087506 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:09 crc kubenswrapper[4773]: I0120 18:31:09.087515 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:09 crc kubenswrapper[4773]: I0120 18:31:09.087528 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:09 crc kubenswrapper[4773]: I0120 18:31:09.087537 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:09Z","lastTransitionTime":"2026-01-20T18:31:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:09 crc kubenswrapper[4773]: I0120 18:31:09.191189 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:09 crc kubenswrapper[4773]: I0120 18:31:09.191261 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:09 crc kubenswrapper[4773]: I0120 18:31:09.191276 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:09 crc kubenswrapper[4773]: I0120 18:31:09.191300 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:09 crc kubenswrapper[4773]: I0120 18:31:09.191318 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:09Z","lastTransitionTime":"2026-01-20T18:31:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:09 crc kubenswrapper[4773]: I0120 18:31:09.295100 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:09 crc kubenswrapper[4773]: I0120 18:31:09.295185 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:09 crc kubenswrapper[4773]: I0120 18:31:09.295245 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:09 crc kubenswrapper[4773]: I0120 18:31:09.295267 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:09 crc kubenswrapper[4773]: I0120 18:31:09.295282 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:09Z","lastTransitionTime":"2026-01-20T18:31:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:09 crc kubenswrapper[4773]: I0120 18:31:09.392677 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-14 05:40:33.781169559 +0000 UTC Jan 20 18:31:09 crc kubenswrapper[4773]: I0120 18:31:09.398226 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:09 crc kubenswrapper[4773]: I0120 18:31:09.398265 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:09 crc kubenswrapper[4773]: I0120 18:31:09.398274 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:09 crc kubenswrapper[4773]: I0120 18:31:09.398291 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:09 crc kubenswrapper[4773]: I0120 18:31:09.398303 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:09Z","lastTransitionTime":"2026-01-20T18:31:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:09 crc kubenswrapper[4773]: I0120 18:31:09.448578 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:31:09 crc kubenswrapper[4773]: E0120 18:31:09.449023 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 18:31:09 crc kubenswrapper[4773]: I0120 18:31:09.500474 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:09 crc kubenswrapper[4773]: I0120 18:31:09.500511 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:09 crc kubenswrapper[4773]: I0120 18:31:09.500519 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:09 crc kubenswrapper[4773]: I0120 18:31:09.500531 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:09 crc kubenswrapper[4773]: I0120 18:31:09.500539 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:09Z","lastTransitionTime":"2026-01-20T18:31:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:09 crc kubenswrapper[4773]: I0120 18:31:09.602998 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:09 crc kubenswrapper[4773]: I0120 18:31:09.603029 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:09 crc kubenswrapper[4773]: I0120 18:31:09.603036 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:09 crc kubenswrapper[4773]: I0120 18:31:09.603049 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:09 crc kubenswrapper[4773]: I0120 18:31:09.603058 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:09Z","lastTransitionTime":"2026-01-20T18:31:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:09 crc kubenswrapper[4773]: I0120 18:31:09.706691 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:09 crc kubenswrapper[4773]: I0120 18:31:09.706744 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:09 crc kubenswrapper[4773]: I0120 18:31:09.706753 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:09 crc kubenswrapper[4773]: I0120 18:31:09.706769 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:09 crc kubenswrapper[4773]: I0120 18:31:09.706778 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:09Z","lastTransitionTime":"2026-01-20T18:31:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:09 crc kubenswrapper[4773]: I0120 18:31:09.809856 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:09 crc kubenswrapper[4773]: I0120 18:31:09.809978 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:09 crc kubenswrapper[4773]: I0120 18:31:09.810000 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:09 crc kubenswrapper[4773]: I0120 18:31:09.810031 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:09 crc kubenswrapper[4773]: I0120 18:31:09.810053 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:09Z","lastTransitionTime":"2026-01-20T18:31:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:09 crc kubenswrapper[4773]: I0120 18:31:09.914202 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:09 crc kubenswrapper[4773]: I0120 18:31:09.914273 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:09 crc kubenswrapper[4773]: I0120 18:31:09.914296 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:09 crc kubenswrapper[4773]: I0120 18:31:09.914324 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:09 crc kubenswrapper[4773]: I0120 18:31:09.914347 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:09Z","lastTransitionTime":"2026-01-20T18:31:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:10 crc kubenswrapper[4773]: I0120 18:31:10.017780 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:10 crc kubenswrapper[4773]: I0120 18:31:10.017846 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:10 crc kubenswrapper[4773]: I0120 18:31:10.017867 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:10 crc kubenswrapper[4773]: I0120 18:31:10.017895 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:10 crc kubenswrapper[4773]: I0120 18:31:10.017914 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:10Z","lastTransitionTime":"2026-01-20T18:31:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:10 crc kubenswrapper[4773]: I0120 18:31:10.120696 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:10 crc kubenswrapper[4773]: I0120 18:31:10.120774 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:10 crc kubenswrapper[4773]: I0120 18:31:10.120821 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:10 crc kubenswrapper[4773]: I0120 18:31:10.120853 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:10 crc kubenswrapper[4773]: I0120 18:31:10.120897 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:10Z","lastTransitionTime":"2026-01-20T18:31:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:10 crc kubenswrapper[4773]: I0120 18:31:10.224090 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:10 crc kubenswrapper[4773]: I0120 18:31:10.224139 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:10 crc kubenswrapper[4773]: I0120 18:31:10.224167 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:10 crc kubenswrapper[4773]: I0120 18:31:10.224184 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:10 crc kubenswrapper[4773]: I0120 18:31:10.224196 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:10Z","lastTransitionTime":"2026-01-20T18:31:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:10 crc kubenswrapper[4773]: I0120 18:31:10.327545 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:10 crc kubenswrapper[4773]: I0120 18:31:10.327576 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:10 crc kubenswrapper[4773]: I0120 18:31:10.327585 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:10 crc kubenswrapper[4773]: I0120 18:31:10.327600 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:10 crc kubenswrapper[4773]: I0120 18:31:10.327610 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:10Z","lastTransitionTime":"2026-01-20T18:31:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:10 crc kubenswrapper[4773]: I0120 18:31:10.393127 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-02 16:48:34.421596952 +0000 UTC Jan 20 18:31:10 crc kubenswrapper[4773]: I0120 18:31:10.429443 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:10 crc kubenswrapper[4773]: I0120 18:31:10.429501 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:10 crc kubenswrapper[4773]: I0120 18:31:10.429513 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:10 crc kubenswrapper[4773]: I0120 18:31:10.429529 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:10 crc kubenswrapper[4773]: I0120 18:31:10.429541 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:10Z","lastTransitionTime":"2026-01-20T18:31:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:10 crc kubenswrapper[4773]: I0120 18:31:10.446651 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:31:10 crc kubenswrapper[4773]: I0120 18:31:10.446692 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:31:10 crc kubenswrapper[4773]: I0120 18:31:10.446830 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:31:10 crc kubenswrapper[4773]: E0120 18:31:10.446885 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 18:31:10 crc kubenswrapper[4773]: E0120 18:31:10.446978 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 18:31:10 crc kubenswrapper[4773]: E0120 18:31:10.447186 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4jpbd" podUID="3791c4b7-dcef-470d-a67e-a2c0bb004436" Jan 20 18:31:10 crc kubenswrapper[4773]: I0120 18:31:10.532752 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:10 crc kubenswrapper[4773]: I0120 18:31:10.532832 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:10 crc kubenswrapper[4773]: I0120 18:31:10.532850 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:10 crc kubenswrapper[4773]: I0120 18:31:10.532880 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:10 crc kubenswrapper[4773]: I0120 18:31:10.532904 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:10Z","lastTransitionTime":"2026-01-20T18:31:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:10 crc kubenswrapper[4773]: I0120 18:31:10.635896 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:10 crc kubenswrapper[4773]: I0120 18:31:10.636216 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:10 crc kubenswrapper[4773]: I0120 18:31:10.636258 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:10 crc kubenswrapper[4773]: I0120 18:31:10.636293 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:10 crc kubenswrapper[4773]: I0120 18:31:10.636316 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:10Z","lastTransitionTime":"2026-01-20T18:31:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:10 crc kubenswrapper[4773]: I0120 18:31:10.739119 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:10 crc kubenswrapper[4773]: I0120 18:31:10.739165 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:10 crc kubenswrapper[4773]: I0120 18:31:10.739176 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:10 crc kubenswrapper[4773]: I0120 18:31:10.739192 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:10 crc kubenswrapper[4773]: I0120 18:31:10.739202 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:10Z","lastTransitionTime":"2026-01-20T18:31:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:10 crc kubenswrapper[4773]: I0120 18:31:10.841543 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:10 crc kubenswrapper[4773]: I0120 18:31:10.841584 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:10 crc kubenswrapper[4773]: I0120 18:31:10.841593 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:10 crc kubenswrapper[4773]: I0120 18:31:10.841609 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:10 crc kubenswrapper[4773]: I0120 18:31:10.841618 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:10Z","lastTransitionTime":"2026-01-20T18:31:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:10 crc kubenswrapper[4773]: I0120 18:31:10.944001 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:10 crc kubenswrapper[4773]: I0120 18:31:10.944063 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:10 crc kubenswrapper[4773]: I0120 18:31:10.944076 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:10 crc kubenswrapper[4773]: I0120 18:31:10.944095 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:10 crc kubenswrapper[4773]: I0120 18:31:10.944109 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:10Z","lastTransitionTime":"2026-01-20T18:31:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:11 crc kubenswrapper[4773]: I0120 18:31:11.046986 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:11 crc kubenswrapper[4773]: I0120 18:31:11.047040 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:11 crc kubenswrapper[4773]: I0120 18:31:11.047052 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:11 crc kubenswrapper[4773]: I0120 18:31:11.047076 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:11 crc kubenswrapper[4773]: I0120 18:31:11.047090 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:11Z","lastTransitionTime":"2026-01-20T18:31:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:11 crc kubenswrapper[4773]: I0120 18:31:11.149449 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:11 crc kubenswrapper[4773]: I0120 18:31:11.149489 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:11 crc kubenswrapper[4773]: I0120 18:31:11.149498 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:11 crc kubenswrapper[4773]: I0120 18:31:11.149516 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:11 crc kubenswrapper[4773]: I0120 18:31:11.149527 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:11Z","lastTransitionTime":"2026-01-20T18:31:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:11 crc kubenswrapper[4773]: I0120 18:31:11.252421 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:11 crc kubenswrapper[4773]: I0120 18:31:11.252483 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:11 crc kubenswrapper[4773]: I0120 18:31:11.252494 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:11 crc kubenswrapper[4773]: I0120 18:31:11.252519 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:11 crc kubenswrapper[4773]: I0120 18:31:11.252533 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:11Z","lastTransitionTime":"2026-01-20T18:31:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:11 crc kubenswrapper[4773]: I0120 18:31:11.356387 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:11 crc kubenswrapper[4773]: I0120 18:31:11.356441 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:11 crc kubenswrapper[4773]: I0120 18:31:11.356452 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:11 crc kubenswrapper[4773]: I0120 18:31:11.356471 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:11 crc kubenswrapper[4773]: I0120 18:31:11.356489 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:11Z","lastTransitionTime":"2026-01-20T18:31:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:11 crc kubenswrapper[4773]: I0120 18:31:11.393394 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-09 19:43:31.002488579 +0000 UTC Jan 20 18:31:11 crc kubenswrapper[4773]: I0120 18:31:11.446364 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:31:11 crc kubenswrapper[4773]: E0120 18:31:11.446527 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 18:31:11 crc kubenswrapper[4773]: I0120 18:31:11.458685 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:11 crc kubenswrapper[4773]: I0120 18:31:11.458754 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:11 crc kubenswrapper[4773]: I0120 18:31:11.458777 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:11 crc kubenswrapper[4773]: I0120 18:31:11.458807 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:11 crc kubenswrapper[4773]: I0120 18:31:11.458827 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:11Z","lastTransitionTime":"2026-01-20T18:31:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:11 crc kubenswrapper[4773]: I0120 18:31:11.561973 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:11 crc kubenswrapper[4773]: I0120 18:31:11.562024 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:11 crc kubenswrapper[4773]: I0120 18:31:11.562034 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:11 crc kubenswrapper[4773]: I0120 18:31:11.562051 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:11 crc kubenswrapper[4773]: I0120 18:31:11.562062 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:11Z","lastTransitionTime":"2026-01-20T18:31:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:11 crc kubenswrapper[4773]: I0120 18:31:11.666242 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:11 crc kubenswrapper[4773]: I0120 18:31:11.666284 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:11 crc kubenswrapper[4773]: I0120 18:31:11.666293 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:11 crc kubenswrapper[4773]: I0120 18:31:11.666309 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:11 crc kubenswrapper[4773]: I0120 18:31:11.666319 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:11Z","lastTransitionTime":"2026-01-20T18:31:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:11 crc kubenswrapper[4773]: I0120 18:31:11.769153 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:11 crc kubenswrapper[4773]: I0120 18:31:11.769224 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:11 crc kubenswrapper[4773]: I0120 18:31:11.769234 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:11 crc kubenswrapper[4773]: I0120 18:31:11.769258 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:11 crc kubenswrapper[4773]: I0120 18:31:11.769271 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:11Z","lastTransitionTime":"2026-01-20T18:31:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:11 crc kubenswrapper[4773]: I0120 18:31:11.872377 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:11 crc kubenswrapper[4773]: I0120 18:31:11.872435 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:11 crc kubenswrapper[4773]: I0120 18:31:11.872443 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:11 crc kubenswrapper[4773]: I0120 18:31:11.872459 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:11 crc kubenswrapper[4773]: I0120 18:31:11.872468 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:11Z","lastTransitionTime":"2026-01-20T18:31:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:11 crc kubenswrapper[4773]: I0120 18:31:11.975231 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:11 crc kubenswrapper[4773]: I0120 18:31:11.975299 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:11 crc kubenswrapper[4773]: I0120 18:31:11.975312 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:11 crc kubenswrapper[4773]: I0120 18:31:11.975335 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:11 crc kubenswrapper[4773]: I0120 18:31:11.975352 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:11Z","lastTransitionTime":"2026-01-20T18:31:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:12 crc kubenswrapper[4773]: I0120 18:31:12.077852 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:12 crc kubenswrapper[4773]: I0120 18:31:12.077891 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:12 crc kubenswrapper[4773]: I0120 18:31:12.077901 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:12 crc kubenswrapper[4773]: I0120 18:31:12.077915 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:12 crc kubenswrapper[4773]: I0120 18:31:12.077942 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:12Z","lastTransitionTime":"2026-01-20T18:31:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:12 crc kubenswrapper[4773]: I0120 18:31:12.180949 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:12 crc kubenswrapper[4773]: I0120 18:31:12.181017 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:12 crc kubenswrapper[4773]: I0120 18:31:12.181029 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:12 crc kubenswrapper[4773]: I0120 18:31:12.181052 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:12 crc kubenswrapper[4773]: I0120 18:31:12.181067 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:12Z","lastTransitionTime":"2026-01-20T18:31:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:12 crc kubenswrapper[4773]: I0120 18:31:12.291157 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:12 crc kubenswrapper[4773]: I0120 18:31:12.291204 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:12 crc kubenswrapper[4773]: I0120 18:31:12.291215 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:12 crc kubenswrapper[4773]: I0120 18:31:12.291230 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:12 crc kubenswrapper[4773]: I0120 18:31:12.291243 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:12Z","lastTransitionTime":"2026-01-20T18:31:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:12 crc kubenswrapper[4773]: I0120 18:31:12.393184 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:12 crc kubenswrapper[4773]: I0120 18:31:12.393232 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:12 crc kubenswrapper[4773]: I0120 18:31:12.393242 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:12 crc kubenswrapper[4773]: I0120 18:31:12.393257 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:12 crc kubenswrapper[4773]: I0120 18:31:12.393266 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:12Z","lastTransitionTime":"2026-01-20T18:31:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:12 crc kubenswrapper[4773]: I0120 18:31:12.393638 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-04 14:32:01.117762962 +0000 UTC Jan 20 18:31:12 crc kubenswrapper[4773]: I0120 18:31:12.446715 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:31:12 crc kubenswrapper[4773]: I0120 18:31:12.446737 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:31:12 crc kubenswrapper[4773]: I0120 18:31:12.446836 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:31:12 crc kubenswrapper[4773]: E0120 18:31:12.447051 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 18:31:12 crc kubenswrapper[4773]: E0120 18:31:12.447192 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jpbd" podUID="3791c4b7-dcef-470d-a67e-a2c0bb004436" Jan 20 18:31:12 crc kubenswrapper[4773]: E0120 18:31:12.447234 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 18:31:12 crc kubenswrapper[4773]: I0120 18:31:12.495878 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:12 crc kubenswrapper[4773]: I0120 18:31:12.495920 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:12 crc kubenswrapper[4773]: I0120 18:31:12.495951 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:12 crc kubenswrapper[4773]: I0120 18:31:12.495968 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:12 crc kubenswrapper[4773]: I0120 18:31:12.495979 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:12Z","lastTransitionTime":"2026-01-20T18:31:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:12 crc kubenswrapper[4773]: I0120 18:31:12.598666 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:12 crc kubenswrapper[4773]: I0120 18:31:12.598703 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:12 crc kubenswrapper[4773]: I0120 18:31:12.598714 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:12 crc kubenswrapper[4773]: I0120 18:31:12.598730 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:12 crc kubenswrapper[4773]: I0120 18:31:12.598741 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:12Z","lastTransitionTime":"2026-01-20T18:31:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:12 crc kubenswrapper[4773]: I0120 18:31:12.700556 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:12 crc kubenswrapper[4773]: I0120 18:31:12.700596 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:12 crc kubenswrapper[4773]: I0120 18:31:12.700606 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:12 crc kubenswrapper[4773]: I0120 18:31:12.700621 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:12 crc kubenswrapper[4773]: I0120 18:31:12.700631 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:12Z","lastTransitionTime":"2026-01-20T18:31:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:12 crc kubenswrapper[4773]: I0120 18:31:12.803343 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:12 crc kubenswrapper[4773]: I0120 18:31:12.803386 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:12 crc kubenswrapper[4773]: I0120 18:31:12.803396 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:12 crc kubenswrapper[4773]: I0120 18:31:12.803413 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:12 crc kubenswrapper[4773]: I0120 18:31:12.803423 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:12Z","lastTransitionTime":"2026-01-20T18:31:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:12 crc kubenswrapper[4773]: I0120 18:31:12.860036 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3791c4b7-dcef-470d-a67e-a2c0bb004436-metrics-certs\") pod \"network-metrics-daemon-4jpbd\" (UID: \"3791c4b7-dcef-470d-a67e-a2c0bb004436\") " pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:31:12 crc kubenswrapper[4773]: E0120 18:31:12.860265 4773 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 20 18:31:12 crc kubenswrapper[4773]: E0120 18:31:12.860385 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3791c4b7-dcef-470d-a67e-a2c0bb004436-metrics-certs podName:3791c4b7-dcef-470d-a67e-a2c0bb004436 nodeName:}" failed. No retries permitted until 2026-01-20 18:31:44.860359299 +0000 UTC m=+97.782172513 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3791c4b7-dcef-470d-a67e-a2c0bb004436-metrics-certs") pod "network-metrics-daemon-4jpbd" (UID: "3791c4b7-dcef-470d-a67e-a2c0bb004436") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 20 18:31:12 crc kubenswrapper[4773]: I0120 18:31:12.905958 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:12 crc kubenswrapper[4773]: I0120 18:31:12.906005 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:12 crc kubenswrapper[4773]: I0120 18:31:12.906043 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:12 crc kubenswrapper[4773]: I0120 18:31:12.906063 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:12 crc kubenswrapper[4773]: I0120 18:31:12.906073 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:12Z","lastTransitionTime":"2026-01-20T18:31:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.008221 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.008262 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.008275 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.008290 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.008302 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:13Z","lastTransitionTime":"2026-01-20T18:31:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.111662 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.111721 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.111732 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.111753 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.111769 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:13Z","lastTransitionTime":"2026-01-20T18:31:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.214502 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.214553 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.214562 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.214577 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.214586 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:13Z","lastTransitionTime":"2026-01-20T18:31:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.316747 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.316791 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.316803 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.316824 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.316838 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:13Z","lastTransitionTime":"2026-01-20T18:31:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.393819 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-27 16:19:47.643263138 +0000 UTC Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.420001 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.420064 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.420074 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.420093 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.420105 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:13Z","lastTransitionTime":"2026-01-20T18:31:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.446543 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:31:13 crc kubenswrapper[4773]: E0120 18:31:13.446801 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.523088 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.523137 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.523147 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.523164 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.523176 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:13Z","lastTransitionTime":"2026-01-20T18:31:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.625917 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.626006 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.626021 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.626043 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.626059 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:13Z","lastTransitionTime":"2026-01-20T18:31:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.728799 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.728855 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.728869 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.728892 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.728909 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:13Z","lastTransitionTime":"2026-01-20T18:31:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.832178 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.832278 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.832307 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.832343 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.832368 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:13Z","lastTransitionTime":"2026-01-20T18:31:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.927553 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bccxn_061a607e-1868-4fcf-b3ea-d51157511d41/kube-multus/0.log" Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.927615 4773 generic.go:334] "Generic (PLEG): container finished" podID="061a607e-1868-4fcf-b3ea-d51157511d41" containerID="5980e4a9cdd846b335d853a6c7c4fa7c9862c37c4858418919eae03855d094f5" exitCode=1 Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.927653 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bccxn" event={"ID":"061a607e-1868-4fcf-b3ea-d51157511d41","Type":"ContainerDied","Data":"5980e4a9cdd846b335d853a6c7c4fa7c9862c37c4858418919eae03855d094f5"} Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.928248 4773 scope.go:117] "RemoveContainer" containerID="5980e4a9cdd846b335d853a6c7c4fa7c9862c37c4858418919eae03855d094f5" Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.934735 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.934769 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.934781 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.934798 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.934808 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:13Z","lastTransitionTime":"2026-01-20T18:31:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.943668 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de86ad0ce5469489a5f3eac4940913be3a897c3ec5f48526c7396b21bca92fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\"
,\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:13Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.959891 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965d3f61a7d29b4a851d13ea906f7d4d1809f89d314cd0e9b08ba897108625f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"vol
umeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48a81307df70a49c4cf901ede80a07616e64babebf760b7fb3f51fc71743812a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:13Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.970892 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ddd934f-f012-4083-b5e6-b99711071621\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63daf0eeb8fcb1c4b47e205c8ea4ca5071153c573cce6d1f4638e0d44a96d47e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e8755cb7894
ae2c1832f3b0b8d8a6e33d3d336d00ce68a2afe004d82304dd24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sq4x7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:13Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:13 crc kubenswrapper[4773]: I0120 18:31:13.992051 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f354424d-7f22-42d6-8bd9-00e32e78c3d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a000cfa332372a402d4d81eafc6f06b988196a3a847236a94d8166f0d745b07e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d36ecbc289d1bc2ebcba42b6cfe938692dd9b7e2b4e1adac18ac86a273d6bf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeac2ae7ead2dd1144c00709ea7162dfe1e360dc304726b031970bc989f11467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c6fa189877752f110d0f9539d08a9693872f9ee0350c63384acae1dea6afbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7767620b251e4c63c9397573c7915bd85831b833817ac6b939fc47d6304dc2bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc96ff67e107830b7e2fe8c3c7460ca9c7b22dfadab07ea69968162e9a1b4455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9358a6bba6c6f6bb98b1a268d22d96575da32a1a73003a6097a7bf1a9db92c0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9358a6bba6c6f6bb98b1a268d22d96575da32a1a73003a6097a7bf1a9db92c0d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-20T18:30:51Z\\\",\\\"message\\\":\\\"mns:[] Mutations:[{Column:policies Mutator:insert Value:{GoSet:[{GoUUID:a5a72d02-1a0f-4f7f-a8c5-6923a1c4274a}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0120 
18:30:51.352647 6450 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI0120 18:30:51.352764 6450 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI0120 18:30:51.352852 6450 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI0120 18:30:51.353085 6450 factory.go:1336] Added *v1.Node event handler 7\\\\nI0120 18:30:51.353196 6450 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0120 18:30:51.353707 6450 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0120 18:30:51.353849 6450 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0120 18:30:51.353898 6450 ovnkube.go:599] Stopped ovnkube\\\\nI0120 18:30:51.353933 6450 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0120 18:30:51.354047 6450 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:50Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qt89w_openshift-ovn-kubernetes(f354424d-7f22-42d6-8bd9-00e32e78c3d3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68d98d68d006df3d69e253299966bef755778e0cdab71a9ff91d2829ae761672\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c68391eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c683
91eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qt89w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:13Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.004326 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4jpbd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3791c4b7-dcef-470d-a67e-a2c0bb004436\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66lgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66lgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4jpbd\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:14Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.006632 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.006714 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.006728 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.006772 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.006787 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:14Z","lastTransitionTime":"2026-01-20T18:31:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:14 crc kubenswrapper[4773]: E0120 18:31:14.020525 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ac96020-64a6-43b4-8bf4-975de5898510\\\",\\\"systemUUID\\\":\\\"3435f284-a40d-4f32-a1fa-55cd3339f30e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:14Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.020576 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfcadb32-181d-4e84-8078-53323164fc49\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72535f55bb65c332cd4acb4ad0bf8b94a28a2167da3bba993de6ffb1d6dca1cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\"
:\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5be06527f116faac5a7ce7b2df804239b382a394e11c56eee69fa24739faab3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff759d4098d334dfada2f6a14e8a5d363019e0da19879b5dca11ff48b6273b24\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29039ffb258d5056b02a3ef4f48c732b8573d9437d0
32a225b59f0ba95527796\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:14Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.025331 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.025396 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.025407 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:14 crc 
kubenswrapper[4773]: I0120 18:31:14.025428 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.025441 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:14Z","lastTransitionTime":"2026-01-20T18:31:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.039281 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:14Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:14 crc kubenswrapper[4773]: E0120 18:31:14.043209 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ac96020-64a6-43b4-8bf4-975de5898510\\\",\\\"systemUUID\\\":\\\"3435f284-a40d-4f32-a1fa-55cd3339f30e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:14Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.048427 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.048453 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.048464 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.048482 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.048494 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:14Z","lastTransitionTime":"2026-01-20T18:31:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.055435 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9b329af-a6e2-4ba8-b70d-f1ad0cd67671\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13ef504628d3252d01c26801cf91a4fabdfbd5d8a78e60e9666beafa709816a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af183a195b31fc8b191dc75db28863447424fb2763e1f51a5f37ce78c4b046e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05f5db3e67f63a4ed226186eec7eebf95687a59c3d742a7d29b2be3f99543a0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a1628ccb969f4cdde3babedb802c79d7bba385260dcefc8140ed674fb423da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab6fab567cc35a379572c4ea5fb0f9472f3634fcede213202390e12c5cf6f2f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0120 18:30:20.840178 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0120 18:30:20.842359 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1141381009/tls.crt::/tmp/serving-cert-1141381009/tls.key\\\\\\\"\\\\nI0120 18:30:26.279650 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0120 18:30:26.285211 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0120 18:30:26.285271 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0120 18:30:26.285327 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0120 18:30:26.285335 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0120 18:30:26.291960 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0120 18:30:26.291983 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291990 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291998 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0120 18:30:26.292002 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0120 18:30:26.292006 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0120 18:30:26.292010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0120 18:30:26.292142 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0120 18:30:26.296450 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06ac78bba5a5a697d2246518bb04b7503b37c2fafd2369e5826dd02b467715d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b33
5e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:14Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:14 crc kubenswrapper[4773]: E0120 18:31:14.068299 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ac96020-64a6-43b4-8bf4-975de5898510\\\",\\\"systemUUID\\\":\\\"3435f284-a40d-4f32-a1fa-55cd3339f30e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:14Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.070739 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:14Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.073410 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.073454 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.073464 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.073481 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.073492 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:14Z","lastTransitionTime":"2026-01-20T18:31:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.082920 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gczfj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"357ca347-8fa9-4f0b-9f49-a540f14e0198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48c821d7f174268680f0e1197457126c88076eded93bbcddb5a0d60b1fa9fc80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4b9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gczfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:14Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:14 crc kubenswrapper[4773]: E0120 18:31:14.085605 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:14Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ac96020-64a6-43b4-8bf4-975de5898510\\\",\\\"systemUUID\\\":\\\"3435f284-a40d-4f32-a1fa-55cd3339f30e\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:14Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.090128 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.090192 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.090209 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.090231 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.090244 4773 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:14Z","lastTransitionTime":"2026-01-20T18:31:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.098200 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5sv79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a565d2f-43a1-41f5-b7a6-85d7d0aea0a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4eb26301ab8154925399887204867f95a36003c308ac49adb8a2867ae29ac39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dls8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5sv79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:14Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:14 crc kubenswrapper[4773]: E0120 18:31:14.103147 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:14Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:14Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:14Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:14Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ac96020-64a6-43b4-8bf4-975de5898510\\\",\\\"systemUUID\\\":\\\"3435f284-a40d-4f32-a1fa-55cd3339f30e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:14Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:14 crc kubenswrapper[4773]: E0120 18:31:14.103282 4773 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.105187 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.105217 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.105225 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.105239 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.105249 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:14Z","lastTransitionTime":"2026-01-20T18:31:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.111410 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gbn6k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7821f5e-4734-489f-bcf9-910b875a4848\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f76276c73de548ee85dd07ca424e1f6df2c7da35e7e57880473fc1175983fb37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcp99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f830c4eb2fd242a6e650833aeafd5935d16e005b2faa950f6507f656293ca458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcp99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gbn6k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:14Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.127115 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"76ff7d92-1dd2-45e7-9abb-7dd442f7b958\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b665d0df790466d0543796901ca4d72cfb93cbaa3c6f751cd7283280636e2a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b8695b8d17ea9a4d8e29018b9be5f70748b76e671e9ce50ce1e4f100f5e370f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://344d1e5f1fbcdf841c83e49f3932e95f086a05522139238c2743cf27c78bb77e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cadc50138b9abc5f9f038da3a5ba2f689019c713027f9cd589b28cee9d3dcfe4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\
\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cadc50138b9abc5f9f038da3a5ba2f689019c713027f9cd589b28cee9d3dcfe4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:14Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.151686 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee8e82f1-ac2b-4e32-ace8-c3e6edfb55b4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17efc7a160f4cb934a3927c68e65b9f1387ed13881b8c9ca676d30c0d241ace0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b89a8e0fd81a1ccb6675b46b9a7425fec5158a53cc7e4314068fa737d0becda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f63cd12ee6d399e1a6f77a55f05a532a2de3b4cd05a5e10a93f521364473a403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d78e43bc7b09a1599f98e9e72fd0f49db94102e952a391f687d50abed277ceda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de08872865ebda9c9a4e1642d9c4d4f6c709908332ffbd091f992fad7722eca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:14Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.168593 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:14Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.187582 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://443153c40f503abca43f8d62276e852dfc67a4ad50beb817bcc4e7c9f1893d4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-20T18:31:14Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.208064 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.208103 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.208116 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.208136 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.208147 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:14Z","lastTransitionTime":"2026-01-20T18:31:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.210264 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kjbfj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ddd5104-3112-413e-b908-2b7f336b41f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a5c1f8407893fc5542044f2a612cb402c4529818efa2a76e03ecc09dafd07d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b981d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b981d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53e0dfb1a04051b30484f98897d86ff5adb9ef680437d01e43cd37b4a2404bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53e0dfb1a04051b30484f98897d86ff5adb9ef680437d01e43cd37b4a2404bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42063e8c09492c5670e90e5ced6405536f594cb5ae490cdacb00320e735d157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42063e8c09492c5670e90e5ced6405536f594cb5ae490cdacb00320e735d157\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kjbfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:14Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.229735 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bccxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"061a607e-1868-4fcf-b3ea-d51157511d41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:13Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:13Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://5980e4a9cdd846b335d853a6c7c4fa7c9862c37c4858418919eae03855d094f5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5980e4a9cdd846b335d853a6c7c4fa7c9862c37c4858418919eae03855d094f5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-20T18:31:13Z\\\",\\\"message\\\":\\\"2026-01-20T18:30:28+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_7123b345-28cb-43bd-ab31-fbc8539d0437\\\\n2026-01-20T18:30:28+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_7123b345-28cb-43bd-ab31-fbc8539d0437 to /host/opt/cni/bin/\\\\n2026-01-20T18:30:28Z [verbose] multus-daemon started\\\\n2026-01-20T18:30:28Z [verbose] Readiness Indicator file check\\\\n2026-01-20T18:31:13Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwtw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bccxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:14Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.311573 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.311622 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.311636 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.311660 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.311677 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:14Z","lastTransitionTime":"2026-01-20T18:31:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.395022 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 00:02:51.442982845 +0000 UTC Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.414168 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.414220 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.414233 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.414255 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.414267 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:14Z","lastTransitionTime":"2026-01-20T18:31:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.446920 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.447080 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.446973 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:31:14 crc kubenswrapper[4773]: E0120 18:31:14.447283 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 18:31:14 crc kubenswrapper[4773]: E0120 18:31:14.447494 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jpbd" podUID="3791c4b7-dcef-470d-a67e-a2c0bb004436" Jan 20 18:31:14 crc kubenswrapper[4773]: E0120 18:31:14.447633 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.517612 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.517649 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.517657 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.517672 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.517683 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:14Z","lastTransitionTime":"2026-01-20T18:31:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.620560 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.620607 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.620617 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.620639 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.620652 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:14Z","lastTransitionTime":"2026-01-20T18:31:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.725123 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.725197 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.725223 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.725259 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.725283 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:14Z","lastTransitionTime":"2026-01-20T18:31:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.833019 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.833099 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.833134 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.833166 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.833188 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:14Z","lastTransitionTime":"2026-01-20T18:31:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.935870 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.935924 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.935960 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.935981 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.935991 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:14Z","lastTransitionTime":"2026-01-20T18:31:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.936023 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bccxn_061a607e-1868-4fcf-b3ea-d51157511d41/kube-multus/0.log" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.936106 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bccxn" event={"ID":"061a607e-1868-4fcf-b3ea-d51157511d41","Type":"ContainerStarted","Data":"dc586816975c68b2e0607a33f40a1ef6b74f4a1267fb305584da3158ea91bdc7"} Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.957074 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de86ad0ce5469489a5f3eac4940913be3a897c3ec5f48526c7396b21bca92fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\
\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:14Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.972285 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965d3f61a7d29b4a851d13ea906f7d4d1809f89d314cd0e9b08ba897108625f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48a81307df70a49c4cf901ede80a07616e64babebf760b7fb3f51fc71743812a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:14Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:14 crc kubenswrapper[4773]: I0120 18:31:14.984613 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ddd934f-f012-4083-b5e6-b99711071621\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63daf0eeb8fcb1c4b47e205c8ea4ca5071153c573cce6d1f4638e0d44a96d47e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e8755cb7894ae2c1832f3b0b8d8a6e33d3d336d
00ce68a2afe004d82304dd24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sq4x7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:14Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.013311 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f354424d-7f22-42d6-8bd9-00e32e78c3d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a000cfa332372a402d4d81eafc6f06b988196a3a847236a94d8166f0d745b07e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d36ecbc289d1bc2ebcba42b6cfe938692dd9b7e2b4e1adac18ac86a273d6bf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeac2ae7ead2dd1144c00709ea7162dfe1e360dc304726b031970bc989f11467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c6fa189877752f110d0f9539d08a9693872f9ee0350c63384acae1dea6afbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7767620b251e4c63c9397573c7915bd85831b833817ac6b939fc47d6304dc2bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc96ff67e107830b7e2fe8c3c7460ca9c7b22dfadab07ea69968162e9a1b4455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9358a6bba6c6f6bb98b1a268d22d96575da32a1a73003a6097a7bf1a9db92c0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9358a6bba6c6f6bb98b1a268d22d96575da32a1a73003a6097a7bf1a9db92c0d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-20T18:30:51Z\\\",\\\"message\\\":\\\"mns:[] Mutations:[{Column:policies Mutator:insert Value:{GoSet:[{GoUUID:a5a72d02-1a0f-4f7f-a8c5-6923a1c4274a}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0120 
18:30:51.352647 6450 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI0120 18:30:51.352764 6450 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI0120 18:30:51.352852 6450 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI0120 18:30:51.353085 6450 factory.go:1336] Added *v1.Node event handler 7\\\\nI0120 18:30:51.353196 6450 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0120 18:30:51.353707 6450 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0120 18:30:51.353849 6450 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0120 18:30:51.353898 6450 ovnkube.go:599] Stopped ovnkube\\\\nI0120 18:30:51.353933 6450 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0120 18:30:51.354047 6450 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:50Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qt89w_openshift-ovn-kubernetes(f354424d-7f22-42d6-8bd9-00e32e78c3d3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68d98d68d006df3d69e253299966bef755778e0cdab71a9ff91d2829ae761672\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c68391eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c683
91eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qt89w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:15Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.026241 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4jpbd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3791c4b7-dcef-470d-a67e-a2c0bb004436\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66lgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66lgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4jpbd\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:15Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.040014 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfcadb32-181d-4e84-8078-53323164fc49\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72535f55bb65c332cd4acb4ad0bf8b94a28a2167da3bba993de6ffb1d6dca1cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5be06527f116faac5a7ce7b2df804239b382a394e11c56eee69fa24739faab3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff759d4098d334dfada2f6a14e8a5d363019e0da19879b5dca11ff48b6273b24\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29039ffb258d5056b02a3ef4f48c732b8573d9437d032a225b59f0ba95527796\\\",
\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:15Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.040648 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.040678 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.040688 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 
18:31:15.040703 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.040714 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:15Z","lastTransitionTime":"2026-01-20T18:31:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.053712 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:15Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.072876 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9b329af-a6e2-4ba8-b70d-f1ad0cd67671\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13ef504628d3252d01c26801cf91a4fabdfbd5d8a78e60e9666beafa709816a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af183a195b31fc8b191dc75db28863447424fb2763e1f51a5f37ce78c4b046e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05f5db3e67f63a4ed226186eec7eebf95687a59c3d742a7d29b2be3f99543a0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a1628ccb969f4cdde3babedb802c79d7bba385260dcefc8140ed674fb423da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab6fab567cc35a379572c4ea5fb0f9472f3634fcede213202390e12c5cf6f2f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-20T18:30:26Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0120 18:30:20.840178 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0120 18:30:20.842359 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1141381009/tls.crt::/tmp/serving-cert-1141381009/tls.key\\\\\\\"\\\\nI0120 18:30:26.279650 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0120 18:30:26.285211 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0120 18:30:26.285271 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0120 18:30:26.285327 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0120 18:30:26.285335 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0120 18:30:26.291960 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0120 18:30:26.291983 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291990 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291998 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0120 18:30:26.292002 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0120 18:30:26.292006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0120 18:30:26.292010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0120 18:30:26.292142 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0120 18:30:26.296450 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06ac78bba5a5a697d2246518bb04b7503b37c2fafd2369e5826dd02b467715d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727
ee9876662924585010f5fa93ee1d554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:15Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.085583 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:15Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.095996 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gczfj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"357ca347-8fa9-4f0b-9f49-a540f14e0198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48c821d7f174268680f0e1197457126c88076eded93bbcddb5a0d60b1fa9fc80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4b9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gczfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:15Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.106243 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5sv79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a565d2f-43a1-41f5-b7a6-85d7d0aea0a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4eb26301ab8154925399887204867f95a36003c308ac49adb8a2867ae29ac39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dls8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5sv79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:15Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.118835 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gbn6k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7821f5e-4734-489f-bcf9-910b875a4848\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f76276c73de548ee85dd07ca424e1f6df2c7da35e7e57880473fc1175983fb37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcp99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f830c4eb2fd242a6e650833aeafd5935d16e0
05b2faa950f6507f656293ca458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcp99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gbn6k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:15Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.130098 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"76ff7d92-1dd2-45e7-9abb-7dd442f7b958\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b665d0df790466d0543796901ca4d72cfb93cbaa3c6f751cd7283280636e2a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b8695b8d17ea9a4d8e29018b9be5f70748b76e671e9ce50ce1e4f100f5e370f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://344d1e5f1fbcdf841c83e49f3932e95f086a05522139238c2743cf27c78bb77e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cadc50138b9abc5f9f038da3a5ba2f689019c713027f9cd589b28cee9d3dcfe4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://cadc50138b9abc5f9f038da3a5ba2f689019c713027f9cd589b28cee9d3dcfe4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:15Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.142857 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.142893 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.142903 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.142922 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.142953 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:15Z","lastTransitionTime":"2026-01-20T18:31:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.151339 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee8e82f1-ac2b-4e32-ace8-c3e6edfb55b4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17efc7a160f4cb934a3927c68e65b9f1387ed13881b8c9ca676d30c0d241ace0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b89a8e0fd81a1ccb6675b46b9a7425fec5158a53cc7e4314068fa737d0becda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f63cd12ee6d399e1a6f77a55f05a532a2de3b4cd05a5e10a93f521364473a403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d78e43bc7b09a1599f98e9e72fd0f49db94102e952a391f687d50abed277ceda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de08872865ebda9c9a4e1642d9c4d4f6c709908332ffbd091f992fad7722eca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:15Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.165193 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:15Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.175967 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://443153c40f503abca43f8d62276e852dfc67a4ad50beb817bcc4e7c9f1893d4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-20T18:31:15Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.189578 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kjbfj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ddd5104-3112-413e-b908-2b7f336b41f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a5c1f8407893fc5542044f2a612cb402c4529818efa2a76e03ecc09dafd07d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b981d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b981d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53e0dfb1a04051b30484f98897d86ff5adb9ef680437d01e43cd37b4a2404bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53e0dfb1a04051b30484f98897d86ff5adb9ef680437d01e43cd37b4a2404bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42063e8c09492c5670e90e5ced6405536f594cb5ae490cdacb00320e735d157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42063e8c09492c5670e90e5ced6405536f594cb5ae490cdacb00320e735d157\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kjbfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:15Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.201133 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bccxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"061a607e-1868-4fcf-b3ea-d51157511d41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc586816975c68b2e0607a33f40a1ef6b74f4a1267fb305584da3158ea91bdc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5980e4a9cdd846b335d853a6c7c4fa7c9862c37c4858418919eae03855d094f5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-20T18:31:13Z\\\",\\\"message\\\":\\\"2026-01-20T18:30:28+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_7123b345-28cb-43bd-ab31-fbc8539d0437\\\\n2026-01-20T18:30:28+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_7123b345-28cb-43bd-ab31-fbc8539d0437 to /host/opt/cni/bin/\\\\n2026-01-20T18:30:28Z [verbose] multus-daemon started\\\\n2026-01-20T18:30:28Z [verbose] Readiness Indicator file check\\\\n2026-01-20T18:31:13Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the 
condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:31:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwtw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"p
hase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bccxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:15Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.244987 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.245057 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.245071 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.245086 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.245097 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:15Z","lastTransitionTime":"2026-01-20T18:31:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.347089 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.347150 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.347162 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.347180 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.347222 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:15Z","lastTransitionTime":"2026-01-20T18:31:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.395612 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-16 15:56:55.586753958 +0000 UTC Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.447122 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:31:15 crc kubenswrapper[4773]: E0120 18:31:15.447255 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.448778 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.448814 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.448822 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.448836 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.448846 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:15Z","lastTransitionTime":"2026-01-20T18:31:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.550648 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.550754 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.550779 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.550802 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.550818 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:15Z","lastTransitionTime":"2026-01-20T18:31:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.652775 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.652806 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.652816 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.652827 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.652836 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:15Z","lastTransitionTime":"2026-01-20T18:31:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.754712 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.754752 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.754760 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.754777 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.754788 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:15Z","lastTransitionTime":"2026-01-20T18:31:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.856652 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.856869 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.856877 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.856892 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.856902 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:15Z","lastTransitionTime":"2026-01-20T18:31:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.959339 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.959368 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.959376 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.959388 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:15 crc kubenswrapper[4773]: I0120 18:31:15.959397 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:15Z","lastTransitionTime":"2026-01-20T18:31:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:16 crc kubenswrapper[4773]: I0120 18:31:16.061980 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:16 crc kubenswrapper[4773]: I0120 18:31:16.062023 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:16 crc kubenswrapper[4773]: I0120 18:31:16.062034 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:16 crc kubenswrapper[4773]: I0120 18:31:16.062048 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:16 crc kubenswrapper[4773]: I0120 18:31:16.062059 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:16Z","lastTransitionTime":"2026-01-20T18:31:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:16 crc kubenswrapper[4773]: I0120 18:31:16.164061 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:16 crc kubenswrapper[4773]: I0120 18:31:16.164219 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:16 crc kubenswrapper[4773]: I0120 18:31:16.164243 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:16 crc kubenswrapper[4773]: I0120 18:31:16.164281 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:16 crc kubenswrapper[4773]: I0120 18:31:16.164311 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:16Z","lastTransitionTime":"2026-01-20T18:31:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:16 crc kubenswrapper[4773]: I0120 18:31:16.266821 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:16 crc kubenswrapper[4773]: I0120 18:31:16.266867 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:16 crc kubenswrapper[4773]: I0120 18:31:16.266881 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:16 crc kubenswrapper[4773]: I0120 18:31:16.266899 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:16 crc kubenswrapper[4773]: I0120 18:31:16.266911 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:16Z","lastTransitionTime":"2026-01-20T18:31:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:16 crc kubenswrapper[4773]: I0120 18:31:16.369974 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:16 crc kubenswrapper[4773]: I0120 18:31:16.370015 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:16 crc kubenswrapper[4773]: I0120 18:31:16.370026 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:16 crc kubenswrapper[4773]: I0120 18:31:16.370042 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:16 crc kubenswrapper[4773]: I0120 18:31:16.370053 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:16Z","lastTransitionTime":"2026-01-20T18:31:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:16 crc kubenswrapper[4773]: I0120 18:31:16.395826 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-26 05:32:33.366681482 +0000 UTC Jan 20 18:31:16 crc kubenswrapper[4773]: I0120 18:31:16.446110 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:31:16 crc kubenswrapper[4773]: I0120 18:31:16.446161 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:31:16 crc kubenswrapper[4773]: I0120 18:31:16.446171 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:31:16 crc kubenswrapper[4773]: E0120 18:31:16.446276 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jpbd" podUID="3791c4b7-dcef-470d-a67e-a2c0bb004436" Jan 20 18:31:16 crc kubenswrapper[4773]: E0120 18:31:16.446377 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 18:31:16 crc kubenswrapper[4773]: E0120 18:31:16.446487 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 18:31:16 crc kubenswrapper[4773]: I0120 18:31:16.473761 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:16 crc kubenswrapper[4773]: I0120 18:31:16.473801 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:16 crc kubenswrapper[4773]: I0120 18:31:16.473810 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:16 crc kubenswrapper[4773]: I0120 18:31:16.473826 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:16 crc kubenswrapper[4773]: I0120 18:31:16.473836 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:16Z","lastTransitionTime":"2026-01-20T18:31:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:16 crc kubenswrapper[4773]: I0120 18:31:16.575787 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:16 crc kubenswrapper[4773]: I0120 18:31:16.575826 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:16 crc kubenswrapper[4773]: I0120 18:31:16.575838 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:16 crc kubenswrapper[4773]: I0120 18:31:16.575853 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:16 crc kubenswrapper[4773]: I0120 18:31:16.575863 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:16Z","lastTransitionTime":"2026-01-20T18:31:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:16 crc kubenswrapper[4773]: I0120 18:31:16.678213 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:16 crc kubenswrapper[4773]: I0120 18:31:16.678257 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:16 crc kubenswrapper[4773]: I0120 18:31:16.678271 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:16 crc kubenswrapper[4773]: I0120 18:31:16.678290 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:16 crc kubenswrapper[4773]: I0120 18:31:16.678304 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:16Z","lastTransitionTime":"2026-01-20T18:31:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:16 crc kubenswrapper[4773]: I0120 18:31:16.781035 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:16 crc kubenswrapper[4773]: I0120 18:31:16.781075 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:16 crc kubenswrapper[4773]: I0120 18:31:16.781086 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:16 crc kubenswrapper[4773]: I0120 18:31:16.781102 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:16 crc kubenswrapper[4773]: I0120 18:31:16.781114 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:16Z","lastTransitionTime":"2026-01-20T18:31:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:16 crc kubenswrapper[4773]: I0120 18:31:16.883534 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:16 crc kubenswrapper[4773]: I0120 18:31:16.883590 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:16 crc kubenswrapper[4773]: I0120 18:31:16.883604 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:16 crc kubenswrapper[4773]: I0120 18:31:16.883625 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:16 crc kubenswrapper[4773]: I0120 18:31:16.883637 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:16Z","lastTransitionTime":"2026-01-20T18:31:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.059811 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.059853 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.059866 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.059879 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.059891 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:17Z","lastTransitionTime":"2026-01-20T18:31:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.162490 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.162530 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.162540 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.162555 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.162566 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:17Z","lastTransitionTime":"2026-01-20T18:31:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.264558 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.264603 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.264615 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.264630 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.264640 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:17Z","lastTransitionTime":"2026-01-20T18:31:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.367487 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.367572 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.367585 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.367601 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.367615 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:17Z","lastTransitionTime":"2026-01-20T18:31:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.396963 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-11 10:23:26.058207323 +0000 UTC Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.446661 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:31:17 crc kubenswrapper[4773]: E0120 18:31:17.447140 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.447416 4773 scope.go:117] "RemoveContainer" containerID="9358a6bba6c6f6bb98b1a268d22d96575da32a1a73003a6097a7bf1a9db92c0d" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.459727 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:17Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.469606 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.469628 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.469636 4773 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.469648 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.469656 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:17Z","lastTransitionTime":"2026-01-20T18:31:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.476023 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://443153c40f503abca43f8d62276e852dfc67a4ad50beb817bcc4e7c9f1893d4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f
799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:17Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.490853 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kjbfj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ddd5104-3112-413e-b908-2b7f336b41f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a5c1f8407893fc5542044f2a612cb402c4529818efa2a76e03ecc09dafd07d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b981
d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b981d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53e0dfb1a04051b30484f98897d86ff5adb9ef680437d01e43cd37b4a2404bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53e0dfb1a04051b30484f98897d86ff5adb9ef680437d01e43cd37b4a2404bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42063e8c09492c5670e90e5ced6405536f594cb5ae490cdacb00320e735d157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42063e8c09492c5670e90e5ced6405536f594cb5ae490cdacb00320e735d157\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kjbfj\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:17Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.507247 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bccxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"061a607e-1868-4fcf-b3ea-d51157511d41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc586816975c68b2e0607a33f40a1ef6b74f4a1267fb305584da3158ea91bdc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5980e4a9cdd846b335d853a6c7c4fa7c9862c37c4858418919eae03855d094f5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\
":\\\"2026-01-20T18:31:13Z\\\",\\\"message\\\":\\\"2026-01-20T18:30:28+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_7123b345-28cb-43bd-ab31-fbc8539d0437\\\\n2026-01-20T18:30:28+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_7123b345-28cb-43bd-ab31-fbc8539d0437 to /host/opt/cni/bin/\\\\n2026-01-20T18:30:28Z [verbose] multus-daemon started\\\\n2026-01-20T18:30:28Z [verbose] Readiness Indicator file check\\\\n2026-01-20T18:31:13Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:31:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\
\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwtw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bccxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:17Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.522623 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gbn6k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7821f5e-4734-489f-bcf9-910b875a4848\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f76276c73de548ee85dd07ca424e1f6df2c7da35e7e57880473fc1175983fb37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcp99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f830c4eb2fd242a6e650833aeafd5935d16e0
05b2faa950f6507f656293ca458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcp99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gbn6k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:17Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.534645 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"76ff7d92-1dd2-45e7-9abb-7dd442f7b958\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b665d0df790466d0543796901ca4d72cfb93cbaa3c6f751cd7283280636e2a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b8695b8d17ea9a4d8e29018b9be5f70748b76e671e9ce50ce1e4f100f5e370f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://344d1e5f1fbcdf841c83e49f3932e95f086a05522139238c2743cf27c78bb77e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cadc50138b9abc5f9f038da3a5ba2f689019c713027f9cd589b28cee9d3dcfe4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://cadc50138b9abc5f9f038da3a5ba2f689019c713027f9cd589b28cee9d3dcfe4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:17Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.553090 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee8e82f1-ac2b-4e32-ace8-c3e6edfb55b4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17efc7a160f4cb934a3927c68e65b9f1387ed13881b8c9ca676d30c0d241ace0\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b89a8e0fd81a1ccb6675b46b9a7425fec5158a53cc7e4314068fa737d0becda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f63cd12ee6d399e1a6f77a55f05a532a2de3b4cd05a5e10a93f521364473a403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d78e43bc7b09a1599f98e9e72fd0f49db94102e952a391f687d50abed277ceda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de08872865ebda9c9a4e1642d9c4d4f6c709908332ffbd091f992fad7722eca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\
\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}}},{\\\"co
ntainerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:17Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.571838 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.571870 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:17 crc kubenswrapper[4773]: 
I0120 18:31:17.571880 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.571895 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.571905 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:17Z","lastTransitionTime":"2026-01-20T18:31:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.580542 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f354424d-7f22-42d6-8bd9-00e32e78c3d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a000cfa332372a402d4d81eafc6f06b988196a3a847236a94d8166f0d745b07e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d36ecbc289d1bc2ebcba42b6cfe938692dd9b7e2b4e1adac18ac86a273d6bf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeac2ae7ead2dd1144c00709ea7162dfe1e360dc304726b031970bc989f11467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c6fa189877752f110d0f9539d08a9693872f9ee0350c63384acae1dea6afbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7767620b251e4c63c9397573c7915bd85831b833817ac6b939fc47d6304dc2bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc96ff67e107830b7e2fe8c3c7460ca9c7b22dfadab07ea69968162e9a1b4455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9358a6bba6c6f6bb98b1a268d22d96575da32a1a73003a6097a7bf1a9db92c0d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9358a6bba6c6f6bb98b1a268d22d96575da32a1a73003a6097a7bf1a9db92c0d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-20T18:30:51Z\\\",\\\"message\\\":\\\"mns:[] Mutations:[{Column:policies Mutator:insert 
Value:{GoSet:[{GoUUID:a5a72d02-1a0f-4f7f-a8c5-6923a1c4274a}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0120 18:30:51.352647 6450 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI0120 18:30:51.352764 6450 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI0120 18:30:51.352852 6450 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI0120 18:30:51.353085 6450 factory.go:1336] Added *v1.Node event handler 7\\\\nI0120 18:30:51.353196 6450 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0120 18:30:51.353707 6450 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0120 18:30:51.353849 6450 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0120 18:30:51.353898 6450 ovnkube.go:599] Stopped ovnkube\\\\nI0120 18:30:51.353933 6450 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0120 18:30:51.354047 6450 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:50Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qt89w_openshift-ovn-kubernetes(f354424d-7f22-42d6-8bd9-00e32e78c3d3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68d98d68d006df3d69e253299966bef755778e0cdab71a9ff91d2829ae761672\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c68391eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c683
91eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qt89w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:17Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.596251 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4jpbd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3791c4b7-dcef-470d-a67e-a2c0bb004436\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66lgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66lgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4jpbd\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:17Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.611279 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de86ad0ce5469489a5f3eac4940913be3a897c3ec5f48526c7396b21bca92fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/se
rving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:17Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.626708 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965d3f61a7d29b4a851d13ea906f7d4d1809f89d314cd0e9b08ba897108625f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48a81307df70a49c4cf901ede80a07616e64babebf760b7fb3f51fc71743812a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:17Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.645095 4773 
status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ddd934f-f012-4083-b5e6-b99711071621\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63daf0eeb8fcb1c4b47e205c8ea4ca5071153c573cce6d1f4638e0d44a96d47e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-acces
s-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e8755cb7894ae2c1832f3b0b8d8a6e33d3d336d00ce68a2afe004d82304dd24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sq4x7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:17Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.673430 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.673646 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 
18:31:17.673719 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.673832 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.673632 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfcadb32-181d-4e84-8078-53323164fc49\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72535f55bb65c332cd4acb4ad0bf8b94a28a2167da3bba993de6ffb1d6dca1cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volu
meMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5be06527f116faac5a7ce7b2df804239b382a394e11c56eee69fa24739faab3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff759d4098d334dfada2f6a14e8a5d363019e0da19879b5dca11ff48b6273b24\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29039ffb258d5056b02a3ef4f48c732b8573d9437d032a225b59f0ba95527796\\\",\\\"image\\\":
\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:17Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.673956 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:17Z","lastTransitionTime":"2026-01-20T18:31:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.713367 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:17Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.731496 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5sv79" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a565d2f-43a1-41f5-b7a6-85d7d0aea0a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4eb26301ab8154925399887204867f95a36003c308ac49adb8a2867ae29ac39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dls8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5sv79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:17Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.748075 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9b329af-a6e2-4ba8-b70d-f1ad0cd67671\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13ef504628d3252d01c26801cf91a4fabdfbd5d8a78e60e9666beafa709816a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af183a195b31fc8b191dc75db28863447424fb2763e1f51a5f37ce78c4b046e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05f5db3e67f63a4ed226186eec7eebf95687a59c3d742a7d29b2be3f99543a0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a1628ccb969f4cdde3babedb802c79d7bba385260dcefc8140ed674fb423da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab6fab567cc35a379572c4ea5fb0f9472f3634fcede213202390e12c5cf6f2f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0120 18:30:20.840178 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0120 18:30:20.842359 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1141381009/tls.crt::/tmp/serving-cert-1141381009/tls.key\\\\\\\"\\\\nI0120 18:30:26.279650 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0120 18:30:26.285211 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0120 18:30:26.285271 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0120 18:30:26.285327 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0120 18:30:26.285335 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0120 18:30:26.291960 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0120 18:30:26.291983 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291990 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291998 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0120 18:30:26.292002 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0120 18:30:26.292006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0120 18:30:26.292010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0120 18:30:26.292142 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0120 18:30:26.296450 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06ac78bba5a5a697d2246518bb04b7503b37c2fafd2369e5826dd02b467715d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T1
8:30:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:17Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.766880 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:17Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.776817 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.776851 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.776861 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.776880 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.776891 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:17Z","lastTransitionTime":"2026-01-20T18:31:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.777490 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gczfj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"357ca347-8fa9-4f0b-9f49-a540f14e0198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48c821d7f174268680f0e1197457126c88076eded93bbcddb5a0d60b1fa9fc80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var
/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4b9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gczfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:17Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.879824 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.879898 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.879914 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.879959 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.879979 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:17Z","lastTransitionTime":"2026-01-20T18:31:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.983405 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.983450 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.983459 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.983476 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:17 crc kubenswrapper[4773]: I0120 18:31:17.983486 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:17Z","lastTransitionTime":"2026-01-20T18:31:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.058542 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qt89w_f354424d-7f22-42d6-8bd9-00e32e78c3d3/ovnkube-controller/2.log" Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.061165 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" event={"ID":"f354424d-7f22-42d6-8bd9-00e32e78c3d3","Type":"ContainerStarted","Data":"2769f74bbb44f1c101c7e8101d7d9653de865bbf128573473d1780c9571bee67"} Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.061569 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.079214 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:18Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.086725 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.086761 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.086770 4773 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.086784 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.086793 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:18Z","lastTransitionTime":"2026-01-20T18:31:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.092448 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://443153c40f503abca43f8d62276e852dfc67a4ad50beb817bcc4e7c9f1893d4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f
799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:18Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.108532 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kjbfj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ddd5104-3112-413e-b908-2b7f336b41f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a5c1f8407893fc5542044f2a612cb402c4529818efa2a76e03ecc09dafd07d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b981
d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b981d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53e0dfb1a04051b30484f98897d86ff5adb9ef680437d01e43cd37b4a2404bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53e0dfb1a04051b30484f98897d86ff5adb9ef680437d01e43cd37b4a2404bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42063e8c09492c5670e90e5ced6405536f594cb5ae490cdacb00320e735d157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42063e8c09492c5670e90e5ced6405536f594cb5ae490cdacb00320e735d157\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kjbfj\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:18Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.121085 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bccxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"061a607e-1868-4fcf-b3ea-d51157511d41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc586816975c68b2e0607a33f40a1ef6b74f4a1267fb305584da3158ea91bdc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5980e4a9cdd846b335d853a6c7c4fa7c9862c37c4858418919eae03855d094f5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\
":\\\"2026-01-20T18:31:13Z\\\",\\\"message\\\":\\\"2026-01-20T18:30:28+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_7123b345-28cb-43bd-ab31-fbc8539d0437\\\\n2026-01-20T18:30:28+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_7123b345-28cb-43bd-ab31-fbc8539d0437 to /host/opt/cni/bin/\\\\n2026-01-20T18:30:28Z [verbose] multus-daemon started\\\\n2026-01-20T18:30:28Z [verbose] Readiness Indicator file check\\\\n2026-01-20T18:31:13Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:31:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\
\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwtw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bccxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:18Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.131296 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gbn6k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7821f5e-4734-489f-bcf9-910b875a4848\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f76276c73de548ee85dd07ca424e1f6df2c7da35e7e57880473fc1175983fb37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcp99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f830c4eb2fd242a6e650833aeafd5935d16e0
05b2faa950f6507f656293ca458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcp99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gbn6k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:18Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.142258 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"76ff7d92-1dd2-45e7-9abb-7dd442f7b958\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b665d0df790466d0543796901ca4d72cfb93cbaa3c6f751cd7283280636e2a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b8695b8d17ea9a4d8e29018b9be5f70748b76e671e9ce50ce1e4f100f5e370f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://344d1e5f1fbcdf841c83e49f3932e95f086a05522139238c2743cf27c78bb77e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cadc50138b9abc5f9f038da3a5ba2f689019c713027f9cd589b28cee9d3dcfe4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://cadc50138b9abc5f9f038da3a5ba2f689019c713027f9cd589b28cee9d3dcfe4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:18Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.163222 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee8e82f1-ac2b-4e32-ace8-c3e6edfb55b4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17efc7a160f4cb934a3927c68e65b9f1387ed13881b8c9ca676d30c0d241ace0\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b89a8e0fd81a1ccb6675b46b9a7425fec5158a53cc7e4314068fa737d0becda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f63cd12ee6d399e1a6f77a55f05a532a2de3b4cd05a5e10a93f521364473a403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d78e43bc7b09a1599f98e9e72fd0f49db94102e952a391f687d50abed277ceda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de08872865ebda9c9a4e1642d9c4d4f6c709908332ffbd091f992fad7722eca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\
\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}}},{\\\"co
ntainerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:18Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.182742 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f354424d-7f22-42d6-8bd9-00e32e78c3d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a000cfa332372a402d4d81eafc6f06b988196a3a847236a94d8166f0d745b07e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d36ecbc289d1bc2ebcba42b6cfe938692dd9b7e2b4e1adac18ac86a273d6bf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeac2ae7ead2dd1144c00709ea7162dfe1e360dc304726b031970bc989f11467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c6fa189877752f110d0f9539d08a9693872f9ee0350c63384acae1dea6afbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7767620b251e4c63c9397573c7915bd85831b833817ac6b939fc47d6304dc2bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc96ff67e107830b7e2fe8c3c7460ca9c7b22dfadab07ea69968162e9a1b4455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2769f74bbb44f1c101c7e8101d7d9653de865bbf128573473d1780c9571bee67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9358a6bba6c6f6bb98b1a268d22d96575da32a1a73003a6097a7bf1a9db92c0d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-20T18:30:51Z\\\",\\\"message\\\":\\\"mns:[] Mutations:[{Column:policies Mutator:insert Value:{GoSet:[{GoUUID:a5a72d02-1a0f-4f7f-a8c5-6923a1c4274a}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0120 
18:30:51.352647 6450 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI0120 18:30:51.352764 6450 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI0120 18:30:51.352852 6450 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI0120 18:30:51.353085 6450 factory.go:1336] Added *v1.Node event handler 7\\\\nI0120 18:30:51.353196 6450 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0120 18:30:51.353707 6450 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0120 18:30:51.353849 6450 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0120 18:30:51.353898 6450 ovnkube.go:599] Stopped ovnkube\\\\nI0120 18:30:51.353933 6450 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0120 18:30:51.354047 6450 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:50Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\
\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68d98d68d006df3d69e253299966bef755778e0cdab71a9ff91d2829ae761672\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c68391eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c68391eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qt89w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:18Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.189310 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.189347 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.189357 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.189371 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.189383 4773 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:18Z","lastTransitionTime":"2026-01-20T18:31:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.196363 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4jpbd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3791c4b7-dcef-470d-a67e-a2c0bb004436\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66lgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66lgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4jpbd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:18Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:18 crc 
kubenswrapper[4773]: I0120 18:31:18.209184 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de86ad0ce5469489a5f3eac4940913be3a897c3ec5f48526c7396b21bca92fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:18Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.264656 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965d3f61a7d29b4a851d13ea906f7d4d1809f89d314cd0e9b08ba897108625f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mount
Path\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48a81307df70a49c4cf901ede80a07616e64babebf760b7fb3f51fc71743812a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:18Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.278026 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ddd934f-f012-4083-b5e6-b99711071621\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63daf0eeb8fcb1c4b47e205c8ea4ca5071153c573cce6d1f4638e0d44a96d47e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e8755cb7894ae2c1832f3b0b8d8a6e33d3d336d
00ce68a2afe004d82304dd24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sq4x7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:18Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.289976 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfcadb32-181d-4e84-8078-53323164fc49\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72535f55bb65c332cd4acb4ad0bf8b94a28a2167da3bba993de6ffb1d6dca1cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5be06527f116faac5a7ce7b2df804239b382a394e11c56eee69fa24739faab3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff759d4098d334dfada2f6a14e8a5d363019e0da19879b5dca11ff48b6273b24\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29039ffb258d5056b02a3ef4f48c732b8573d9437d032a225b59f0ba95527796\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:18Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.291504 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.291540 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.291550 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.291568 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.291580 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:18Z","lastTransitionTime":"2026-01-20T18:31:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.305455 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:18Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.318837 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5sv79" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a565d2f-43a1-41f5-b7a6-85d7d0aea0a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4eb26301ab8154925399887204867f95a36003c308ac49adb8a2867ae29ac39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dls8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5sv79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:18Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.338116 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9b329af-a6e2-4ba8-b70d-f1ad0cd67671\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13ef504628d3252d01c26801cf91a4fabdfbd5d8a78e60e9666beafa709816a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791f
d90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af183a195b31fc8b191dc75db28863447424fb2763e1f51a5f37ce78c4b046e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05f5db3e67f63a4ed226186eec7eebf95687a59c3d742a7d29b2be3f99543a0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\"
:\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a1628ccb969f4cdde3babedb802c79d7bba385260dcefc8140ed674fb423da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab6fab567cc35a379572c4ea5fb0f9472f3634fcede213202390e12c5cf6f2f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0120 18:30:20.840178 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0120 18:30:20.842359 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1141381009/tls.crt::/tmp/serving-cert-1141381009/tls.key\\\\\\\"\\\\nI0120 18:30:26.279650 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0120 18:30:26.285211 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0120 18:30:26.285271 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0120 18:30:26.285327 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0120 18:30:26.285335 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0120 18:30:26.291960 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0120 18:30:26.291983 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291990 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291998 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0120 18:30:26.292002 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0120 18:30:26.292006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0120 18:30:26.292010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0120 18:30:26.292142 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0120 18:30:26.296450 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06ac78bba5a5a697d2246518bb04b7503b37c2fafd2369e5826dd02b467715d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T1
8:30:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:18Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.352788 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:18Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.365786 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gczfj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"357ca347-8fa9-4f0b-9f49-a540f14e0198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48c821d7f174268680f0e1197457126c88076eded93bbcddb5a0d60b1fa9fc80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4b9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gczfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:18Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.394043 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.394096 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.394108 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.394130 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.394145 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:18Z","lastTransitionTime":"2026-01-20T18:31:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.397342 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-08 15:26:34.603319382 +0000 UTC Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.446869 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.446966 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:31:18 crc kubenswrapper[4773]: E0120 18:31:18.447158 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jpbd" podUID="3791c4b7-dcef-470d-a67e-a2c0bb004436" Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.447192 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:31:18 crc kubenswrapper[4773]: E0120 18:31:18.447350 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 18:31:18 crc kubenswrapper[4773]: E0120 18:31:18.447429 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.497163 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.497236 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.497247 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.497265 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.497284 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:18Z","lastTransitionTime":"2026-01-20T18:31:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.599093 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.599135 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.599145 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.599159 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.599168 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:18Z","lastTransitionTime":"2026-01-20T18:31:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.701698 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.701747 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.701758 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.701779 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.701790 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:18Z","lastTransitionTime":"2026-01-20T18:31:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.804642 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.804674 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.804683 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.804696 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.804707 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:18Z","lastTransitionTime":"2026-01-20T18:31:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.907274 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.907336 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.907354 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.907378 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:18 crc kubenswrapper[4773]: I0120 18:31:18.907395 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:18Z","lastTransitionTime":"2026-01-20T18:31:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.010056 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.010092 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.010103 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.010117 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.010126 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:19Z","lastTransitionTime":"2026-01-20T18:31:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.066149 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qt89w_f354424d-7f22-42d6-8bd9-00e32e78c3d3/ovnkube-controller/3.log" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.067128 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qt89w_f354424d-7f22-42d6-8bd9-00e32e78c3d3/ovnkube-controller/2.log" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.069350 4773 generic.go:334] "Generic (PLEG): container finished" podID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" containerID="2769f74bbb44f1c101c7e8101d7d9653de865bbf128573473d1780c9571bee67" exitCode=1 Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.069383 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" event={"ID":"f354424d-7f22-42d6-8bd9-00e32e78c3d3","Type":"ContainerDied","Data":"2769f74bbb44f1c101c7e8101d7d9653de865bbf128573473d1780c9571bee67"} Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.069416 4773 scope.go:117] "RemoveContainer" containerID="9358a6bba6c6f6bb98b1a268d22d96575da32a1a73003a6097a7bf1a9db92c0d" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.070260 4773 scope.go:117] "RemoveContainer" containerID="2769f74bbb44f1c101c7e8101d7d9653de865bbf128573473d1780c9571bee67" Jan 20 18:31:19 crc kubenswrapper[4773]: E0120 18:31:19.070461 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-qt89w_openshift-ovn-kubernetes(f354424d-7f22-42d6-8bd9-00e32e78c3d3)\"" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" podUID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.085198 4773 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfcadb32-181d-4e84-8078-53323164fc49\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72535f55bb65c332cd4acb4ad0bf8b94a28a2167da3bba993de6ffb1d6dca1cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5be06527f116faac5a7ce7b2df804239b382a394e11c56eee69fa24739faab3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1
220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff759d4098d334dfada2f6a14e8a5d363019e0da19879b5dca11ff48b6273b24\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29039ffb258d5056b02a3ef4f48c732b8573d9437d032a225b59f0ba95527796\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controlle
r\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:19Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.099448 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:19Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.113213 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.113296 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.113322 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:19 crc 
kubenswrapper[4773]: I0120 18:31:19.113353 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.113388 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:19Z","lastTransitionTime":"2026-01-20T18:31:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.115829 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9b329af-a6e2-4ba8-b70d-f1ad0cd67671\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13ef504628d3252d01c26801cf91a4fabdfbd5d8a78e60e9666beafa709816a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af183a195b31fc8b191dc75db28863447424fb2763e1f51a5f37ce78c4b046e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05f5db3e67f63a4ed226186eec7eebf95687a59c3d742a7d29b2be3f99543a0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18
:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a1628ccb969f4cdde3babedb802c79d7bba385260dcefc8140ed674fb423da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab6fab567cc35a379572c4ea5fb0f9472f3634fcede213202390e12c5cf6f2f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0120 18:30:20.840178 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0120 18:30:20.842359 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1141381009/tls.crt::/tmp/serving-cert-1141381009/tls.key\\\\\\\"\\\\nI0120 18:30:26.279650 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0120 18:30:26.285211 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0120 18:30:26.285271 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0120 18:30:26.285327 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0120 18:30:26.285335 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0120 
18:30:26.291960 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0120 18:30:26.291983 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291990 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291998 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0120 18:30:26.292002 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0120 18:30:26.292006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0120 18:30:26.292010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0120 18:30:26.292142 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0120 18:30:26.296450 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06ac78bba5a5a697d2246518bb04b7503b37c2fafd2369e5826dd02b467715d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:19Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.129232 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:19Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.140044 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gczfj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"357ca347-8fa9-4f0b-9f49-a540f14e0198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48c821d7f174268680f0e1197457126c88076eded93bbcddb5a0d60b1fa9fc80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4b9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gczfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:19Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.149040 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5sv79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a565d2f-43a1-41f5-b7a6-85d7d0aea0a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4eb26301ab8154925399887204867f95a36003c308ac49adb8a2867ae29ac39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dls8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5sv79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:19Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.160501 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gbn6k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7821f5e-4734-489f-bcf9-910b875a4848\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f76276c73de548ee85dd07ca424e1f6df2c7da35e7e57880473fc1175983fb37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcp99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f830c4eb2fd242a6e650833aeafd5935d16e0
05b2faa950f6507f656293ca458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcp99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gbn6k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:19Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.171784 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"76ff7d92-1dd2-45e7-9abb-7dd442f7b958\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b665d0df790466d0543796901ca4d72cfb93cbaa3c6f751cd7283280636e2a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b8695b8d17ea9a4d8e29018b9be5f70748b76e671e9ce50ce1e4f100f5e370f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://344d1e5f1fbcdf841c83e49f3932e95f086a05522139238c2743cf27c78bb77e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cadc50138b9abc5f9f038da3a5ba2f689019c713027f9cd589b28cee9d3dcfe4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://cadc50138b9abc5f9f038da3a5ba2f689019c713027f9cd589b28cee9d3dcfe4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:19Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.191258 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee8e82f1-ac2b-4e32-ace8-c3e6edfb55b4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17efc7a160f4cb934a3927c68e65b9f1387ed13881b8c9ca676d30c0d241ace0\\\",\\\"image\\\":\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b89a8e0fd81a1ccb6675b46b9a7425fec5158a53cc7e4314068fa737d0becda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f63cd12ee6d399e1a6f77a55f05a532a2de3b4cd05a5e10a93f521364473a403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d78e43bc7b09a1599f98e9e72fd0f49db94102e952a391f687d50abed277ceda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de08872865ebda9c9a4e1642d9c4d4f6c709908332ffbd091f992fad7722eca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\
\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}}},{\\\"co
ntainerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:19Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.203772 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:19Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.215986 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.216007 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.216016 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.216027 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.216036 4773 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:19Z","lastTransitionTime":"2026-01-20T18:31:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.217218 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://443153c40f503abca43f8d62276e852dfc67a4ad50beb817bcc4e7c9f1893d4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\
"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:19Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.232346 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kjbfj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ddd5104-3112-413e-b908-2b7f336b41f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a5c1f8407893fc5542044f2a612cb402c4529818efa2a76e03ecc09dafd07d\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\
\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\
\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b981d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b981d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run
/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53e0dfb1a04051b30484f98897d86ff5adb9ef680437d01e43cd37b4a2404bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53e0dfb1a04051b30484f98897d86ff5adb9ef680437d01e43cd37b4a2404bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42063e8c09492c5670e90e5ced6405536f594cb5ae490cdacb00320e735d157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{
\\\"containerID\\\":\\\"cri-o://a42063e8c09492c5670e90e5ced6405536f594cb5ae490cdacb00320e735d157\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kjbfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:19Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.248027 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bccxn" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"061a607e-1868-4fcf-b3ea-d51157511d41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc586816975c68b2e0607a33f40a1ef6b74f4a1267fb305584da3158ea91bdc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5980e4a9cdd846b335d853a6c7c4fa7c9862c37c4858418919eae03855d094f5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-20T18:31:13Z\\\",\\\"message\\\":\\\"2026-01-20T18:30:28+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_7123b345-28cb-43bd-ab31-fbc8539d0437\\\\n2026-01-20T18:30:28+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_7123b345-28cb-43bd-ab31-fbc8539d0437 to /host/opt/cni/bin/\\\\n2026-01-20T18:30:28Z [verbose] multus-daemon started\\\\n2026-01-20T18:30:28Z [verbose] 
Readiness Indicator file check\\\\n2026-01-20T18:31:13Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:31:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwtw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bccxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:19Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.261363 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de86ad0ce5469489a5f3eac4940913be3a897c3ec5f48526c7396b21bca92fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:19Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.273503 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965d3f61a7d29b4a851d13ea906f7d4d1809f89d314cd0e9b08ba897108625f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://48a81307df70a49c4cf901ede80a07616e64babebf760b7fb3f51fc71743812a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:19Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.285461 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ddd934f-f012-4083-b5e6-b99711071621\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63daf0eeb8fcb1c4b47e205c8ea4ca5071153c573cce6d1f4638e0d44a96d47e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e8755cb7894ae2c1832f3b0b8d8a6e33d3d336d
00ce68a2afe004d82304dd24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sq4x7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:19Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.302993 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f354424d-7f22-42d6-8bd9-00e32e78c3d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a000cfa332372a402d4d81eafc6f06b988196a3a847236a94d8166f0d745b07e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d36ecbc289d1bc2ebcba42b6cfe938692dd9b7e2b4e1adac18ac86a273d6bf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeac2ae7ead2dd1144c00709ea7162dfe1e360dc304726b031970bc989f11467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c6fa189877752f110d0f9539d08a9693872f9ee0350c63384acae1dea6afbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7767620b251e4c63c9397573c7915bd85831b833817ac6b939fc47d6304dc2bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc96ff67e107830b7e2fe8c3c7460ca9c7b22dfadab07ea69968162e9a1b4455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2769f74bbb44f1c101c7e8101d7d9653de865bbf128573473d1780c9571bee67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9358a6bba6c6f6bb98b1a268d22d96575da32a1a73003a6097a7bf1a9db92c0d\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-20T18:30:51Z\\\",\\\"message\\\":\\\"mns:[] Mutations:[{Column:policies Mutator:insert Value:{GoSet:[{GoUUID:a5a72d02-1a0f-4f7f-a8c5-6923a1c4274a}]}}] Timeout:\\\\u003cnil\\\\u003e Where:[where column _uuid == {f6d604c1-9711-4e25-be6c-79ec28bbad1b}] Until: Durable:\\\\u003cnil\\\\u003e Comment:\\\\u003cnil\\\\u003e Lock:\\\\u003cnil\\\\u003e UUID: UUIDName:}]\\\\nI0120 
18:30:51.352647 6450 address_set.go:302] New(0d39bc5c-d5b9-432c-81be-2275bce5d7aa/default-network-controller:EgressIP:node-ips:v4:default/a712973235162149816) with []\\\\nI0120 18:30:51.352764 6450 address_set.go:302] New(aa6fc2dc-fab0-4812-b9da-809058e4dcf7/default-network-controller:EgressIP:egressip-served-pods:v4:default/a8519615025667110816) with []\\\\nI0120 18:30:51.352852 6450 address_set.go:302] New(bf133528-8652-4c84-85ff-881f0afe9837/default-network-controller:EgressService:egresssvc-served-pods:v4/a13607449821398607916) with []\\\\nI0120 18:30:51.353085 6450 factory.go:1336] Added *v1.Node event handler 7\\\\nI0120 18:30:51.353196 6450 factory.go:1336] Added *v1.EgressIP event handler 8\\\\nI0120 18:30:51.353707 6450 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0120 18:30:51.353849 6450 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0120 18:30:51.353898 6450 ovnkube.go:599] Stopped ovnkube\\\\nI0120 18:30:51.353933 6450 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0120 18:30:51.354047 6450 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:50Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2769f74bbb44f1c101c7e8101d7d9653de865bbf128573473d1780c9571bee67\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-20T18:31:18Z\\\",\\\"message\\\":\\\"ndler 3 for removal\\\\nI0120 18:31:18.345159 6816 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0120 18:31:18.345185 6816 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0120 18:31:18.345292 6816 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0120 18:31:18.345383 6816 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0120 18:31:18.345395 6816 handler.go:190] Sending 
*v1.Namespace event handler 5 for removal\\\\nI0120 18:31:18.345451 6816 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0120 18:31:18.345468 6816 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0120 18:31:18.345473 6816 factory.go:656] Stopping watch factory\\\\nI0120 18:31:18.345493 6816 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0120 18:31:18.345526 6816 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0120 18:31:18.345537 6816 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0120 18:31:18.345544 6816 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0120 18:31:18.345550 6816 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0120 18:31:18.345560 6816 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0120 18:31:18.345576 6816 handler.go:208] Removed *v1.Namespace ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:31:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\
\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68d98d68d006df3d69e253299966bef755778e0cdab71a9ff91d2829ae761672\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/
run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c68391eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c68391eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qt89w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:19Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.315428 4773 status_manager.go:875] "Failed 
to update status for pod" pod="openshift-multus/network-metrics-daemon-4jpbd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3791c4b7-dcef-470d-a67e-a2c0bb004436\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66lgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66lgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4jpbd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:19Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:19 crc 
kubenswrapper[4773]: I0120 18:31:19.317778 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.317808 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.317819 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.317835 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.317845 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:19Z","lastTransitionTime":"2026-01-20T18:31:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.398181 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-13 20:12:28.906482044 +0000 UTC Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.420456 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.420504 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.420523 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.420550 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.420569 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:19Z","lastTransitionTime":"2026-01-20T18:31:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.446659 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:31:19 crc kubenswrapper[4773]: E0120 18:31:19.446862 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.522779 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.522833 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.522844 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.522862 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.522874 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:19Z","lastTransitionTime":"2026-01-20T18:31:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.625160 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.625225 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.625259 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.625284 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.625302 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:19Z","lastTransitionTime":"2026-01-20T18:31:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.728418 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.728466 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.728488 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.728515 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.728535 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:19Z","lastTransitionTime":"2026-01-20T18:31:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.831334 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.831391 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.831412 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.831436 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.831452 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:19Z","lastTransitionTime":"2026-01-20T18:31:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.934139 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.934188 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.934202 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.934222 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:19 crc kubenswrapper[4773]: I0120 18:31:19.934235 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:19Z","lastTransitionTime":"2026-01-20T18:31:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.037428 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.037705 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.037815 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.037881 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.037962 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:20Z","lastTransitionTime":"2026-01-20T18:31:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.074546 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qt89w_f354424d-7f22-42d6-8bd9-00e32e78c3d3/ovnkube-controller/3.log" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.078866 4773 scope.go:117] "RemoveContainer" containerID="2769f74bbb44f1c101c7e8101d7d9653de865bbf128573473d1780c9571bee67" Jan 20 18:31:20 crc kubenswrapper[4773]: E0120 18:31:20.079110 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-qt89w_openshift-ovn-kubernetes(f354424d-7f22-42d6-8bd9-00e32e78c3d3)\"" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" podUID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.093907 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de86ad0ce5469489a5f3eac4940913be3a897c3ec5f48526c7396b21bca92fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:20Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.107133 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965d3f61a7d29b4a851d13ea906f7d4d1809f89d314cd0e9b08ba897108625f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://48a81307df70a49c4cf901ede80a07616e64babebf760b7fb3f51fc71743812a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:20Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.118884 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ddd934f-f012-4083-b5e6-b99711071621\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63daf0eeb8fcb1c4b47e205c8ea4ca5071153c573cce6d1f4638e0d44a96d47e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e8755cb7894ae2c1832f3b0b8d8a6e33d3d336d
00ce68a2afe004d82304dd24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sq4x7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:20Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.140284 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f354424d-7f22-42d6-8bd9-00e32e78c3d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a000cfa332372a402d4d81eafc6f06b988196a3a847236a94d8166f0d745b07e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d36ecbc289d1bc2ebcba42b6cfe938692dd9b7e2b4e1adac18ac86a273d6bf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeac2ae7ead2dd1144c00709ea7162dfe1e360dc304726b031970bc989f11467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c6fa189877752f110d0f9539d08a9693872f9ee0350c63384acae1dea6afbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7767620b251e4c63c9397573c7915bd85831b833817ac6b939fc47d6304dc2bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc96ff67e107830b7e2fe8c3c7460ca9c7b22dfadab07ea69968162e9a1b4455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2769f74bbb44f1c101c7e8101d7d9653de865bbf128573473d1780c9571bee67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2769f74bbb44f1c101c7e8101d7d9653de865bbf128573473d1780c9571bee67\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-20T18:31:18Z\\\",\\\"message\\\":\\\"ndler 3 for removal\\\\nI0120 18:31:18.345159 6816 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0120 18:31:18.345185 6816 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0120 18:31:18.345292 6816 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0120 18:31:18.345383 6816 
handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0120 18:31:18.345395 6816 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0120 18:31:18.345451 6816 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0120 18:31:18.345468 6816 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0120 18:31:18.345473 6816 factory.go:656] Stopping watch factory\\\\nI0120 18:31:18.345493 6816 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0120 18:31:18.345526 6816 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0120 18:31:18.345537 6816 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0120 18:31:18.345544 6816 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0120 18:31:18.345550 6816 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0120 18:31:18.345560 6816 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0120 18:31:18.345576 6816 handler.go:208] Removed *v1.Namespace ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:31:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qt89w_openshift-ovn-kubernetes(f354424d-7f22-42d6-8bd9-00e32e78c3d3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68d98d68d006df3d69e253299966bef755778e0cdab71a9ff91d2829ae761672\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c68391eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c683
91eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qt89w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:20Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.141249 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.141330 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.141349 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.141380 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.141402 4773 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:20Z","lastTransitionTime":"2026-01-20T18:31:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.155162 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4jpbd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3791c4b7-dcef-470d-a67e-a2c0bb004436\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66lgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66lgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4jpbd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:20Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:20 crc 
kubenswrapper[4773]: I0120 18:31:20.167390 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfcadb32-181d-4e84-8078-53323164fc49\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72535f55bb65c332cd4acb4ad0bf8b94a28a2167da3bba993de6ffb1d6dca1cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5be06527f116faac5a7ce7b2df804239b382a394e11c56eee69fa24739faab3\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff759d4098d334dfada2f6a14e8a5d363019e0da19879b5dca11ff48b6273b24\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29039ffb258d5056b02a3ef4f48c732b8573d9437d032a225b59f0ba95527796\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:20Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.184269 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:20Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.197616 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9b329af-a6e2-4ba8-b70d-f1ad0cd67671\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13ef504628d3252d01c26801cf91a4fabdfbd5d8a78e60e9666beafa709816a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af183a195b31fc8b191dc75db28863447424fb2763e1f51a5f37ce78c4b046e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05f5db3e67f63a4ed226186eec7eebf95687a59c3d742a7d29b2be3f99543a0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a1628ccb969f4cdde3babedb802c79d7bba385260dcefc8140ed674fb423da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab6fab567cc35a379572c4ea5fb0f9472f3634fcede213202390e12c5cf6f2f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-20T18:30:26Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0120 18:30:20.840178 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0120 18:30:20.842359 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1141381009/tls.crt::/tmp/serving-cert-1141381009/tls.key\\\\\\\"\\\\nI0120 18:30:26.279650 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0120 18:30:26.285211 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0120 18:30:26.285271 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0120 18:30:26.285327 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0120 18:30:26.285335 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0120 18:30:26.291960 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0120 18:30:26.291983 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291990 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291998 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0120 18:30:26.292002 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0120 18:30:26.292006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0120 18:30:26.292010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0120 18:30:26.292142 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0120 18:30:26.296450 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06ac78bba5a5a697d2246518bb04b7503b37c2fafd2369e5826dd02b467715d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727
ee9876662924585010f5fa93ee1d554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:20Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.216457 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:20Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.229182 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gczfj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"357ca347-8fa9-4f0b-9f49-a540f14e0198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48c821d7f174268680f0e1197457126c88076eded93bbcddb5a0d60b1fa9fc80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4b9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gczfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:20Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.243761 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5sv79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a565d2f-43a1-41f5-b7a6-85d7d0aea0a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4eb26301ab8154925399887204867f95a36003c308ac49adb8a2867ae29ac39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dls8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5sv79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:20Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.244706 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.244757 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.244775 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.244803 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.244821 4773 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:20Z","lastTransitionTime":"2026-01-20T18:31:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.261465 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"76ff7d92-1dd2-45e7-9abb-7dd442f7b958\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b665d0df790466d0543796901ca4d72cfb93cbaa3c6f751cd7283280636e2a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\
\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b8695b8d17ea9a4d8e29018b9be5f70748b76e671e9ce50ce1e4f100f5e370f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://344d1e5f1fbcdf841c83e49f3932e95f086a05522139238c2743cf27c78bb77e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.1
26.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cadc50138b9abc5f9f038da3a5ba2f689019c713027f9cd589b28cee9d3dcfe4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cadc50138b9abc5f9f038da3a5ba2f689019c713027f9cd589b28cee9d3dcfe4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:20Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.282610 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee8e82f1-ac2b-4e32-ace8-c3e6edfb55b4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17efc7a160f4cb934a3927c68e65b9f1387ed13881b8c9ca676d30c0d241ace0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b89a8e0fd81a1ccb6675b46b9a7425fec5158a53cc7e4314068fa737d0becda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f63cd12ee6d399e1a6f77a55f05a532a2de3b4cd05a5e10a93f521364473a403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d78e43bc7b09a1599f98e9e72fd0f49db94102e952a391f687d50abed277ceda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de08872865ebda9c9a4e1642d9c4d4f6c709908332ffbd091f992fad7722eca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:20Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.295200 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:20Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.309871 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://443153c40f503abca43f8d62276e852dfc67a4ad50beb817bcc4e7c9f1893d4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-20T18:31:20Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.327649 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kjbfj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ddd5104-3112-413e-b908-2b7f336b41f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a5c1f8407893fc5542044f2a612cb402c4529818efa2a76e03ecc09dafd07d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b981d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b981d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53e0dfb1a04051b30484f98897d86ff5adb9ef680437d01e43cd37b4a2404bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53e0dfb1a04051b30484f98897d86ff5adb9ef680437d01e43cd37b4a2404bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42063e8c09492c5670e90e5ced6405536f594cb5ae490cdacb00320e735d157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42063e8c09492c5670e90e5ced6405536f594cb5ae490cdacb00320e735d157\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kjbfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:20Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.346332 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bccxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"061a607e-1868-4fcf-b3ea-d51157511d41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc586816975c68b2e0607a33f40a1ef6b74f4a1267fb305584da3158ea91bdc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5980e4a9cdd846b335d853a6c7c4fa7c9862c37c4858418919eae03855d094f5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-20T18:31:13Z\\\",\\\"message\\\":\\\"2026-01-20T18:30:28+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_7123b345-28cb-43bd-ab31-fbc8539d0437\\\\n2026-01-20T18:30:28+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_7123b345-28cb-43bd-ab31-fbc8539d0437 to /host/opt/cni/bin/\\\\n2026-01-20T18:30:28Z [verbose] multus-daemon started\\\\n2026-01-20T18:30:28Z [verbose] Readiness Indicator file check\\\\n2026-01-20T18:31:13Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the 
condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:31:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwtw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"p
hase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bccxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:20Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.347781 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.347816 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.347829 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.347849 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.347863 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:20Z","lastTransitionTime":"2026-01-20T18:31:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.358566 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gbn6k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7821f5e-4734-489f-bcf9-910b875a4848\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f76276c73de548ee85dd07ca424e1f6df2c7da35e7e57880473fc1175983fb37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcp99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f830c4eb2fd242a6e650833aeafd5935d16e005b2faa950f6507f656293ca458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcp99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gbn6k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:20Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.399295 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation 
deadline is 2025-12-23 13:00:35.385821228 +0000 UTC Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.446812 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.446893 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:31:20 crc kubenswrapper[4773]: E0120 18:31:20.447014 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 18:31:20 crc kubenswrapper[4773]: E0120 18:31:20.447170 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jpbd" podUID="3791c4b7-dcef-470d-a67e-a2c0bb004436" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.446816 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:31:20 crc kubenswrapper[4773]: E0120 18:31:20.447357 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.450619 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.450642 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.450651 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.450665 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.450676 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:20Z","lastTransitionTime":"2026-01-20T18:31:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.552835 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.552871 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.552882 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.552899 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.552910 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:20Z","lastTransitionTime":"2026-01-20T18:31:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.655513 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.655553 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.655561 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.655574 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.655584 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:20Z","lastTransitionTime":"2026-01-20T18:31:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.758112 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.758147 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.758157 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.758169 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.758179 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:20Z","lastTransitionTime":"2026-01-20T18:31:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.860321 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.860358 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.860369 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.860386 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.860400 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:20Z","lastTransitionTime":"2026-01-20T18:31:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.962777 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.962839 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.962856 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.962877 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:20 crc kubenswrapper[4773]: I0120 18:31:20.962891 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:20Z","lastTransitionTime":"2026-01-20T18:31:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:21 crc kubenswrapper[4773]: I0120 18:31:21.065593 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:21 crc kubenswrapper[4773]: I0120 18:31:21.065666 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:21 crc kubenswrapper[4773]: I0120 18:31:21.065689 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:21 crc kubenswrapper[4773]: I0120 18:31:21.065718 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:21 crc kubenswrapper[4773]: I0120 18:31:21.065739 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:21Z","lastTransitionTime":"2026-01-20T18:31:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:21 crc kubenswrapper[4773]: I0120 18:31:21.168337 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:21 crc kubenswrapper[4773]: I0120 18:31:21.168371 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:21 crc kubenswrapper[4773]: I0120 18:31:21.168379 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:21 crc kubenswrapper[4773]: I0120 18:31:21.168395 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:21 crc kubenswrapper[4773]: I0120 18:31:21.168405 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:21Z","lastTransitionTime":"2026-01-20T18:31:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:21 crc kubenswrapper[4773]: I0120 18:31:21.270610 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:21 crc kubenswrapper[4773]: I0120 18:31:21.270671 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:21 crc kubenswrapper[4773]: I0120 18:31:21.270690 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:21 crc kubenswrapper[4773]: I0120 18:31:21.270713 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:21 crc kubenswrapper[4773]: I0120 18:31:21.270730 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:21Z","lastTransitionTime":"2026-01-20T18:31:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:21 crc kubenswrapper[4773]: I0120 18:31:21.373789 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:21 crc kubenswrapper[4773]: I0120 18:31:21.374045 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:21 crc kubenswrapper[4773]: I0120 18:31:21.374063 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:21 crc kubenswrapper[4773]: I0120 18:31:21.374081 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:21 crc kubenswrapper[4773]: I0120 18:31:21.374096 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:21Z","lastTransitionTime":"2026-01-20T18:31:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:21 crc kubenswrapper[4773]: I0120 18:31:21.400250 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-22 10:15:42.826491996 +0000 UTC Jan 20 18:31:21 crc kubenswrapper[4773]: I0120 18:31:21.446314 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:31:21 crc kubenswrapper[4773]: E0120 18:31:21.446453 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 18:31:21 crc kubenswrapper[4773]: I0120 18:31:21.476026 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:21 crc kubenswrapper[4773]: I0120 18:31:21.476094 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:21 crc kubenswrapper[4773]: I0120 18:31:21.476114 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:21 crc kubenswrapper[4773]: I0120 18:31:21.476137 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:21 crc kubenswrapper[4773]: I0120 18:31:21.476184 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:21Z","lastTransitionTime":"2026-01-20T18:31:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:21 crc kubenswrapper[4773]: I0120 18:31:21.578655 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:21 crc kubenswrapper[4773]: I0120 18:31:21.578717 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:21 crc kubenswrapper[4773]: I0120 18:31:21.578736 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:21 crc kubenswrapper[4773]: I0120 18:31:21.578760 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:21 crc kubenswrapper[4773]: I0120 18:31:21.578778 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:21Z","lastTransitionTime":"2026-01-20T18:31:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:21 crc kubenswrapper[4773]: I0120 18:31:21.681816 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:21 crc kubenswrapper[4773]: I0120 18:31:21.681871 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:21 crc kubenswrapper[4773]: I0120 18:31:21.681888 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:21 crc kubenswrapper[4773]: I0120 18:31:21.681910 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:21 crc kubenswrapper[4773]: I0120 18:31:21.681926 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:21Z","lastTransitionTime":"2026-01-20T18:31:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:21 crc kubenswrapper[4773]: I0120 18:31:21.785021 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:21 crc kubenswrapper[4773]: I0120 18:31:21.785055 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:21 crc kubenswrapper[4773]: I0120 18:31:21.785064 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:21 crc kubenswrapper[4773]: I0120 18:31:21.785077 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:21 crc kubenswrapper[4773]: I0120 18:31:21.785086 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:21Z","lastTransitionTime":"2026-01-20T18:31:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:21 crc kubenswrapper[4773]: I0120 18:31:21.888219 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:21 crc kubenswrapper[4773]: I0120 18:31:21.888283 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:21 crc kubenswrapper[4773]: I0120 18:31:21.888302 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:21 crc kubenswrapper[4773]: I0120 18:31:21.888329 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:21 crc kubenswrapper[4773]: I0120 18:31:21.888354 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:21Z","lastTransitionTime":"2026-01-20T18:31:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:21 crc kubenswrapper[4773]: I0120 18:31:21.990742 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:21 crc kubenswrapper[4773]: I0120 18:31:21.990798 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:21 crc kubenswrapper[4773]: I0120 18:31:21.990815 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:21 crc kubenswrapper[4773]: I0120 18:31:21.990838 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:21 crc kubenswrapper[4773]: I0120 18:31:21.990854 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:21Z","lastTransitionTime":"2026-01-20T18:31:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:22 crc kubenswrapper[4773]: I0120 18:31:22.093051 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:22 crc kubenswrapper[4773]: I0120 18:31:22.093132 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:22 crc kubenswrapper[4773]: I0120 18:31:22.093151 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:22 crc kubenswrapper[4773]: I0120 18:31:22.093174 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:22 crc kubenswrapper[4773]: I0120 18:31:22.093191 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:22Z","lastTransitionTime":"2026-01-20T18:31:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:22 crc kubenswrapper[4773]: I0120 18:31:22.195993 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:22 crc kubenswrapper[4773]: I0120 18:31:22.196046 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:22 crc kubenswrapper[4773]: I0120 18:31:22.196062 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:22 crc kubenswrapper[4773]: I0120 18:31:22.196126 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:22 crc kubenswrapper[4773]: I0120 18:31:22.196144 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:22Z","lastTransitionTime":"2026-01-20T18:31:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:22 crc kubenswrapper[4773]: I0120 18:31:22.298587 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:22 crc kubenswrapper[4773]: I0120 18:31:22.298620 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:22 crc kubenswrapper[4773]: I0120 18:31:22.298631 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:22 crc kubenswrapper[4773]: I0120 18:31:22.298645 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:22 crc kubenswrapper[4773]: I0120 18:31:22.298656 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:22Z","lastTransitionTime":"2026-01-20T18:31:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:22 crc kubenswrapper[4773]: I0120 18:31:22.400358 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-17 21:36:05.448689653 +0000 UTC Jan 20 18:31:22 crc kubenswrapper[4773]: I0120 18:31:22.401894 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:22 crc kubenswrapper[4773]: I0120 18:31:22.401963 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:22 crc kubenswrapper[4773]: I0120 18:31:22.401976 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:22 crc kubenswrapper[4773]: I0120 18:31:22.401994 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:22 crc kubenswrapper[4773]: I0120 18:31:22.402005 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:22Z","lastTransitionTime":"2026-01-20T18:31:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:22 crc kubenswrapper[4773]: I0120 18:31:22.447083 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:31:22 crc kubenswrapper[4773]: I0120 18:31:22.447238 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:31:22 crc kubenswrapper[4773]: I0120 18:31:22.447353 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:31:22 crc kubenswrapper[4773]: E0120 18:31:22.447262 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jpbd" podUID="3791c4b7-dcef-470d-a67e-a2c0bb004436" Jan 20 18:31:22 crc kubenswrapper[4773]: E0120 18:31:22.447479 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 18:31:22 crc kubenswrapper[4773]: E0120 18:31:22.447532 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 18:31:22 crc kubenswrapper[4773]: I0120 18:31:22.504974 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:22 crc kubenswrapper[4773]: I0120 18:31:22.505029 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:22 crc kubenswrapper[4773]: I0120 18:31:22.505042 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:22 crc kubenswrapper[4773]: I0120 18:31:22.505060 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:22 crc kubenswrapper[4773]: I0120 18:31:22.505072 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:22Z","lastTransitionTime":"2026-01-20T18:31:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:22 crc kubenswrapper[4773]: I0120 18:31:22.608168 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:22 crc kubenswrapper[4773]: I0120 18:31:22.608242 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:22 crc kubenswrapper[4773]: I0120 18:31:22.608261 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:22 crc kubenswrapper[4773]: I0120 18:31:22.608285 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:22 crc kubenswrapper[4773]: I0120 18:31:22.608302 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:22Z","lastTransitionTime":"2026-01-20T18:31:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:22 crc kubenswrapper[4773]: I0120 18:31:22.711469 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:22 crc kubenswrapper[4773]: I0120 18:31:22.711540 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:22 crc kubenswrapper[4773]: I0120 18:31:22.711559 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:22 crc kubenswrapper[4773]: I0120 18:31:22.711585 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:22 crc kubenswrapper[4773]: I0120 18:31:22.711603 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:22Z","lastTransitionTime":"2026-01-20T18:31:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:22 crc kubenswrapper[4773]: I0120 18:31:22.815145 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:22 crc kubenswrapper[4773]: I0120 18:31:22.815194 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:22 crc kubenswrapper[4773]: I0120 18:31:22.815205 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:22 crc kubenswrapper[4773]: I0120 18:31:22.815222 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:22 crc kubenswrapper[4773]: I0120 18:31:22.815232 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:22Z","lastTransitionTime":"2026-01-20T18:31:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:22 crc kubenswrapper[4773]: I0120 18:31:22.918702 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:22 crc kubenswrapper[4773]: I0120 18:31:22.918754 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:22 crc kubenswrapper[4773]: I0120 18:31:22.918766 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:22 crc kubenswrapper[4773]: I0120 18:31:22.918785 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:22 crc kubenswrapper[4773]: I0120 18:31:22.918796 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:22Z","lastTransitionTime":"2026-01-20T18:31:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:23 crc kubenswrapper[4773]: I0120 18:31:23.022626 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:23 crc kubenswrapper[4773]: I0120 18:31:23.022702 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:23 crc kubenswrapper[4773]: I0120 18:31:23.022719 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:23 crc kubenswrapper[4773]: I0120 18:31:23.022762 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:23 crc kubenswrapper[4773]: I0120 18:31:23.022800 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:23Z","lastTransitionTime":"2026-01-20T18:31:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:23 crc kubenswrapper[4773]: I0120 18:31:23.125403 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:23 crc kubenswrapper[4773]: I0120 18:31:23.125464 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:23 crc kubenswrapper[4773]: I0120 18:31:23.125481 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:23 crc kubenswrapper[4773]: I0120 18:31:23.125502 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:23 crc kubenswrapper[4773]: I0120 18:31:23.125518 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:23Z","lastTransitionTime":"2026-01-20T18:31:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:23 crc kubenswrapper[4773]: I0120 18:31:23.228606 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:23 crc kubenswrapper[4773]: I0120 18:31:23.228669 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:23 crc kubenswrapper[4773]: I0120 18:31:23.228680 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:23 crc kubenswrapper[4773]: I0120 18:31:23.228705 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:23 crc kubenswrapper[4773]: I0120 18:31:23.228720 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:23Z","lastTransitionTime":"2026-01-20T18:31:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:23 crc kubenswrapper[4773]: I0120 18:31:23.331706 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:23 crc kubenswrapper[4773]: I0120 18:31:23.331748 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:23 crc kubenswrapper[4773]: I0120 18:31:23.331758 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:23 crc kubenswrapper[4773]: I0120 18:31:23.331772 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:23 crc kubenswrapper[4773]: I0120 18:31:23.331782 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:23Z","lastTransitionTime":"2026-01-20T18:31:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:23 crc kubenswrapper[4773]: I0120 18:31:23.400743 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-26 01:09:36.77589273 +0000 UTC Jan 20 18:31:23 crc kubenswrapper[4773]: I0120 18:31:23.433752 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:23 crc kubenswrapper[4773]: I0120 18:31:23.433801 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:23 crc kubenswrapper[4773]: I0120 18:31:23.433812 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:23 crc kubenswrapper[4773]: I0120 18:31:23.433829 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:23 crc kubenswrapper[4773]: I0120 18:31:23.433840 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:23Z","lastTransitionTime":"2026-01-20T18:31:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:23 crc kubenswrapper[4773]: I0120 18:31:23.446616 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:31:23 crc kubenswrapper[4773]: E0120 18:31:23.446828 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 18:31:23 crc kubenswrapper[4773]: I0120 18:31:23.537067 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:23 crc kubenswrapper[4773]: I0120 18:31:23.537121 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:23 crc kubenswrapper[4773]: I0120 18:31:23.537132 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:23 crc kubenswrapper[4773]: I0120 18:31:23.537151 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:23 crc kubenswrapper[4773]: I0120 18:31:23.537163 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:23Z","lastTransitionTime":"2026-01-20T18:31:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:23 crc kubenswrapper[4773]: I0120 18:31:23.639756 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:23 crc kubenswrapper[4773]: I0120 18:31:23.639818 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:23 crc kubenswrapper[4773]: I0120 18:31:23.639836 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:23 crc kubenswrapper[4773]: I0120 18:31:23.639858 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:23 crc kubenswrapper[4773]: I0120 18:31:23.639875 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:23Z","lastTransitionTime":"2026-01-20T18:31:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:23 crc kubenswrapper[4773]: I0120 18:31:23.742520 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:23 crc kubenswrapper[4773]: I0120 18:31:23.742551 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:23 crc kubenswrapper[4773]: I0120 18:31:23.742559 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:23 crc kubenswrapper[4773]: I0120 18:31:23.742572 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:23 crc kubenswrapper[4773]: I0120 18:31:23.742580 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:23Z","lastTransitionTime":"2026-01-20T18:31:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:23 crc kubenswrapper[4773]: I0120 18:31:23.845325 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:23 crc kubenswrapper[4773]: I0120 18:31:23.845353 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:23 crc kubenswrapper[4773]: I0120 18:31:23.845361 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:23 crc kubenswrapper[4773]: I0120 18:31:23.845373 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:23 crc kubenswrapper[4773]: I0120 18:31:23.845381 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:23Z","lastTransitionTime":"2026-01-20T18:31:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:23 crc kubenswrapper[4773]: I0120 18:31:23.947854 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:23 crc kubenswrapper[4773]: I0120 18:31:23.947883 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:23 crc kubenswrapper[4773]: I0120 18:31:23.947891 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:23 crc kubenswrapper[4773]: I0120 18:31:23.947903 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:23 crc kubenswrapper[4773]: I0120 18:31:23.947912 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:23Z","lastTransitionTime":"2026-01-20T18:31:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.050828 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.050861 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.050870 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.050886 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.050895 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:24Z","lastTransitionTime":"2026-01-20T18:31:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.153814 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.153864 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.153877 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.153894 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.153907 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:24Z","lastTransitionTime":"2026-01-20T18:31:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.256708 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.256767 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.256788 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.256812 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.256833 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:24Z","lastTransitionTime":"2026-01-20T18:31:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.359748 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.359816 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.359835 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.359862 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.359883 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:24Z","lastTransitionTime":"2026-01-20T18:31:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.401212 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 14:56:22.267479221 +0000 UTC Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.446271 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.446297 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.446303 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:31:24 crc kubenswrapper[4773]: E0120 18:31:24.446502 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jpbd" podUID="3791c4b7-dcef-470d-a67e-a2c0bb004436" Jan 20 18:31:24 crc kubenswrapper[4773]: E0120 18:31:24.446627 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 18:31:24 crc kubenswrapper[4773]: E0120 18:31:24.446760 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.454628 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.454655 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.454664 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.454676 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.454685 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:24Z","lastTransitionTime":"2026-01-20T18:31:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:24 crc kubenswrapper[4773]: E0120 18:31:24.476900 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ac96020-64a6-43b4-8bf4-975de5898510\\\",\\\"systemUUID\\\":\\\"3435f284-a40d-4f32-a1fa-55cd3339f30e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:24Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.481196 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.481275 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.481311 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.481329 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.481340 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:24Z","lastTransitionTime":"2026-01-20T18:31:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.505901 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.505993 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.506013 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.506037 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.506055 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:24Z","lastTransitionTime":"2026-01-20T18:31:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:24 crc kubenswrapper[4773]: E0120 18:31:24.526052 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ac96020-64a6-43b4-8bf4-975de5898510\\\",\\\"systemUUID\\\":\\\"3435f284-a40d-4f32-a1fa-55cd3339f30e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:24Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.535734 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.535778 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.535849 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.535894 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.535915 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:24Z","lastTransitionTime":"2026-01-20T18:31:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:24 crc kubenswrapper[4773]: E0120 18:31:24.558192 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ac96020-64a6-43b4-8bf4-975de5898510\\\",\\\"systemUUID\\\":\\\"3435f284-a40d-4f32-a1fa-55cd3339f30e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:24Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.562991 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.563057 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.563074 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.563394 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.563430 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:24Z","lastTransitionTime":"2026-01-20T18:31:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:24 crc kubenswrapper[4773]: E0120 18:31:24.581588 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ac96020-64a6-43b4-8bf4-975de5898510\\\",\\\"systemUUID\\\":\\\"3435f284-a40d-4f32-a1fa-55cd3339f30e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:24Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:24 crc kubenswrapper[4773]: E0120 18:31:24.581707 4773 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.583967 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.584001 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.584011 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.584024 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.584034 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:24Z","lastTransitionTime":"2026-01-20T18:31:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.687588 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.687651 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.687662 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.687678 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.687690 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:24Z","lastTransitionTime":"2026-01-20T18:31:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.790256 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.790336 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.790353 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.790381 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.790398 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:24Z","lastTransitionTime":"2026-01-20T18:31:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.893269 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.893306 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.893317 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.893333 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.893345 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:24Z","lastTransitionTime":"2026-01-20T18:31:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.996256 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.996291 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.996301 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.996318 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:24 crc kubenswrapper[4773]: I0120 18:31:24.996329 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:24Z","lastTransitionTime":"2026-01-20T18:31:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:25 crc kubenswrapper[4773]: I0120 18:31:25.098441 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:25 crc kubenswrapper[4773]: I0120 18:31:25.098500 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:25 crc kubenswrapper[4773]: I0120 18:31:25.098511 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:25 crc kubenswrapper[4773]: I0120 18:31:25.098530 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:25 crc kubenswrapper[4773]: I0120 18:31:25.098845 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:25Z","lastTransitionTime":"2026-01-20T18:31:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:25 crc kubenswrapper[4773]: I0120 18:31:25.201375 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:25 crc kubenswrapper[4773]: I0120 18:31:25.201427 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:25 crc kubenswrapper[4773]: I0120 18:31:25.201444 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:25 crc kubenswrapper[4773]: I0120 18:31:25.201471 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:25 crc kubenswrapper[4773]: I0120 18:31:25.201491 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:25Z","lastTransitionTime":"2026-01-20T18:31:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:25 crc kubenswrapper[4773]: I0120 18:31:25.303807 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:25 crc kubenswrapper[4773]: I0120 18:31:25.303871 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:25 crc kubenswrapper[4773]: I0120 18:31:25.303889 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:25 crc kubenswrapper[4773]: I0120 18:31:25.303913 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:25 crc kubenswrapper[4773]: I0120 18:31:25.303963 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:25Z","lastTransitionTime":"2026-01-20T18:31:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:25 crc kubenswrapper[4773]: I0120 18:31:25.401844 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-14 12:48:57.815322028 +0000 UTC Jan 20 18:31:25 crc kubenswrapper[4773]: I0120 18:31:25.408701 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:25 crc kubenswrapper[4773]: I0120 18:31:25.409561 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:25 crc kubenswrapper[4773]: I0120 18:31:25.409725 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:25 crc kubenswrapper[4773]: I0120 18:31:25.409874 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:25 crc kubenswrapper[4773]: I0120 18:31:25.410030 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:25Z","lastTransitionTime":"2026-01-20T18:31:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:25 crc kubenswrapper[4773]: I0120 18:31:25.446768 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:31:25 crc kubenswrapper[4773]: E0120 18:31:25.446991 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 18:31:25 crc kubenswrapper[4773]: I0120 18:31:25.513128 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:25 crc kubenswrapper[4773]: I0120 18:31:25.513181 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:25 crc kubenswrapper[4773]: I0120 18:31:25.513197 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:25 crc kubenswrapper[4773]: I0120 18:31:25.513222 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:25 crc kubenswrapper[4773]: I0120 18:31:25.513240 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:25Z","lastTransitionTime":"2026-01-20T18:31:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:25 crc kubenswrapper[4773]: I0120 18:31:25.615965 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:25 crc kubenswrapper[4773]: I0120 18:31:25.616269 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:25 crc kubenswrapper[4773]: I0120 18:31:25.616452 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:25 crc kubenswrapper[4773]: I0120 18:31:25.616629 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:25 crc kubenswrapper[4773]: I0120 18:31:25.616826 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:25Z","lastTransitionTime":"2026-01-20T18:31:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:25 crc kubenswrapper[4773]: I0120 18:31:25.720451 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:25 crc kubenswrapper[4773]: I0120 18:31:25.720896 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:25 crc kubenswrapper[4773]: I0120 18:31:25.721130 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:25 crc kubenswrapper[4773]: I0120 18:31:25.721320 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:25 crc kubenswrapper[4773]: I0120 18:31:25.721502 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:25Z","lastTransitionTime":"2026-01-20T18:31:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:25 crc kubenswrapper[4773]: I0120 18:31:25.824328 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:25 crc kubenswrapper[4773]: I0120 18:31:25.824363 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:25 crc kubenswrapper[4773]: I0120 18:31:25.824372 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:25 crc kubenswrapper[4773]: I0120 18:31:25.824386 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:25 crc kubenswrapper[4773]: I0120 18:31:25.824394 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:25Z","lastTransitionTime":"2026-01-20T18:31:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:25 crc kubenswrapper[4773]: I0120 18:31:25.927305 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:25 crc kubenswrapper[4773]: I0120 18:31:25.927380 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:25 crc kubenswrapper[4773]: I0120 18:31:25.927397 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:25 crc kubenswrapper[4773]: I0120 18:31:25.927418 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:25 crc kubenswrapper[4773]: I0120 18:31:25.927434 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:25Z","lastTransitionTime":"2026-01-20T18:31:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:26 crc kubenswrapper[4773]: I0120 18:31:26.029300 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:26 crc kubenswrapper[4773]: I0120 18:31:26.029343 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:26 crc kubenswrapper[4773]: I0120 18:31:26.029356 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:26 crc kubenswrapper[4773]: I0120 18:31:26.029374 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:26 crc kubenswrapper[4773]: I0120 18:31:26.029388 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:26Z","lastTransitionTime":"2026-01-20T18:31:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:26 crc kubenswrapper[4773]: I0120 18:31:26.131547 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:26 crc kubenswrapper[4773]: I0120 18:31:26.131813 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:26 crc kubenswrapper[4773]: I0120 18:31:26.131966 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:26 crc kubenswrapper[4773]: I0120 18:31:26.132062 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:26 crc kubenswrapper[4773]: I0120 18:31:26.132161 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:26Z","lastTransitionTime":"2026-01-20T18:31:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:26 crc kubenswrapper[4773]: I0120 18:31:26.235355 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:26 crc kubenswrapper[4773]: I0120 18:31:26.235814 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:26 crc kubenswrapper[4773]: I0120 18:31:26.236072 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:26 crc kubenswrapper[4773]: I0120 18:31:26.236295 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:26 crc kubenswrapper[4773]: I0120 18:31:26.236477 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:26Z","lastTransitionTime":"2026-01-20T18:31:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:26 crc kubenswrapper[4773]: I0120 18:31:26.339144 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:26 crc kubenswrapper[4773]: I0120 18:31:26.339487 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:26 crc kubenswrapper[4773]: I0120 18:31:26.339573 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:26 crc kubenswrapper[4773]: I0120 18:31:26.339664 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:26 crc kubenswrapper[4773]: I0120 18:31:26.339752 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:26Z","lastTransitionTime":"2026-01-20T18:31:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:26 crc kubenswrapper[4773]: I0120 18:31:26.402053 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-08 18:46:55.989217178 +0000 UTC Jan 20 18:31:26 crc kubenswrapper[4773]: I0120 18:31:26.442529 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:26 crc kubenswrapper[4773]: I0120 18:31:26.442778 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:26 crc kubenswrapper[4773]: I0120 18:31:26.442849 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:26 crc kubenswrapper[4773]: I0120 18:31:26.442916 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:26 crc kubenswrapper[4773]: I0120 18:31:26.443001 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:26Z","lastTransitionTime":"2026-01-20T18:31:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:26 crc kubenswrapper[4773]: I0120 18:31:26.447079 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:31:26 crc kubenswrapper[4773]: I0120 18:31:26.447079 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:31:26 crc kubenswrapper[4773]: E0120 18:31:26.447338 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 18:31:26 crc kubenswrapper[4773]: I0120 18:31:26.447173 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:31:26 crc kubenswrapper[4773]: E0120 18:31:26.447618 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 18:31:26 crc kubenswrapper[4773]: E0120 18:31:26.447501 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4jpbd" podUID="3791c4b7-dcef-470d-a67e-a2c0bb004436" Jan 20 18:31:26 crc kubenswrapper[4773]: I0120 18:31:26.545748 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:26 crc kubenswrapper[4773]: I0120 18:31:26.546036 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:26 crc kubenswrapper[4773]: I0120 18:31:26.546113 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:26 crc kubenswrapper[4773]: I0120 18:31:26.546182 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:26 crc kubenswrapper[4773]: I0120 18:31:26.546253 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:26Z","lastTransitionTime":"2026-01-20T18:31:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:26 crc kubenswrapper[4773]: I0120 18:31:26.648606 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:26 crc kubenswrapper[4773]: I0120 18:31:26.648648 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:26 crc kubenswrapper[4773]: I0120 18:31:26.648662 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:26 crc kubenswrapper[4773]: I0120 18:31:26.648678 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:26 crc kubenswrapper[4773]: I0120 18:31:26.648690 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:26Z","lastTransitionTime":"2026-01-20T18:31:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:26 crc kubenswrapper[4773]: I0120 18:31:26.750662 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:26 crc kubenswrapper[4773]: I0120 18:31:26.750695 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:26 crc kubenswrapper[4773]: I0120 18:31:26.750703 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:26 crc kubenswrapper[4773]: I0120 18:31:26.750716 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:26 crc kubenswrapper[4773]: I0120 18:31:26.750724 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:26Z","lastTransitionTime":"2026-01-20T18:31:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:26 crc kubenswrapper[4773]: I0120 18:31:26.853456 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:26 crc kubenswrapper[4773]: I0120 18:31:26.853500 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:26 crc kubenswrapper[4773]: I0120 18:31:26.853511 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:26 crc kubenswrapper[4773]: I0120 18:31:26.853528 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:26 crc kubenswrapper[4773]: I0120 18:31:26.853539 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:26Z","lastTransitionTime":"2026-01-20T18:31:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:26 crc kubenswrapper[4773]: I0120 18:31:26.956609 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:26 crc kubenswrapper[4773]: I0120 18:31:26.956676 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:26 crc kubenswrapper[4773]: I0120 18:31:26.956719 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:26 crc kubenswrapper[4773]: I0120 18:31:26.956745 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:26 crc kubenswrapper[4773]: I0120 18:31:26.956762 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:26Z","lastTransitionTime":"2026-01-20T18:31:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.059647 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.059696 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.059714 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.059738 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.059758 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:27Z","lastTransitionTime":"2026-01-20T18:31:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.162849 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.162914 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.162960 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.162985 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.163003 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:27Z","lastTransitionTime":"2026-01-20T18:31:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.265487 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.265522 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.265531 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.265543 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.265554 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:27Z","lastTransitionTime":"2026-01-20T18:31:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.367478 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.367706 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.367771 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.367834 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.367894 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:27Z","lastTransitionTime":"2026-01-20T18:31:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.402257 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-19 16:03:06.719857221 +0000 UTC Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.447064 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:31:27 crc kubenswrapper[4773]: E0120 18:31:27.447171 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.467556 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"bfcadb32-181d-4e84-8078-53323164fc49\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://72535f55bb65c332cd4acb4ad0bf8b94a28a2167da3bba993de6ffb1d6dca1cf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c5be06527f116faac5a7ce7b2df804239b382a394e11c56eee69fa24739faab3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff759d4098d334dfada2f6a14e8a5d363019e0da19879b5dca11ff48b6273b24\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath
\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://29039ffb258d5056b02a3ef4f48c732b8573d9437d032a225b59f0ba95527796\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:27Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.471113 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.471309 4773 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.471765 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.472038 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.472215 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:27Z","lastTransitionTime":"2026-01-20T18:31:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.487848 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:27Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.511095 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9b329af-a6e2-4ba8-b70d-f1ad0cd67671\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13ef504628d3252d01c26801cf91a4fabdfbd5d8a78e60e9666beafa709816a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af183a195b31fc8b191dc75db28863447424fb2763e1f51a5f37ce78c4b046e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05f5db3e67f63a4ed226186eec7eebf95687a59c3d742a7d29b2be3f99543a0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a1628ccb969f4cdde3babedb802c79d7bba385260dcefc8140ed674fb423da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab6fab567cc35a379572c4ea5fb0f9472f3634fcede213202390e12c5cf6f2f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-20T18:30:26Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0120 18:30:20.840178 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0120 18:30:20.842359 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1141381009/tls.crt::/tmp/serving-cert-1141381009/tls.key\\\\\\\"\\\\nI0120 18:30:26.279650 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0120 18:30:26.285211 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0120 18:30:26.285271 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0120 18:30:26.285327 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0120 18:30:26.285335 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0120 18:30:26.291960 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0120 18:30:26.291983 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291990 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291998 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0120 18:30:26.292002 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0120 18:30:26.292006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0120 18:30:26.292010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0120 18:30:26.292142 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0120 18:30:26.296450 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06ac78bba5a5a697d2246518bb04b7503b37c2fafd2369e5826dd02b467715d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727
ee9876662924585010f5fa93ee1d554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:27Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.531096 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:27Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.546687 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gczfj" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"357ca347-8fa9-4f0b-9f49-a540f14e0198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48c821d7f174268680f0e1197457126c88076eded93bbcddb5a0d60b1fa9fc80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4b9d\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gczfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:27Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.562192 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5sv79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a565d2f-43a1-41f5-b7a6-85d7d0aea0a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4eb26301ab8154925399887204867f95a36003c308ac49adb8a2867ae29ac39\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a695
20ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dls8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5sv79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:27Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.576781 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.576849 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.576873 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.576973 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.577003 4773 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:27Z","lastTransitionTime":"2026-01-20T18:31:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.580463 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gbn6k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d7821f5e-4734-489f-bcf9-910b875a4848\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f76276c73de548ee85dd07ca424e1f6df2c7da35e7e57880473fc1175983fb37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
e-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcp99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f830c4eb2fd242a6e650833aeafd5935d16e005b2faa950f6507f656293ca458\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcp99\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:40Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-gbn6k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:27Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.598007 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"76ff7d92-1dd2-45e7-9abb-7dd442f7b958\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b665d0df790466d0543796901ca4d72cfb93cbaa3c6f751cd7283280636e2a5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"
},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9b8695b8d17ea9a4d8e29018b9be5f70748b76e671e9ce50ce1e4f100f5e370f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://344d1e5f1fbcdf841c83e49f3932e95f086a05522139238c2743cf27c78bb77e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cadc50138b9abc5f9f038da3a5ba2f689019c713027f9cd589b28cee9d3dcfe4\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cadc50138b9abc5f9f038da3a5ba2f689019c713027f9cd589b28cee9d3dcfe4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:27Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.617585 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee8e82f1-ac2b-4e32-ace8-c3e6edfb55b4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17efc7a160f4cb934a3927c68e65b9f1387ed13881b8c9ca676d30c0d241ace0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b89a8e0fd81a1ccb6675b46b9a7425fec5158a53cc7e4314068fa737d0becda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f63cd12ee6d399e1a6f77a55f05a532a2de3b4cd05a5e10a93f521364473a403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d78e43bc7b09a1599f98e9e72fd0f49db94102e952a391f687d50abed277ceda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de08872865ebda9c9a4e1642d9c4d4f6c709908332ffbd091f992fad7722eca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:27Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.630250 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:27Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.642691 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://443153c40f503abca43f8d62276e852dfc67a4ad50beb817bcc4e7c9f1893d4e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-20T18:31:27Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.659711 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-kjbfj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7ddd5104-3112-413e-b908-2b7f336b41f1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:33Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://52a5c1f8407893fc5542044f2a612cb402c4529818efa2a76e03ecc09dafd07d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-a
ccess-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb38f7b2fdfdc0b646bc9108bfe5c99f013b349cb8a53c3fce31a3fc2644c745\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1db0e55e8d032975ea136940d064d7e5e5929b4f24d2691cb2165aecbb3fd71e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7fc32eadde36b1bbfa2f0fb9562f2c1abe025a91f4e90d0d3a8e5c2a12a4e53f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b981d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8b981d37047ff22da68b2d9a33c0c6a599e028482b80a9e9b0b45c14d77a2cb2\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://53e0dfb1a04051b30484f98897d86ff5adb9ef680437d01e43cd37b4a2404bce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabout
s-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://53e0dfb1a04051b30484f98897d86ff5adb9ef680437d01e43cd37b4a2404bce\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a42063e8c09492c5670e90e5ced6405536f594cb5ae490cdacb00320e735d157\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a42063e8c09492c5670e90e5ced6405536f594cb5ae490cdacb00320e735d157\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:32Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q7r5p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-kjbfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:27Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.674272 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-bccxn" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"061a607e-1868-4fcf-b3ea-d51157511d41\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://dc586816975c68b2e0607a33f40a1ef6b74f4a1267fb305584da3158ea91bdc7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5980e4a9cdd846b335d853a6c7c4fa7c9862c37c4858418919eae03855d094f5\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-20T18:31:13Z\\\",\\\"message\\\":\\\"2026-01-20T18:30:28+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_7123b345-28cb-43bd-ab31-fbc8539d0437\\\\n2026-01-20T18:30:28+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_7123b345-28cb-43bd-ab31-fbc8539d0437 to /host/opt/cni/bin/\\\\n2026-01-20T18:30:28Z [verbose] multus-daemon started\\\\n2026-01-20T18:30:28Z [verbose] Readiness Indicator file check\\\\n2026-01-20T18:31:13Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the 
condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:31:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-mwtw7\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"p
hase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-multus\"/\"multus-bccxn\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:27Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.678859 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.678887 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.678896 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.678909 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.678919 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:27Z","lastTransitionTime":"2026-01-20T18:31:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.688887 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de86ad0ce5469489a5f3eac4940913be3a897c3ec5f48526c7396b21bca92fda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:27Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.701539 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://965d3f61a7d29b4a851d13ea906f7d4d1809f89d314cd0e9b08ba897108625f9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-id
entity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://48a81307df70a49c4cf901ede80a07616e64babebf760b7fb3f51fc71743812a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:27Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.711754 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1ddd934f-f012-4083-b5e6-b99711071621\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63daf0eeb8fcb1c4b47e205c8ea4ca5071153c573cce6d1f4638e0d44a96d47e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3e8755cb7894ae2c1832f3b0b8d8a6e33d3d336d
00ce68a2afe004d82304dd24\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-64lq8\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-sq4x7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:27Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.732858 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f354424d-7f22-42d6-8bd9-00e32e78c3d3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a000cfa332372a402d4d81eafc6f06b988196a3a847236a94d8166f0d745b07e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d36ecbc289d1bc2ebcba42b6cfe938692dd9b7e2b4e1adac18ac86a273d6bf5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://aeac2ae7ead2dd1144c00709ea7162dfe1e360dc304726b031970bc989f11467\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8c6fa189877752f110d0f9539d08a9693872f9ee0350c63384acae1dea6afbc1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7767620b251e4c63c9397573c7915bd85831b833817ac6b939fc47d6304dc2bc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://dc96ff67e107830b7e2fe8c3c7460ca9c7b22dfadab07ea69968162e9a1b4455\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:29Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2769f74bbb44f1c101c7e8101d7d9653de865bbf128573473d1780c9571bee67\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2769f74bbb44f1c101c7e8101d7d9653de865bbf128573473d1780c9571bee67\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-20T18:31:18Z\\\",\\\"message\\\":\\\"ndler 3 for removal\\\\nI0120 18:31:18.345159 6816 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0120 18:31:18.345185 6816 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0120 18:31:18.345292 6816 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0120 18:31:18.345383 6816 
handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0120 18:31:18.345395 6816 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0120 18:31:18.345451 6816 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0120 18:31:18.345468 6816 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0120 18:31:18.345473 6816 factory.go:656] Stopping watch factory\\\\nI0120 18:31:18.345493 6816 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0120 18:31:18.345526 6816 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0120 18:31:18.345537 6816 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0120 18:31:18.345544 6816 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0120 18:31:18.345550 6816 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0120 18:31:18.345560 6816 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0120 18:31:18.345576 6816 handler.go:208] Removed *v1.Namespace ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:31:17Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-qt89w_openshift-ovn-kubernetes(f354424d-7f22-42d6-8bd9-00e32e78c3d3)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://68d98d68d006df3d69e253299966bef755778e0cdab71a9ff91d2829ae761672\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c68391eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a67308b5676ba4c683
91eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-9flh4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:27Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-qt89w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:27Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.746408 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-4jpbd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3791c4b7-dcef-470d-a67e-a2c0bb004436\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"message\\\":\\\"containers 
with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:40Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66lgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-66lgr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:40Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-4jpbd\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:27Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.782081 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.782116 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.782125 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.782138 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.782146 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:27Z","lastTransitionTime":"2026-01-20T18:31:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.885150 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.885420 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.885431 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.885447 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.885456 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:27Z","lastTransitionTime":"2026-01-20T18:31:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.989706 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.989764 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.989784 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.989806 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:27 crc kubenswrapper[4773]: I0120 18:31:27.989823 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:27Z","lastTransitionTime":"2026-01-20T18:31:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:28 crc kubenswrapper[4773]: I0120 18:31:28.093331 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:28 crc kubenswrapper[4773]: I0120 18:31:28.093376 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:28 crc kubenswrapper[4773]: I0120 18:31:28.093385 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:28 crc kubenswrapper[4773]: I0120 18:31:28.093400 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:28 crc kubenswrapper[4773]: I0120 18:31:28.093409 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:28Z","lastTransitionTime":"2026-01-20T18:31:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:28 crc kubenswrapper[4773]: I0120 18:31:28.195702 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:28 crc kubenswrapper[4773]: I0120 18:31:28.195764 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:28 crc kubenswrapper[4773]: I0120 18:31:28.195782 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:28 crc kubenswrapper[4773]: I0120 18:31:28.195834 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:28 crc kubenswrapper[4773]: I0120 18:31:28.195853 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:28Z","lastTransitionTime":"2026-01-20T18:31:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:28 crc kubenswrapper[4773]: I0120 18:31:28.298157 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:28 crc kubenswrapper[4773]: I0120 18:31:28.298252 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:28 crc kubenswrapper[4773]: I0120 18:31:28.298270 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:28 crc kubenswrapper[4773]: I0120 18:31:28.298296 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:28 crc kubenswrapper[4773]: I0120 18:31:28.298312 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:28Z","lastTransitionTime":"2026-01-20T18:31:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:28 crc kubenswrapper[4773]: I0120 18:31:28.401719 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:28 crc kubenswrapper[4773]: I0120 18:31:28.401787 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:28 crc kubenswrapper[4773]: I0120 18:31:28.401804 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:28 crc kubenswrapper[4773]: I0120 18:31:28.401830 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:28 crc kubenswrapper[4773]: I0120 18:31:28.401849 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:28Z","lastTransitionTime":"2026-01-20T18:31:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:28 crc kubenswrapper[4773]: I0120 18:31:28.402589 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-14 16:51:16.050657711 +0000 UTC Jan 20 18:31:28 crc kubenswrapper[4773]: I0120 18:31:28.446075 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:31:28 crc kubenswrapper[4773]: I0120 18:31:28.446120 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:31:28 crc kubenswrapper[4773]: I0120 18:31:28.446205 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:31:28 crc kubenswrapper[4773]: E0120 18:31:28.446270 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jpbd" podUID="3791c4b7-dcef-470d-a67e-a2c0bb004436" Jan 20 18:31:28 crc kubenswrapper[4773]: E0120 18:31:28.446413 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 18:31:28 crc kubenswrapper[4773]: E0120 18:31:28.446576 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 18:31:28 crc kubenswrapper[4773]: I0120 18:31:28.505077 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:28 crc kubenswrapper[4773]: I0120 18:31:28.505143 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:28 crc kubenswrapper[4773]: I0120 18:31:28.505160 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:28 crc kubenswrapper[4773]: I0120 18:31:28.505181 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:28 crc kubenswrapper[4773]: I0120 18:31:28.505199 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:28Z","lastTransitionTime":"2026-01-20T18:31:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:28 crc kubenswrapper[4773]: I0120 18:31:28.609201 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:28 crc kubenswrapper[4773]: I0120 18:31:28.609272 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:28 crc kubenswrapper[4773]: I0120 18:31:28.609290 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:28 crc kubenswrapper[4773]: I0120 18:31:28.609313 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:28 crc kubenswrapper[4773]: I0120 18:31:28.609331 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:28Z","lastTransitionTime":"2026-01-20T18:31:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:28 crc kubenswrapper[4773]: I0120 18:31:28.712317 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:28 crc kubenswrapper[4773]: I0120 18:31:28.712362 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:28 crc kubenswrapper[4773]: I0120 18:31:28.712372 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:28 crc kubenswrapper[4773]: I0120 18:31:28.712387 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:28 crc kubenswrapper[4773]: I0120 18:31:28.712398 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:28Z","lastTransitionTime":"2026-01-20T18:31:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:28 crc kubenswrapper[4773]: I0120 18:31:28.816128 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:28 crc kubenswrapper[4773]: I0120 18:31:28.816178 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:28 crc kubenswrapper[4773]: I0120 18:31:28.816189 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:28 crc kubenswrapper[4773]: I0120 18:31:28.816206 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:28 crc kubenswrapper[4773]: I0120 18:31:28.816218 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:28Z","lastTransitionTime":"2026-01-20T18:31:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:28 crc kubenswrapper[4773]: I0120 18:31:28.918873 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:28 crc kubenswrapper[4773]: I0120 18:31:28.918971 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:28 crc kubenswrapper[4773]: I0120 18:31:28.918999 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:28 crc kubenswrapper[4773]: I0120 18:31:28.919029 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:28 crc kubenswrapper[4773]: I0120 18:31:28.919049 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:28Z","lastTransitionTime":"2026-01-20T18:31:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:29 crc kubenswrapper[4773]: I0120 18:31:29.022566 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:29 crc kubenswrapper[4773]: I0120 18:31:29.022643 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:29 crc kubenswrapper[4773]: I0120 18:31:29.022663 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:29 crc kubenswrapper[4773]: I0120 18:31:29.022688 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:29 crc kubenswrapper[4773]: I0120 18:31:29.022707 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:29Z","lastTransitionTime":"2026-01-20T18:31:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:29 crc kubenswrapper[4773]: I0120 18:31:29.124918 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:29 crc kubenswrapper[4773]: I0120 18:31:29.125019 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:29 crc kubenswrapper[4773]: I0120 18:31:29.125038 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:29 crc kubenswrapper[4773]: I0120 18:31:29.125064 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:29 crc kubenswrapper[4773]: I0120 18:31:29.125082 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:29Z","lastTransitionTime":"2026-01-20T18:31:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:29 crc kubenswrapper[4773]: I0120 18:31:29.227412 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:29 crc kubenswrapper[4773]: I0120 18:31:29.227476 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:29 crc kubenswrapper[4773]: I0120 18:31:29.227499 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:29 crc kubenswrapper[4773]: I0120 18:31:29.227530 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:29 crc kubenswrapper[4773]: I0120 18:31:29.227553 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:29Z","lastTransitionTime":"2026-01-20T18:31:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:29 crc kubenswrapper[4773]: I0120 18:31:29.331382 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:29 crc kubenswrapper[4773]: I0120 18:31:29.331476 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:29 crc kubenswrapper[4773]: I0120 18:31:29.331500 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:29 crc kubenswrapper[4773]: I0120 18:31:29.331575 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:29 crc kubenswrapper[4773]: I0120 18:31:29.331600 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:29Z","lastTransitionTime":"2026-01-20T18:31:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:29 crc kubenswrapper[4773]: I0120 18:31:29.403567 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-11 00:51:28.440835814 +0000 UTC Jan 20 18:31:29 crc kubenswrapper[4773]: I0120 18:31:29.435025 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:29 crc kubenswrapper[4773]: I0120 18:31:29.435061 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:29 crc kubenswrapper[4773]: I0120 18:31:29.435069 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:29 crc kubenswrapper[4773]: I0120 18:31:29.435083 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:29 crc kubenswrapper[4773]: I0120 18:31:29.435092 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:29Z","lastTransitionTime":"2026-01-20T18:31:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:29 crc kubenswrapper[4773]: I0120 18:31:29.446697 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:31:29 crc kubenswrapper[4773]: E0120 18:31:29.446860 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 18:31:29 crc kubenswrapper[4773]: I0120 18:31:29.538269 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:29 crc kubenswrapper[4773]: I0120 18:31:29.538313 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:29 crc kubenswrapper[4773]: I0120 18:31:29.538325 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:29 crc kubenswrapper[4773]: I0120 18:31:29.538341 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:29 crc kubenswrapper[4773]: I0120 18:31:29.538353 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:29Z","lastTransitionTime":"2026-01-20T18:31:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:29 crc kubenswrapper[4773]: I0120 18:31:29.642011 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:29 crc kubenswrapper[4773]: I0120 18:31:29.642072 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:29 crc kubenswrapper[4773]: I0120 18:31:29.642092 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:29 crc kubenswrapper[4773]: I0120 18:31:29.642131 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:29 crc kubenswrapper[4773]: I0120 18:31:29.642149 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:29Z","lastTransitionTime":"2026-01-20T18:31:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:29 crc kubenswrapper[4773]: I0120 18:31:29.744857 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:29 crc kubenswrapper[4773]: I0120 18:31:29.744955 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:29 crc kubenswrapper[4773]: I0120 18:31:29.744974 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:29 crc kubenswrapper[4773]: I0120 18:31:29.744999 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:29 crc kubenswrapper[4773]: I0120 18:31:29.745017 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:29Z","lastTransitionTime":"2026-01-20T18:31:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:29 crc kubenswrapper[4773]: I0120 18:31:29.848274 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:29 crc kubenswrapper[4773]: I0120 18:31:29.848336 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:29 crc kubenswrapper[4773]: I0120 18:31:29.848353 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:29 crc kubenswrapper[4773]: I0120 18:31:29.848379 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:29 crc kubenswrapper[4773]: I0120 18:31:29.848402 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:29Z","lastTransitionTime":"2026-01-20T18:31:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:29 crc kubenswrapper[4773]: I0120 18:31:29.951967 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:29 crc kubenswrapper[4773]: I0120 18:31:29.952019 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:29 crc kubenswrapper[4773]: I0120 18:31:29.952035 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:29 crc kubenswrapper[4773]: I0120 18:31:29.952052 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:29 crc kubenswrapper[4773]: I0120 18:31:29.952064 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:29Z","lastTransitionTime":"2026-01-20T18:31:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.056266 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.056314 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.056340 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.056364 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.056378 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:30Z","lastTransitionTime":"2026-01-20T18:31:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.159770 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.159850 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.159874 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.159900 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.159917 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:30Z","lastTransitionTime":"2026-01-20T18:31:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.204610 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 18:31:30 crc kubenswrapper[4773]: E0120 18:31:30.204874 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-01-20 18:32:34.204834553 +0000 UTC m=+147.126647707 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.204999 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:31:30 crc kubenswrapper[4773]: E0120 18:31:30.205172 4773 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 20 18:31:30 crc kubenswrapper[4773]: E0120 18:31:30.205252 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-20 18:32:34.205231112 +0000 UTC m=+147.127044306 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.263022 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.263091 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.263107 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.263126 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.263148 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:30Z","lastTransitionTime":"2026-01-20T18:31:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.305920 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.306002 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.306042 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 20 18:31:30 crc kubenswrapper[4773]: E0120 18:31:30.306130 4773 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Jan 20 18:31:30 crc kubenswrapper[4773]: E0120 18:31:30.306142 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Jan 20 18:31:30 crc kubenswrapper[4773]: E0120 18:31:30.306177 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Jan 20 18:31:30 crc kubenswrapper[4773]: E0120 18:31:30.306191 4773 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Jan 20 18:31:30 crc kubenswrapper[4773]: E0120 18:31:30.306193 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-20 18:32:34.306178188 +0000 UTC m=+147.227991222 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Jan 20 18:31:30 crc kubenswrapper[4773]: E0120 18:31:30.306246 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-20 18:32:34.306230899 +0000 UTC m=+147.228043913 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Jan 20 18:31:30 crc kubenswrapper[4773]: E0120 18:31:30.306381 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Jan 20 18:31:30 crc kubenswrapper[4773]: E0120 18:31:30.306457 4773 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Jan 20 18:31:30 crc kubenswrapper[4773]: E0120 18:31:30.306485 4773 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Jan 20 18:31:30 crc kubenswrapper[4773]: E0120 18:31:30.306596 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-20 18:32:34.306565618 +0000 UTC m=+147.228378822 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.365799 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.365857 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.365874 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.365896 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.365914 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:30Z","lastTransitionTime":"2026-01-20T18:31:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.404470 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-13 15:04:03.115125548 +0000 UTC
Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.446377 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jpbd"
Jan 20 18:31:30 crc kubenswrapper[4773]: E0120 18:31:30.446491 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jpbd" podUID="3791c4b7-dcef-470d-a67e-a2c0bb004436"
Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.446524 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.446683 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 20 18:31:30 crc kubenswrapper[4773]: E0120 18:31:30.446683 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 20 18:31:30 crc kubenswrapper[4773]: E0120 18:31:30.446733 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.457719 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"]
Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.468555 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.468625 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.468643 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.468668 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.468689 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:30Z","lastTransitionTime":"2026-01-20T18:31:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.571344 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.571373 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.571383 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.571399 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.571410 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:30Z","lastTransitionTime":"2026-01-20T18:31:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.674121 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.674159 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.674169 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.674183 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.674194 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:30Z","lastTransitionTime":"2026-01-20T18:31:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.777349 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.777434 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.777459 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.777486 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.777508 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:30Z","lastTransitionTime":"2026-01-20T18:31:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.880419 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.880480 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.880504 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.880526 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.880538 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:30Z","lastTransitionTime":"2026-01-20T18:31:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.982925 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.983031 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.983053 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.983077 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 20 18:31:30 crc kubenswrapper[4773]: I0120 18:31:30.983093 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:30Z","lastTransitionTime":"2026-01-20T18:31:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 20 18:31:31 crc kubenswrapper[4773]: I0120 18:31:31.085745 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 20 18:31:31 crc kubenswrapper[4773]: I0120 18:31:31.085798 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 20 18:31:31 crc kubenswrapper[4773]: I0120 18:31:31.085815 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 20 18:31:31 crc kubenswrapper[4773]: I0120 18:31:31.085837 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 20 18:31:31 crc kubenswrapper[4773]: I0120 18:31:31.085853 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:31Z","lastTransitionTime":"2026-01-20T18:31:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 20 18:31:31 crc kubenswrapper[4773]: I0120 18:31:31.189062 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 20 18:31:31 crc kubenswrapper[4773]: I0120 18:31:31.189097 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 20 18:31:31 crc kubenswrapper[4773]: I0120 18:31:31.189108 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 20 18:31:31 crc kubenswrapper[4773]: I0120 18:31:31.189123 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 20 18:31:31 crc kubenswrapper[4773]: I0120 18:31:31.189136 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:31Z","lastTransitionTime":"2026-01-20T18:31:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 20 18:31:31 crc kubenswrapper[4773]: I0120 18:31:31.291721 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 20 18:31:31 crc kubenswrapper[4773]: I0120 18:31:31.292248 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 20 18:31:31 crc kubenswrapper[4773]: I0120 18:31:31.292593 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 20 18:31:31 crc kubenswrapper[4773]: I0120 18:31:31.293010 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 20 18:31:31 crc kubenswrapper[4773]: I0120 18:31:31.293177 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:31Z","lastTransitionTime":"2026-01-20T18:31:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 20 18:31:31 crc kubenswrapper[4773]: I0120 18:31:31.396436 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 20 18:31:31 crc kubenswrapper[4773]: I0120 18:31:31.396478 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 20 18:31:31 crc kubenswrapper[4773]: I0120 18:31:31.396488 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 20 18:31:31 crc kubenswrapper[4773]: I0120 18:31:31.396502 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 20 18:31:31 crc kubenswrapper[4773]: I0120 18:31:31.396512 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:31Z","lastTransitionTime":"2026-01-20T18:31:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 20 18:31:31 crc kubenswrapper[4773]: I0120 18:31:31.405362 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 23:15:48.729436583 +0000 UTC
Jan 20 18:31:31 crc kubenswrapper[4773]: I0120 18:31:31.447294 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Jan 20 18:31:31 crc kubenswrapper[4773]: E0120 18:31:31.447527 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Jan 20 18:31:31 crc kubenswrapper[4773]: I0120 18:31:31.500117 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 20 18:31:31 crc kubenswrapper[4773]: I0120 18:31:31.500545 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 20 18:31:31 crc kubenswrapper[4773]: I0120 18:31:31.500712 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 20 18:31:31 crc kubenswrapper[4773]: I0120 18:31:31.500840 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 20 18:31:31 crc kubenswrapper[4773]: I0120 18:31:31.501035 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:31Z","lastTransitionTime":"2026-01-20T18:31:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 20 18:31:31 crc kubenswrapper[4773]: I0120 18:31:31.603631 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 20 18:31:31 crc kubenswrapper[4773]: I0120 18:31:31.603670 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 20 18:31:31 crc kubenswrapper[4773]: I0120 18:31:31.603683 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 20 18:31:31 crc kubenswrapper[4773]: I0120 18:31:31.603697 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 20 18:31:31 crc kubenswrapper[4773]: I0120 18:31:31.603707 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:31Z","lastTransitionTime":"2026-01-20T18:31:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 20 18:31:31 crc kubenswrapper[4773]: I0120 18:31:31.706639 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 20 18:31:31 crc kubenswrapper[4773]: I0120 18:31:31.706708 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 20 18:31:31 crc kubenswrapper[4773]: I0120 18:31:31.706731 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 20 18:31:31 crc kubenswrapper[4773]: I0120 18:31:31.706799 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 20 18:31:31 crc kubenswrapper[4773]: I0120 18:31:31.706822 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:31Z","lastTransitionTime":"2026-01-20T18:31:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 20 18:31:31 crc kubenswrapper[4773]: I0120 18:31:31.808564 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 20 18:31:31 crc kubenswrapper[4773]: I0120 18:31:31.808796 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 20 18:31:31 crc kubenswrapper[4773]: I0120 18:31:31.809028 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 20 18:31:31 crc kubenswrapper[4773]: I0120 18:31:31.809257 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 20 18:31:31 crc kubenswrapper[4773]: I0120 18:31:31.809459 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:31Z","lastTransitionTime":"2026-01-20T18:31:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 20 18:31:31 crc kubenswrapper[4773]: I0120 18:31:31.912965 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 20 18:31:31 crc kubenswrapper[4773]: I0120 18:31:31.913003 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 20 18:31:31 crc kubenswrapper[4773]: I0120 18:31:31.913012 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 20 18:31:31 crc kubenswrapper[4773]: I0120 18:31:31.913024 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 20 18:31:31 crc kubenswrapper[4773]: I0120 18:31:31.913034 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:31Z","lastTransitionTime":"2026-01-20T18:31:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 20 18:31:32 crc kubenswrapper[4773]: I0120 18:31:32.015966 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 20 18:31:32 crc kubenswrapper[4773]: I0120 18:31:32.016044 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 20 18:31:32 crc kubenswrapper[4773]: I0120 18:31:32.016062 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 20 18:31:32 crc kubenswrapper[4773]: I0120 18:31:32.016084 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 20 18:31:32 crc kubenswrapper[4773]: I0120 18:31:32.016099 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:32Z","lastTransitionTime":"2026-01-20T18:31:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 20 18:31:32 crc kubenswrapper[4773]: I0120 18:31:32.118013 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 20 18:31:32 crc kubenswrapper[4773]: I0120 18:31:32.118097 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 20 18:31:32 crc kubenswrapper[4773]: I0120 18:31:32.118117 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 20 18:31:32 crc kubenswrapper[4773]: I0120 18:31:32.118138 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 20 18:31:32 crc kubenswrapper[4773]: I0120 18:31:32.118155 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:32Z","lastTransitionTime":"2026-01-20T18:31:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 20 18:31:32 crc kubenswrapper[4773]: I0120 18:31:32.221884 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 20 18:31:32 crc kubenswrapper[4773]: I0120 18:31:32.221983 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 20 18:31:32 crc kubenswrapper[4773]: I0120 18:31:32.222009 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 20 18:31:32 crc kubenswrapper[4773]: I0120 18:31:32.222038 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 20 18:31:32 crc kubenswrapper[4773]: I0120 18:31:32.222060 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:32Z","lastTransitionTime":"2026-01-20T18:31:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 20 18:31:32 crc kubenswrapper[4773]: I0120 18:31:32.325895 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 20 18:31:32 crc kubenswrapper[4773]: I0120 18:31:32.325989 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 20 18:31:32 crc kubenswrapper[4773]: I0120 18:31:32.326007 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 20 18:31:32 crc kubenswrapper[4773]: I0120 18:31:32.326072 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 20 18:31:32 crc kubenswrapper[4773]: I0120 18:31:32.326102 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:32Z","lastTransitionTime":"2026-01-20T18:31:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 20 18:31:32 crc kubenswrapper[4773]: I0120 18:31:32.407064 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-18 22:25:40.487737774 +0000 UTC
Jan 20 18:31:32 crc kubenswrapper[4773]: I0120 18:31:32.428846 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 20 18:31:32 crc kubenswrapper[4773]: I0120 18:31:32.428891 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 20 18:31:32 crc kubenswrapper[4773]: I0120 18:31:32.428907 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 20 18:31:32 crc kubenswrapper[4773]: I0120 18:31:32.428957 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 20 18:31:32 crc kubenswrapper[4773]: I0120 18:31:32.428975 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:32Z","lastTransitionTime":"2026-01-20T18:31:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Jan 20 18:31:32 crc kubenswrapper[4773]: I0120 18:31:32.446851 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jpbd"
Jan 20 18:31:32 crc kubenswrapper[4773]: I0120 18:31:32.446991 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Jan 20 18:31:32 crc kubenswrapper[4773]: E0120 18:31:32.447065 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jpbd" podUID="3791c4b7-dcef-470d-a67e-a2c0bb004436"
Jan 20 18:31:32 crc kubenswrapper[4773]: E0120 18:31:32.447159 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Jan 20 18:31:32 crc kubenswrapper[4773]: I0120 18:31:32.446881 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Jan 20 18:31:32 crc kubenswrapper[4773]: E0120 18:31:32.447315 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Jan 20 18:31:32 crc kubenswrapper[4773]: I0120 18:31:32.530960 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Jan 20 18:31:32 crc kubenswrapper[4773]: I0120 18:31:32.530990 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Jan 20 18:31:32 crc kubenswrapper[4773]: I0120 18:31:32.530998 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Jan 20 18:31:32 crc kubenswrapper[4773]: I0120 18:31:32.531011 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Jan 20 18:31:32 crc kubenswrapper[4773]: I0120 18:31:32.531019 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:32Z","lastTransitionTime":"2026-01-20T18:31:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:32 crc kubenswrapper[4773]: I0120 18:31:32.633564 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:32 crc kubenswrapper[4773]: I0120 18:31:32.633592 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:32 crc kubenswrapper[4773]: I0120 18:31:32.633602 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:32 crc kubenswrapper[4773]: I0120 18:31:32.633615 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:32 crc kubenswrapper[4773]: I0120 18:31:32.633623 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:32Z","lastTransitionTime":"2026-01-20T18:31:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:32 crc kubenswrapper[4773]: I0120 18:31:32.736110 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:32 crc kubenswrapper[4773]: I0120 18:31:32.736145 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:32 crc kubenswrapper[4773]: I0120 18:31:32.736154 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:32 crc kubenswrapper[4773]: I0120 18:31:32.736166 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:32 crc kubenswrapper[4773]: I0120 18:31:32.736175 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:32Z","lastTransitionTime":"2026-01-20T18:31:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:32 crc kubenswrapper[4773]: I0120 18:31:32.838900 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:32 crc kubenswrapper[4773]: I0120 18:31:32.838962 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:32 crc kubenswrapper[4773]: I0120 18:31:32.838974 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:32 crc kubenswrapper[4773]: I0120 18:31:32.838990 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:32 crc kubenswrapper[4773]: I0120 18:31:32.839004 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:32Z","lastTransitionTime":"2026-01-20T18:31:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:32 crc kubenswrapper[4773]: I0120 18:31:32.941271 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:32 crc kubenswrapper[4773]: I0120 18:31:32.941339 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:32 crc kubenswrapper[4773]: I0120 18:31:32.941349 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:32 crc kubenswrapper[4773]: I0120 18:31:32.941362 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:32 crc kubenswrapper[4773]: I0120 18:31:32.941373 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:32Z","lastTransitionTime":"2026-01-20T18:31:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:33 crc kubenswrapper[4773]: I0120 18:31:33.043988 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:33 crc kubenswrapper[4773]: I0120 18:31:33.044029 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:33 crc kubenswrapper[4773]: I0120 18:31:33.044040 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:33 crc kubenswrapper[4773]: I0120 18:31:33.044054 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:33 crc kubenswrapper[4773]: I0120 18:31:33.044063 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:33Z","lastTransitionTime":"2026-01-20T18:31:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:33 crc kubenswrapper[4773]: I0120 18:31:33.146087 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:33 crc kubenswrapper[4773]: I0120 18:31:33.146122 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:33 crc kubenswrapper[4773]: I0120 18:31:33.146131 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:33 crc kubenswrapper[4773]: I0120 18:31:33.146143 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:33 crc kubenswrapper[4773]: I0120 18:31:33.146151 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:33Z","lastTransitionTime":"2026-01-20T18:31:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:33 crc kubenswrapper[4773]: I0120 18:31:33.248293 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:33 crc kubenswrapper[4773]: I0120 18:31:33.248332 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:33 crc kubenswrapper[4773]: I0120 18:31:33.248340 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:33 crc kubenswrapper[4773]: I0120 18:31:33.248353 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:33 crc kubenswrapper[4773]: I0120 18:31:33.248361 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:33Z","lastTransitionTime":"2026-01-20T18:31:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:33 crc kubenswrapper[4773]: I0120 18:31:33.351193 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:33 crc kubenswrapper[4773]: I0120 18:31:33.351247 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:33 crc kubenswrapper[4773]: I0120 18:31:33.351258 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:33 crc kubenswrapper[4773]: I0120 18:31:33.351276 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:33 crc kubenswrapper[4773]: I0120 18:31:33.351314 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:33Z","lastTransitionTime":"2026-01-20T18:31:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:33 crc kubenswrapper[4773]: I0120 18:31:33.408500 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-22 20:13:10.556703074 +0000 UTC Jan 20 18:31:33 crc kubenswrapper[4773]: I0120 18:31:33.447144 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:31:33 crc kubenswrapper[4773]: E0120 18:31:33.447358 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 18:31:33 crc kubenswrapper[4773]: I0120 18:31:33.448373 4773 scope.go:117] "RemoveContainer" containerID="2769f74bbb44f1c101c7e8101d7d9653de865bbf128573473d1780c9571bee67" Jan 20 18:31:33 crc kubenswrapper[4773]: E0120 18:31:33.448644 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-qt89w_openshift-ovn-kubernetes(f354424d-7f22-42d6-8bd9-00e32e78c3d3)\"" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" podUID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" Jan 20 18:31:33 crc kubenswrapper[4773]: I0120 18:31:33.453628 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:33 crc kubenswrapper[4773]: I0120 18:31:33.453667 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:33 crc kubenswrapper[4773]: I0120 18:31:33.453678 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:33 crc kubenswrapper[4773]: I0120 18:31:33.453694 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:33 crc kubenswrapper[4773]: I0120 18:31:33.453706 4773 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:33Z","lastTransitionTime":"2026-01-20T18:31:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:33 crc kubenswrapper[4773]: I0120 18:31:33.556596 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:33 crc kubenswrapper[4773]: I0120 18:31:33.556696 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:33 crc kubenswrapper[4773]: I0120 18:31:33.556729 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:33 crc kubenswrapper[4773]: I0120 18:31:33.556760 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:33 crc kubenswrapper[4773]: I0120 18:31:33.556800 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:33Z","lastTransitionTime":"2026-01-20T18:31:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:33 crc kubenswrapper[4773]: I0120 18:31:33.660062 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:33 crc kubenswrapper[4773]: I0120 18:31:33.660140 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:33 crc kubenswrapper[4773]: I0120 18:31:33.660161 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:33 crc kubenswrapper[4773]: I0120 18:31:33.660189 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:33 crc kubenswrapper[4773]: I0120 18:31:33.660207 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:33Z","lastTransitionTime":"2026-01-20T18:31:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:33 crc kubenswrapper[4773]: I0120 18:31:33.763174 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:33 crc kubenswrapper[4773]: I0120 18:31:33.763250 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:33 crc kubenswrapper[4773]: I0120 18:31:33.763273 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:33 crc kubenswrapper[4773]: I0120 18:31:33.763297 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:33 crc kubenswrapper[4773]: I0120 18:31:33.763316 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:33Z","lastTransitionTime":"2026-01-20T18:31:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:33 crc kubenswrapper[4773]: I0120 18:31:33.866323 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:33 crc kubenswrapper[4773]: I0120 18:31:33.866393 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:33 crc kubenswrapper[4773]: I0120 18:31:33.866416 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:33 crc kubenswrapper[4773]: I0120 18:31:33.866441 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:33 crc kubenswrapper[4773]: I0120 18:31:33.866459 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:33Z","lastTransitionTime":"2026-01-20T18:31:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:33 crc kubenswrapper[4773]: I0120 18:31:33.969445 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:33 crc kubenswrapper[4773]: I0120 18:31:33.969497 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:33 crc kubenswrapper[4773]: I0120 18:31:33.969516 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:33 crc kubenswrapper[4773]: I0120 18:31:33.969540 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:33 crc kubenswrapper[4773]: I0120 18:31:33.969559 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:33Z","lastTransitionTime":"2026-01-20T18:31:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.072236 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.072416 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.072432 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.072451 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.072740 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:34Z","lastTransitionTime":"2026-01-20T18:31:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.175880 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.175917 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.175946 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.175960 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.175969 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:34Z","lastTransitionTime":"2026-01-20T18:31:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.277854 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.277910 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.277919 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.277970 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.277981 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:34Z","lastTransitionTime":"2026-01-20T18:31:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.380630 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.380691 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.380709 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.380735 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.380792 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:34Z","lastTransitionTime":"2026-01-20T18:31:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.409393 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-22 00:34:21.519649853 +0000 UTC Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.446872 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.446889 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.446908 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:31:34 crc kubenswrapper[4773]: E0120 18:31:34.447240 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jpbd" podUID="3791c4b7-dcef-470d-a67e-a2c0bb004436" Jan 20 18:31:34 crc kubenswrapper[4773]: E0120 18:31:34.447398 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 18:31:34 crc kubenswrapper[4773]: E0120 18:31:34.447573 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.483649 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.483734 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.483746 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.483762 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.483809 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:34Z","lastTransitionTime":"2026-01-20T18:31:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.586658 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.586723 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.586745 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.586773 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.586795 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:34Z","lastTransitionTime":"2026-01-20T18:31:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.689993 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.690732 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.690757 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.690782 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.690800 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:34Z","lastTransitionTime":"2026-01-20T18:31:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.793686 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.793774 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.793798 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.793831 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.793854 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:34Z","lastTransitionTime":"2026-01-20T18:31:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.896700 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.896754 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.896771 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.896793 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.896809 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:34Z","lastTransitionTime":"2026-01-20T18:31:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.964011 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.964067 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.964076 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.964089 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.964097 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:34Z","lastTransitionTime":"2026-01-20T18:31:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:34 crc kubenswrapper[4773]: E0120 18:31:34.983589 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ac96020-64a6-43b4-8bf4-975de5898510\\\",\\\"systemUUID\\\":\\\"3435f284-a40d-4f32-a1fa-55cd3339f30e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:34Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.987801 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.987874 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.987899 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.987967 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:34 crc kubenswrapper[4773]: I0120 18:31:34.987993 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:34Z","lastTransitionTime":"2026-01-20T18:31:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:35 crc kubenswrapper[4773]: E0120 18:31:35.011153 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:34Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:34Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:34Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:34Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ac96020-64a6-43b4-8bf4-975de5898510\\\",\\\"systemUUID\\\":\\\"3435f284-a40d-4f32-a1fa-55cd3339f30e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:35Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.016678 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.016739 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.016749 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.016764 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.016774 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:35Z","lastTransitionTime":"2026-01-20T18:31:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:35 crc kubenswrapper[4773]: E0120 18:31:35.036014 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ac96020-64a6-43b4-8bf4-975de5898510\\\",\\\"systemUUID\\\":\\\"3435f284-a40d-4f32-a1fa-55cd3339f30e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:35Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.040008 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.040048 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.040059 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.040076 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.040088 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:35Z","lastTransitionTime":"2026-01-20T18:31:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:35 crc kubenswrapper[4773]: E0120 18:31:35.053837 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ac96020-64a6-43b4-8bf4-975de5898510\\\",\\\"systemUUID\\\":\\\"3435f284-a40d-4f32-a1fa-55cd3339f30e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:35Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.057486 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.057531 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.057544 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.057561 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.057573 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:35Z","lastTransitionTime":"2026-01-20T18:31:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:35 crc kubenswrapper[4773]: E0120 18:31:35.070583 4773 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-20T18:31:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-20T18:31:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"0ac96020-64a6-43b4-8bf4-975de5898510\\\",\\\"systemUUID\\\":\\\"3435f284-a40d-4f32-a1fa-55cd3339f30e\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:35Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:35 crc kubenswrapper[4773]: E0120 18:31:35.070818 4773 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.073064 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.073126 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.073146 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.073170 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.073189 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:35Z","lastTransitionTime":"2026-01-20T18:31:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.176170 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.176275 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.176294 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.176331 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.176353 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:35Z","lastTransitionTime":"2026-01-20T18:31:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.279202 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.279287 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.279307 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.279340 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.279358 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:35Z","lastTransitionTime":"2026-01-20T18:31:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.382268 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.382349 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.382368 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.382400 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.382422 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:35Z","lastTransitionTime":"2026-01-20T18:31:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.409862 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-05 13:09:50.125099781 +0000 UTC Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.446694 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:31:35 crc kubenswrapper[4773]: E0120 18:31:35.447060 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.485620 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.485698 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.485718 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.485747 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.485764 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:35Z","lastTransitionTime":"2026-01-20T18:31:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.589206 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.589300 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.589317 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.589344 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.589362 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:35Z","lastTransitionTime":"2026-01-20T18:31:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.692625 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.692693 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.692705 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.692726 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.692737 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:35Z","lastTransitionTime":"2026-01-20T18:31:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.796090 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.796210 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.796242 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.796281 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.796310 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:35Z","lastTransitionTime":"2026-01-20T18:31:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.900206 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.900269 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.900288 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.900312 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:35 crc kubenswrapper[4773]: I0120 18:31:35.900330 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:35Z","lastTransitionTime":"2026-01-20T18:31:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:36 crc kubenswrapper[4773]: I0120 18:31:36.004204 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:36 crc kubenswrapper[4773]: I0120 18:31:36.004293 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:36 crc kubenswrapper[4773]: I0120 18:31:36.004314 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:36 crc kubenswrapper[4773]: I0120 18:31:36.004344 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:36 crc kubenswrapper[4773]: I0120 18:31:36.004362 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:36Z","lastTransitionTime":"2026-01-20T18:31:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:36 crc kubenswrapper[4773]: I0120 18:31:36.108630 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:36 crc kubenswrapper[4773]: I0120 18:31:36.108741 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:36 crc kubenswrapper[4773]: I0120 18:31:36.108766 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:36 crc kubenswrapper[4773]: I0120 18:31:36.108819 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:36 crc kubenswrapper[4773]: I0120 18:31:36.108850 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:36Z","lastTransitionTime":"2026-01-20T18:31:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:36 crc kubenswrapper[4773]: I0120 18:31:36.211346 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:36 crc kubenswrapper[4773]: I0120 18:31:36.211398 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:36 crc kubenswrapper[4773]: I0120 18:31:36.211407 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:36 crc kubenswrapper[4773]: I0120 18:31:36.211421 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:36 crc kubenswrapper[4773]: I0120 18:31:36.211432 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:36Z","lastTransitionTime":"2026-01-20T18:31:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:36 crc kubenswrapper[4773]: I0120 18:31:36.315142 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:36 crc kubenswrapper[4773]: I0120 18:31:36.315229 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:36 crc kubenswrapper[4773]: I0120 18:31:36.315258 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:36 crc kubenswrapper[4773]: I0120 18:31:36.315290 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:36 crc kubenswrapper[4773]: I0120 18:31:36.315317 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:36Z","lastTransitionTime":"2026-01-20T18:31:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:36 crc kubenswrapper[4773]: I0120 18:31:36.410485 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-17 13:08:26.042562535 +0000 UTC Jan 20 18:31:36 crc kubenswrapper[4773]: I0120 18:31:36.418076 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:36 crc kubenswrapper[4773]: I0120 18:31:36.418147 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:36 crc kubenswrapper[4773]: I0120 18:31:36.418168 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:36 crc kubenswrapper[4773]: I0120 18:31:36.418194 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:36 crc kubenswrapper[4773]: I0120 18:31:36.418212 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:36Z","lastTransitionTime":"2026-01-20T18:31:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:36 crc kubenswrapper[4773]: I0120 18:31:36.446636 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:31:36 crc kubenswrapper[4773]: I0120 18:31:36.446699 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:31:36 crc kubenswrapper[4773]: E0120 18:31:36.446807 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jpbd" podUID="3791c4b7-dcef-470d-a67e-a2c0bb004436" Jan 20 18:31:36 crc kubenswrapper[4773]: I0120 18:31:36.446642 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:31:36 crc kubenswrapper[4773]: E0120 18:31:36.446997 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 18:31:36 crc kubenswrapper[4773]: E0120 18:31:36.447244 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 18:31:36 crc kubenswrapper[4773]: I0120 18:31:36.523979 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:36 crc kubenswrapper[4773]: I0120 18:31:36.524042 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:36 crc kubenswrapper[4773]: I0120 18:31:36.524059 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:36 crc kubenswrapper[4773]: I0120 18:31:36.524085 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:36 crc kubenswrapper[4773]: I0120 18:31:36.524105 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:36Z","lastTransitionTime":"2026-01-20T18:31:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:36 crc kubenswrapper[4773]: I0120 18:31:36.627269 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:36 crc kubenswrapper[4773]: I0120 18:31:36.627311 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:36 crc kubenswrapper[4773]: I0120 18:31:36.627324 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:36 crc kubenswrapper[4773]: I0120 18:31:36.627343 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:36 crc kubenswrapper[4773]: I0120 18:31:36.627360 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:36Z","lastTransitionTime":"2026-01-20T18:31:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:36 crc kubenswrapper[4773]: I0120 18:31:36.730514 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:36 crc kubenswrapper[4773]: I0120 18:31:36.730567 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:36 crc kubenswrapper[4773]: I0120 18:31:36.730578 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:36 crc kubenswrapper[4773]: I0120 18:31:36.730593 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:36 crc kubenswrapper[4773]: I0120 18:31:36.730603 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:36Z","lastTransitionTime":"2026-01-20T18:31:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:36 crc kubenswrapper[4773]: I0120 18:31:36.832757 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:36 crc kubenswrapper[4773]: I0120 18:31:36.832800 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:36 crc kubenswrapper[4773]: I0120 18:31:36.832809 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:36 crc kubenswrapper[4773]: I0120 18:31:36.832823 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:36 crc kubenswrapper[4773]: I0120 18:31:36.832833 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:36Z","lastTransitionTime":"2026-01-20T18:31:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:36 crc kubenswrapper[4773]: I0120 18:31:36.935133 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:36 crc kubenswrapper[4773]: I0120 18:31:36.935183 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:36 crc kubenswrapper[4773]: I0120 18:31:36.935194 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:36 crc kubenswrapper[4773]: I0120 18:31:36.935213 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:36 crc kubenswrapper[4773]: I0120 18:31:36.935225 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:36Z","lastTransitionTime":"2026-01-20T18:31:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.038142 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.038174 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.038186 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.038200 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.038212 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:37Z","lastTransitionTime":"2026-01-20T18:31:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.140890 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.140920 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.140954 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.140976 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.140986 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:37Z","lastTransitionTime":"2026-01-20T18:31:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.243793 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.243860 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.243875 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.243899 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.243914 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:37Z","lastTransitionTime":"2026-01-20T18:31:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.347064 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.347122 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.347134 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.347153 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.347166 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:37Z","lastTransitionTime":"2026-01-20T18:31:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.450202 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-28 00:02:40.284262618 +0000 UTC Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.451092 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:31:37 crc kubenswrapper[4773]: E0120 18:31:37.454094 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.456096 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.456171 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.456196 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.456231 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.456266 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:37Z","lastTransitionTime":"2026-01-20T18:31:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.470870 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-gczfj" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"357ca347-8fa9-4f0b-9f49-a540f14e0198\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:28Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://48c821d7f174268680f0e1197457126c88076eded93bbcddb5a0d60b1fa9fc80\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-h4b9d\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:26Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-gczfj\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:37Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.486007 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-5sv79" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5a565d2f-43a1-41f5-b7a6-85d7d0aea0a7\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4eb26301ab8154925399887204867f95a36003c308ac49adb8a2867ae29ac39\\\",\\\"image\\\":\\\"quay.io/openshift-releas
e-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-dls8p\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:29Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-5sv79\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:37Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.506764 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c9b329af-a6e2-4ba8-b70d-f1ad0cd67671\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://13ef504628d3252d01c26801cf91a4fabdfbd5d8a78e60e9666beafa709816a7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://af183a195b31fc8b191dc75db28863447424fb2763e1f51a5f37ce78c4b046e1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://05f5db3e67f63a4ed226186eec7eebf95687a59c3d742a7d29b2be3f99543a0b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c1a1628ccb969f4cdde3babedb802c79d7bba385260dcefc8140ed674fb423da\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fab6fab567cc35a379572c4ea5fb0f9472f3634fcede213202390e12c5cf6f2f\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-01-20T18:30:26Z\\\"
,\\\"message\\\":\\\"ing back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": net/http: TLS handshake timeout\\\\nI0120 18:30:20.840178 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0120 18:30:20.842359 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-1141381009/tls.crt::/tmp/serving-cert-1141381009/tls.key\\\\\\\"\\\\nI0120 18:30:26.279650 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0120 18:30:26.285211 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0120 18:30:26.285271 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0120 18:30:26.285327 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0120 18:30:26.285335 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0120 18:30:26.291960 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0120 18:30:26.291983 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291990 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0120 18:30:26.291998 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0120 18:30:26.292002 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0120 18:30:26.292006 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0120 18:30:26.292010 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0120 18:30:26.292142 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all 
endpoints registered and discovery information is complete\\\\nF0120 18:30:26.296450 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://06ac78bba5a5a697d2246518bb04b7503b37c2fafd2369e5826dd02b467715d0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://598d4fbb56c343dda90581bcc02c3e727
ee9876662924585010f5fa93ee1d554\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:37Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.525691 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:37Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.558740 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ee8e82f1-ac2b-4e32-ace8-c3e6edfb55b4\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:11Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:29Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://17efc7a160f4cb934a3927c68e65b9f1387ed13881b8c9ca676d30c0d241ace0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b89a8e0fd81a1ccb6675b46b9a7425fec5158a53cc7e4314068fa737d0becda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f63cd12ee6d399e1a6f77a55f05a532a2de3b4cd05a5e10a93f521364473a403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d78e43bc7b09a1599f98e9e72fd0f49db94102e952a391f687d50abed277ceda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:12Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://de08872865ebda9c9a4e1642d9c4d4f6c709908332ffbd091f992fad7722eca7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-20T18:30:11Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d4b2f386c626b23e9827d272b70d5cbc8eda557cd20a9b24ff06f4427ed30db0\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-20T18:30:08Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://039eefae2e150be606599672cfb85a21e1650b5ee4a0563ddcdc1fecb17f61ae\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:09Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:09Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://68aab352d9a7201bab5e21a95181b8818ad5385dbe706ff144dbe86cff986bc4\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-20T18:30:10Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-20T18:30:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-20T18:30:07Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:37Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.559470 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.559529 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.559564 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.559583 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.559595 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:37Z","lastTransitionTime":"2026-01-20T18:31:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.579724 4773 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-20T18:30:26Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-20T18:31:37Z is after 2025-08-24T17:21:41Z" Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.639733 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-kjbfj" podStartSLOduration=71.639709856 podStartE2EDuration="1m11.639709856s" podCreationTimestamp="2026-01-20 18:30:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:31:37.639232095 +0000 UTC m=+90.561045149" watchObservedRunningTime="2026-01-20 18:31:37.639709856 +0000 UTC m=+90.561522910" Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.663561 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.663629 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" 
Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.663653 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.663684 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.663708 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:37Z","lastTransitionTime":"2026-01-20T18:31:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.667451 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-bccxn" podStartSLOduration=71.66742618399999 podStartE2EDuration="1m11.667426184s" podCreationTimestamp="2026-01-20 18:30:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:31:37.666084252 +0000 UTC m=+90.587897376" watchObservedRunningTime="2026-01-20 18:31:37.667426184 +0000 UTC m=+90.589239238" Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.703373 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=38.703319086 podStartE2EDuration="38.703319086s" podCreationTimestamp="2026-01-20 18:30:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:31:37.701757709 +0000 UTC m=+90.623570743" watchObservedRunningTime="2026-01-20 18:31:37.703319086 +0000 UTC m=+90.625132120" Jan 20 
18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.704230 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-gbn6k" podStartSLOduration=70.704222448 podStartE2EDuration="1m10.704222448s" podCreationTimestamp="2026-01-20 18:30:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:31:37.684217033 +0000 UTC m=+90.606030067" watchObservedRunningTime="2026-01-20 18:31:37.704222448 +0000 UTC m=+90.626035482" Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.714167 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=7.714148013 podStartE2EDuration="7.714148013s" podCreationTimestamp="2026-01-20 18:31:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:31:37.713375015 +0000 UTC m=+90.635188049" watchObservedRunningTime="2026-01-20 18:31:37.714148013 +0000 UTC m=+90.635961047" Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.759703 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podStartSLOduration=71.759685865 podStartE2EDuration="1m11.759685865s" podCreationTimestamp="2026-01-20 18:30:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:31:37.731748121 +0000 UTC m=+90.653561155" watchObservedRunningTime="2026-01-20 18:31:37.759685865 +0000 UTC m=+90.681498909" Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.766448 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 
18:31:37.766494 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.766509 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.766531 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.766546 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:37Z","lastTransitionTime":"2026-01-20T18:31:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.820474 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=70.820452317 podStartE2EDuration="1m10.820452317s" podCreationTimestamp="2026-01-20 18:30:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:31:37.819225948 +0000 UTC m=+90.741038972" watchObservedRunningTime="2026-01-20 18:31:37.820452317 +0000 UTC m=+90.742265341" Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.868584 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.868628 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.868638 4773 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.868674 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.868684 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:37Z","lastTransitionTime":"2026-01-20T18:31:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.971482 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.971878 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.972052 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.972187 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:37 crc kubenswrapper[4773]: I0120 18:31:37.972310 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:37Z","lastTransitionTime":"2026-01-20T18:31:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:38 crc kubenswrapper[4773]: I0120 18:31:38.075729 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:38 crc kubenswrapper[4773]: I0120 18:31:38.075790 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:38 crc kubenswrapper[4773]: I0120 18:31:38.075807 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:38 crc kubenswrapper[4773]: I0120 18:31:38.075831 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:38 crc kubenswrapper[4773]: I0120 18:31:38.075850 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:38Z","lastTransitionTime":"2026-01-20T18:31:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:38 crc kubenswrapper[4773]: I0120 18:31:38.178447 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:38 crc kubenswrapper[4773]: I0120 18:31:38.178492 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:38 crc kubenswrapper[4773]: I0120 18:31:38.178504 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:38 crc kubenswrapper[4773]: I0120 18:31:38.178522 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:38 crc kubenswrapper[4773]: I0120 18:31:38.178536 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:38Z","lastTransitionTime":"2026-01-20T18:31:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:38 crc kubenswrapper[4773]: I0120 18:31:38.281323 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:38 crc kubenswrapper[4773]: I0120 18:31:38.281573 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:38 crc kubenswrapper[4773]: I0120 18:31:38.281655 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:38 crc kubenswrapper[4773]: I0120 18:31:38.281736 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:38 crc kubenswrapper[4773]: I0120 18:31:38.281798 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:38Z","lastTransitionTime":"2026-01-20T18:31:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:38 crc kubenswrapper[4773]: I0120 18:31:38.384711 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:38 crc kubenswrapper[4773]: I0120 18:31:38.384752 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:38 crc kubenswrapper[4773]: I0120 18:31:38.384763 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:38 crc kubenswrapper[4773]: I0120 18:31:38.384779 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:38 crc kubenswrapper[4773]: I0120 18:31:38.384791 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:38Z","lastTransitionTime":"2026-01-20T18:31:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:38 crc kubenswrapper[4773]: I0120 18:31:38.446785 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:31:38 crc kubenswrapper[4773]: I0120 18:31:38.446865 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:31:38 crc kubenswrapper[4773]: E0120 18:31:38.446906 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 18:31:38 crc kubenswrapper[4773]: E0120 18:31:38.447119 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 18:31:38 crc kubenswrapper[4773]: I0120 18:31:38.447216 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:31:38 crc kubenswrapper[4773]: E0120 18:31:38.447416 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4jpbd" podUID="3791c4b7-dcef-470d-a67e-a2c0bb004436" Jan 20 18:31:38 crc kubenswrapper[4773]: I0120 18:31:38.450368 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-26 11:55:46.2000213 +0000 UTC Jan 20 18:31:38 crc kubenswrapper[4773]: I0120 18:31:38.488037 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:38 crc kubenswrapper[4773]: I0120 18:31:38.488096 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:38 crc kubenswrapper[4773]: I0120 18:31:38.488110 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:38 crc kubenswrapper[4773]: I0120 18:31:38.488134 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:38 crc kubenswrapper[4773]: I0120 18:31:38.488150 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:38Z","lastTransitionTime":"2026-01-20T18:31:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:38 crc kubenswrapper[4773]: I0120 18:31:38.591252 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:38 crc kubenswrapper[4773]: I0120 18:31:38.591287 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:38 crc kubenswrapper[4773]: I0120 18:31:38.591300 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:38 crc kubenswrapper[4773]: I0120 18:31:38.591315 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:38 crc kubenswrapper[4773]: I0120 18:31:38.591327 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:38Z","lastTransitionTime":"2026-01-20T18:31:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:38 crc kubenswrapper[4773]: I0120 18:31:38.693579 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:38 crc kubenswrapper[4773]: I0120 18:31:38.693624 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:38 crc kubenswrapper[4773]: I0120 18:31:38.693648 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:38 crc kubenswrapper[4773]: I0120 18:31:38.693669 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:38 crc kubenswrapper[4773]: I0120 18:31:38.693682 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:38Z","lastTransitionTime":"2026-01-20T18:31:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:38 crc kubenswrapper[4773]: I0120 18:31:38.796522 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:38 crc kubenswrapper[4773]: I0120 18:31:38.796562 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:38 crc kubenswrapper[4773]: I0120 18:31:38.796574 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:38 crc kubenswrapper[4773]: I0120 18:31:38.796589 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:38 crc kubenswrapper[4773]: I0120 18:31:38.796599 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:38Z","lastTransitionTime":"2026-01-20T18:31:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:38 crc kubenswrapper[4773]: I0120 18:31:38.899958 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:38 crc kubenswrapper[4773]: I0120 18:31:38.900056 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:38 crc kubenswrapper[4773]: I0120 18:31:38.900078 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:38 crc kubenswrapper[4773]: I0120 18:31:38.900103 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:38 crc kubenswrapper[4773]: I0120 18:31:38.900120 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:38Z","lastTransitionTime":"2026-01-20T18:31:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:39 crc kubenswrapper[4773]: I0120 18:31:39.003111 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:39 crc kubenswrapper[4773]: I0120 18:31:39.003190 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:39 crc kubenswrapper[4773]: I0120 18:31:39.003220 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:39 crc kubenswrapper[4773]: I0120 18:31:39.003247 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:39 crc kubenswrapper[4773]: I0120 18:31:39.003267 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:39Z","lastTransitionTime":"2026-01-20T18:31:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:39 crc kubenswrapper[4773]: I0120 18:31:39.107075 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:39 crc kubenswrapper[4773]: I0120 18:31:39.107128 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:39 crc kubenswrapper[4773]: I0120 18:31:39.107142 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:39 crc kubenswrapper[4773]: I0120 18:31:39.107161 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:39 crc kubenswrapper[4773]: I0120 18:31:39.107172 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:39Z","lastTransitionTime":"2026-01-20T18:31:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:39 crc kubenswrapper[4773]: I0120 18:31:39.209875 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:39 crc kubenswrapper[4773]: I0120 18:31:39.209950 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:39 crc kubenswrapper[4773]: I0120 18:31:39.209966 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:39 crc kubenswrapper[4773]: I0120 18:31:39.209982 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:39 crc kubenswrapper[4773]: I0120 18:31:39.209996 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:39Z","lastTransitionTime":"2026-01-20T18:31:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:39 crc kubenswrapper[4773]: I0120 18:31:39.312780 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:39 crc kubenswrapper[4773]: I0120 18:31:39.312868 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:39 crc kubenswrapper[4773]: I0120 18:31:39.312883 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:39 crc kubenswrapper[4773]: I0120 18:31:39.312902 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:39 crc kubenswrapper[4773]: I0120 18:31:39.312916 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:39Z","lastTransitionTime":"2026-01-20T18:31:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:39 crc kubenswrapper[4773]: I0120 18:31:39.416097 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:39 crc kubenswrapper[4773]: I0120 18:31:39.416163 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:39 crc kubenswrapper[4773]: I0120 18:31:39.416181 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:39 crc kubenswrapper[4773]: I0120 18:31:39.416207 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:39 crc kubenswrapper[4773]: I0120 18:31:39.416226 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:39Z","lastTransitionTime":"2026-01-20T18:31:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:39 crc kubenswrapper[4773]: I0120 18:31:39.446767 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:31:39 crc kubenswrapper[4773]: E0120 18:31:39.447966 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 18:31:39 crc kubenswrapper[4773]: I0120 18:31:39.450664 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-19 03:11:18.270772508 +0000 UTC Jan 20 18:31:39 crc kubenswrapper[4773]: I0120 18:31:39.519493 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:39 crc kubenswrapper[4773]: I0120 18:31:39.519558 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:39 crc kubenswrapper[4773]: I0120 18:31:39.519575 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:39 crc kubenswrapper[4773]: I0120 18:31:39.519601 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:39 crc kubenswrapper[4773]: I0120 18:31:39.519620 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:39Z","lastTransitionTime":"2026-01-20T18:31:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:39 crc kubenswrapper[4773]: I0120 18:31:39.622068 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:39 crc kubenswrapper[4773]: I0120 18:31:39.622133 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:39 crc kubenswrapper[4773]: I0120 18:31:39.622153 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:39 crc kubenswrapper[4773]: I0120 18:31:39.622179 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:39 crc kubenswrapper[4773]: I0120 18:31:39.622197 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:39Z","lastTransitionTime":"2026-01-20T18:31:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:39 crc kubenswrapper[4773]: I0120 18:31:39.724830 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:39 crc kubenswrapper[4773]: I0120 18:31:39.724895 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:39 crc kubenswrapper[4773]: I0120 18:31:39.724916 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:39 crc kubenswrapper[4773]: I0120 18:31:39.724977 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:39 crc kubenswrapper[4773]: I0120 18:31:39.725001 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:39Z","lastTransitionTime":"2026-01-20T18:31:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:39 crc kubenswrapper[4773]: I0120 18:31:39.827990 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:39 crc kubenswrapper[4773]: I0120 18:31:39.828056 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:39 crc kubenswrapper[4773]: I0120 18:31:39.828074 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:39 crc kubenswrapper[4773]: I0120 18:31:39.828097 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:39 crc kubenswrapper[4773]: I0120 18:31:39.828114 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:39Z","lastTransitionTime":"2026-01-20T18:31:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:39 crc kubenswrapper[4773]: I0120 18:31:39.930915 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:39 crc kubenswrapper[4773]: I0120 18:31:39.931197 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:39 crc kubenswrapper[4773]: I0120 18:31:39.931222 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:39 crc kubenswrapper[4773]: I0120 18:31:39.931246 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:39 crc kubenswrapper[4773]: I0120 18:31:39.931263 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:39Z","lastTransitionTime":"2026-01-20T18:31:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:40 crc kubenswrapper[4773]: I0120 18:31:40.035229 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:40 crc kubenswrapper[4773]: I0120 18:31:40.035303 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:40 crc kubenswrapper[4773]: I0120 18:31:40.035326 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:40 crc kubenswrapper[4773]: I0120 18:31:40.035357 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:40 crc kubenswrapper[4773]: I0120 18:31:40.035377 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:40Z","lastTransitionTime":"2026-01-20T18:31:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:40 crc kubenswrapper[4773]: I0120 18:31:40.139067 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:40 crc kubenswrapper[4773]: I0120 18:31:40.139109 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:40 crc kubenswrapper[4773]: I0120 18:31:40.139122 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:40 crc kubenswrapper[4773]: I0120 18:31:40.139142 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:40 crc kubenswrapper[4773]: I0120 18:31:40.139156 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:40Z","lastTransitionTime":"2026-01-20T18:31:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:40 crc kubenswrapper[4773]: I0120 18:31:40.242276 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:40 crc kubenswrapper[4773]: I0120 18:31:40.242352 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:40 crc kubenswrapper[4773]: I0120 18:31:40.242378 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:40 crc kubenswrapper[4773]: I0120 18:31:40.242407 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:40 crc kubenswrapper[4773]: I0120 18:31:40.242428 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:40Z","lastTransitionTime":"2026-01-20T18:31:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:40 crc kubenswrapper[4773]: I0120 18:31:40.345391 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:40 crc kubenswrapper[4773]: I0120 18:31:40.345493 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:40 crc kubenswrapper[4773]: I0120 18:31:40.345510 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:40 crc kubenswrapper[4773]: I0120 18:31:40.345534 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:40 crc kubenswrapper[4773]: I0120 18:31:40.345551 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:40Z","lastTransitionTime":"2026-01-20T18:31:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:40 crc kubenswrapper[4773]: I0120 18:31:40.447174 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:31:40 crc kubenswrapper[4773]: I0120 18:31:40.447244 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:31:40 crc kubenswrapper[4773]: I0120 18:31:40.447181 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:31:40 crc kubenswrapper[4773]: E0120 18:31:40.447497 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jpbd" podUID="3791c4b7-dcef-470d-a67e-a2c0bb004436" Jan 20 18:31:40 crc kubenswrapper[4773]: E0120 18:31:40.447653 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 18:31:40 crc kubenswrapper[4773]: E0120 18:31:40.448054 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 18:31:40 crc kubenswrapper[4773]: I0120 18:31:40.449155 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:40 crc kubenswrapper[4773]: I0120 18:31:40.449195 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:40 crc kubenswrapper[4773]: I0120 18:31:40.449209 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:40 crc kubenswrapper[4773]: I0120 18:31:40.449226 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:40 crc kubenswrapper[4773]: I0120 18:31:40.449240 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:40Z","lastTransitionTime":"2026-01-20T18:31:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:40 crc kubenswrapper[4773]: I0120 18:31:40.451591 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-09 03:11:23.241689854 +0000 UTC Jan 20 18:31:40 crc kubenswrapper[4773]: I0120 18:31:40.551810 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:40 crc kubenswrapper[4773]: I0120 18:31:40.551881 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:40 crc kubenswrapper[4773]: I0120 18:31:40.551903 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:40 crc kubenswrapper[4773]: I0120 18:31:40.551926 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:40 crc kubenswrapper[4773]: I0120 18:31:40.551980 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:40Z","lastTransitionTime":"2026-01-20T18:31:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:40 crc kubenswrapper[4773]: I0120 18:31:40.655048 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:40 crc kubenswrapper[4773]: I0120 18:31:40.655098 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:40 crc kubenswrapper[4773]: I0120 18:31:40.655112 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:40 crc kubenswrapper[4773]: I0120 18:31:40.655129 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:40 crc kubenswrapper[4773]: I0120 18:31:40.655141 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:40Z","lastTransitionTime":"2026-01-20T18:31:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:40 crc kubenswrapper[4773]: I0120 18:31:40.757311 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:40 crc kubenswrapper[4773]: I0120 18:31:40.757350 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:40 crc kubenswrapper[4773]: I0120 18:31:40.757361 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:40 crc kubenswrapper[4773]: I0120 18:31:40.757377 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:40 crc kubenswrapper[4773]: I0120 18:31:40.757390 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:40Z","lastTransitionTime":"2026-01-20T18:31:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:40 crc kubenswrapper[4773]: I0120 18:31:40.860403 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:40 crc kubenswrapper[4773]: I0120 18:31:40.860458 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:40 crc kubenswrapper[4773]: I0120 18:31:40.860474 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:40 crc kubenswrapper[4773]: I0120 18:31:40.860493 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:40 crc kubenswrapper[4773]: I0120 18:31:40.860520 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:40Z","lastTransitionTime":"2026-01-20T18:31:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:40 crc kubenswrapper[4773]: I0120 18:31:40.964442 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:40 crc kubenswrapper[4773]: I0120 18:31:40.964515 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:40 crc kubenswrapper[4773]: I0120 18:31:40.964538 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:40 crc kubenswrapper[4773]: I0120 18:31:40.964565 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:40 crc kubenswrapper[4773]: I0120 18:31:40.964583 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:40Z","lastTransitionTime":"2026-01-20T18:31:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:41 crc kubenswrapper[4773]: I0120 18:31:41.067352 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:41 crc kubenswrapper[4773]: I0120 18:31:41.067398 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:41 crc kubenswrapper[4773]: I0120 18:31:41.067409 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:41 crc kubenswrapper[4773]: I0120 18:31:41.067423 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:41 crc kubenswrapper[4773]: I0120 18:31:41.067434 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:41Z","lastTransitionTime":"2026-01-20T18:31:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:41 crc kubenswrapper[4773]: I0120 18:31:41.170475 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:41 crc kubenswrapper[4773]: I0120 18:31:41.170564 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:41 crc kubenswrapper[4773]: I0120 18:31:41.170581 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:41 crc kubenswrapper[4773]: I0120 18:31:41.170634 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:41 crc kubenswrapper[4773]: I0120 18:31:41.170650 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:41Z","lastTransitionTime":"2026-01-20T18:31:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:41 crc kubenswrapper[4773]: I0120 18:31:41.273171 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:41 crc kubenswrapper[4773]: I0120 18:31:41.273259 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:41 crc kubenswrapper[4773]: I0120 18:31:41.273275 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:41 crc kubenswrapper[4773]: I0120 18:31:41.273298 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:41 crc kubenswrapper[4773]: I0120 18:31:41.273314 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:41Z","lastTransitionTime":"2026-01-20T18:31:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:41 crc kubenswrapper[4773]: I0120 18:31:41.376855 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:41 crc kubenswrapper[4773]: I0120 18:31:41.376996 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:41 crc kubenswrapper[4773]: I0120 18:31:41.377024 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:41 crc kubenswrapper[4773]: I0120 18:31:41.377058 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:41 crc kubenswrapper[4773]: I0120 18:31:41.377081 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:41Z","lastTransitionTime":"2026-01-20T18:31:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:41 crc kubenswrapper[4773]: I0120 18:31:41.447043 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:31:41 crc kubenswrapper[4773]: E0120 18:31:41.447250 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 18:31:41 crc kubenswrapper[4773]: I0120 18:31:41.452011 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-14 15:42:55.997824451 +0000 UTC Jan 20 18:31:41 crc kubenswrapper[4773]: I0120 18:31:41.480396 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:41 crc kubenswrapper[4773]: I0120 18:31:41.480461 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:41 crc kubenswrapper[4773]: I0120 18:31:41.480472 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:41 crc kubenswrapper[4773]: I0120 18:31:41.480490 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:41 crc kubenswrapper[4773]: I0120 18:31:41.480500 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:41Z","lastTransitionTime":"2026-01-20T18:31:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:41 crc kubenswrapper[4773]: I0120 18:31:41.583857 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:41 crc kubenswrapper[4773]: I0120 18:31:41.583965 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:41 crc kubenswrapper[4773]: I0120 18:31:41.583994 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:41 crc kubenswrapper[4773]: I0120 18:31:41.584024 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:41 crc kubenswrapper[4773]: I0120 18:31:41.584045 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:41Z","lastTransitionTime":"2026-01-20T18:31:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:41 crc kubenswrapper[4773]: I0120 18:31:41.687534 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:41 crc kubenswrapper[4773]: I0120 18:31:41.687752 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:41 crc kubenswrapper[4773]: I0120 18:31:41.687795 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:41 crc kubenswrapper[4773]: I0120 18:31:41.687826 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:41 crc kubenswrapper[4773]: I0120 18:31:41.687848 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:41Z","lastTransitionTime":"2026-01-20T18:31:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:41 crc kubenswrapper[4773]: I0120 18:31:41.790900 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:41 crc kubenswrapper[4773]: I0120 18:31:41.790973 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:41 crc kubenswrapper[4773]: I0120 18:31:41.790986 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:41 crc kubenswrapper[4773]: I0120 18:31:41.791004 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:41 crc kubenswrapper[4773]: I0120 18:31:41.791017 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:41Z","lastTransitionTime":"2026-01-20T18:31:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:41 crc kubenswrapper[4773]: I0120 18:31:41.894127 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:41 crc kubenswrapper[4773]: I0120 18:31:41.894210 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:41 crc kubenswrapper[4773]: I0120 18:31:41.894233 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:41 crc kubenswrapper[4773]: I0120 18:31:41.894257 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:41 crc kubenswrapper[4773]: I0120 18:31:41.894274 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:41Z","lastTransitionTime":"2026-01-20T18:31:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:41 crc kubenswrapper[4773]: I0120 18:31:41.997522 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:41 crc kubenswrapper[4773]: I0120 18:31:41.997562 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:41 crc kubenswrapper[4773]: I0120 18:31:41.997570 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:41 crc kubenswrapper[4773]: I0120 18:31:41.997584 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:41 crc kubenswrapper[4773]: I0120 18:31:41.997593 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:41Z","lastTransitionTime":"2026-01-20T18:31:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:42 crc kubenswrapper[4773]: I0120 18:31:42.100567 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:42 crc kubenswrapper[4773]: I0120 18:31:42.100627 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:42 crc kubenswrapper[4773]: I0120 18:31:42.100647 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:42 crc kubenswrapper[4773]: I0120 18:31:42.100671 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:42 crc kubenswrapper[4773]: I0120 18:31:42.100688 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:42Z","lastTransitionTime":"2026-01-20T18:31:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:42 crc kubenswrapper[4773]: I0120 18:31:42.203044 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:42 crc kubenswrapper[4773]: I0120 18:31:42.203100 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:42 crc kubenswrapper[4773]: I0120 18:31:42.203114 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:42 crc kubenswrapper[4773]: I0120 18:31:42.203131 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:42 crc kubenswrapper[4773]: I0120 18:31:42.203142 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:42Z","lastTransitionTime":"2026-01-20T18:31:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:42 crc kubenswrapper[4773]: I0120 18:31:42.306310 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:42 crc kubenswrapper[4773]: I0120 18:31:42.306403 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:42 crc kubenswrapper[4773]: I0120 18:31:42.306430 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:42 crc kubenswrapper[4773]: I0120 18:31:42.306465 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:42 crc kubenswrapper[4773]: I0120 18:31:42.306485 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:42Z","lastTransitionTime":"2026-01-20T18:31:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:42 crc kubenswrapper[4773]: I0120 18:31:42.410021 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:42 crc kubenswrapper[4773]: I0120 18:31:42.410106 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:42 crc kubenswrapper[4773]: I0120 18:31:42.410129 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:42 crc kubenswrapper[4773]: I0120 18:31:42.410161 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:42 crc kubenswrapper[4773]: I0120 18:31:42.410184 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:42Z","lastTransitionTime":"2026-01-20T18:31:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:42 crc kubenswrapper[4773]: I0120 18:31:42.446975 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:31:42 crc kubenswrapper[4773]: I0120 18:31:42.447097 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:31:42 crc kubenswrapper[4773]: E0120 18:31:42.447247 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4jpbd" podUID="3791c4b7-dcef-470d-a67e-a2c0bb004436" Jan 20 18:31:42 crc kubenswrapper[4773]: I0120 18:31:42.447335 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:31:42 crc kubenswrapper[4773]: E0120 18:31:42.447531 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 18:31:42 crc kubenswrapper[4773]: E0120 18:31:42.447768 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 18:31:42 crc kubenswrapper[4773]: I0120 18:31:42.452322 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-31 16:04:45.255468762 +0000 UTC Jan 20 18:31:42 crc kubenswrapper[4773]: I0120 18:31:42.513509 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:42 crc kubenswrapper[4773]: I0120 18:31:42.513580 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:42 crc kubenswrapper[4773]: I0120 18:31:42.513598 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:42 crc kubenswrapper[4773]: I0120 18:31:42.513624 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:42 crc kubenswrapper[4773]: I0120 18:31:42.513645 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:42Z","lastTransitionTime":"2026-01-20T18:31:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:42 crc kubenswrapper[4773]: I0120 18:31:42.615907 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:42 crc kubenswrapper[4773]: I0120 18:31:42.615959 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:42 crc kubenswrapper[4773]: I0120 18:31:42.615971 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:42 crc kubenswrapper[4773]: I0120 18:31:42.615984 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:42 crc kubenswrapper[4773]: I0120 18:31:42.615993 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:42Z","lastTransitionTime":"2026-01-20T18:31:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:42 crc kubenswrapper[4773]: I0120 18:31:42.718404 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:42 crc kubenswrapper[4773]: I0120 18:31:42.718483 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:42 crc kubenswrapper[4773]: I0120 18:31:42.718502 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:42 crc kubenswrapper[4773]: I0120 18:31:42.718531 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:42 crc kubenswrapper[4773]: I0120 18:31:42.718549 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:42Z","lastTransitionTime":"2026-01-20T18:31:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:42 crc kubenswrapper[4773]: I0120 18:31:42.822689 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:42 crc kubenswrapper[4773]: I0120 18:31:42.822752 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:42 crc kubenswrapper[4773]: I0120 18:31:42.822770 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:42 crc kubenswrapper[4773]: I0120 18:31:42.822794 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:42 crc kubenswrapper[4773]: I0120 18:31:42.822813 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:42Z","lastTransitionTime":"2026-01-20T18:31:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:42 crc kubenswrapper[4773]: I0120 18:31:42.925507 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:42 crc kubenswrapper[4773]: I0120 18:31:42.925618 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:42 crc kubenswrapper[4773]: I0120 18:31:42.925638 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:42 crc kubenswrapper[4773]: I0120 18:31:42.925665 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:42 crc kubenswrapper[4773]: I0120 18:31:42.925684 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:42Z","lastTransitionTime":"2026-01-20T18:31:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:43 crc kubenswrapper[4773]: I0120 18:31:43.028047 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:43 crc kubenswrapper[4773]: I0120 18:31:43.028101 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:43 crc kubenswrapper[4773]: I0120 18:31:43.028120 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:43 crc kubenswrapper[4773]: I0120 18:31:43.028142 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:43 crc kubenswrapper[4773]: I0120 18:31:43.028158 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:43Z","lastTransitionTime":"2026-01-20T18:31:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:43 crc kubenswrapper[4773]: I0120 18:31:43.131399 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:43 crc kubenswrapper[4773]: I0120 18:31:43.131584 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:43 crc kubenswrapper[4773]: I0120 18:31:43.131611 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:43 crc kubenswrapper[4773]: I0120 18:31:43.131637 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:43 crc kubenswrapper[4773]: I0120 18:31:43.131670 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:43Z","lastTransitionTime":"2026-01-20T18:31:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:43 crc kubenswrapper[4773]: I0120 18:31:43.234896 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:43 crc kubenswrapper[4773]: I0120 18:31:43.234966 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:43 crc kubenswrapper[4773]: I0120 18:31:43.234980 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:43 crc kubenswrapper[4773]: I0120 18:31:43.234997 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:43 crc kubenswrapper[4773]: I0120 18:31:43.235009 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:43Z","lastTransitionTime":"2026-01-20T18:31:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:43 crc kubenswrapper[4773]: I0120 18:31:43.338425 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:43 crc kubenswrapper[4773]: I0120 18:31:43.338471 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:43 crc kubenswrapper[4773]: I0120 18:31:43.338482 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:43 crc kubenswrapper[4773]: I0120 18:31:43.338527 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:43 crc kubenswrapper[4773]: I0120 18:31:43.338540 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:43Z","lastTransitionTime":"2026-01-20T18:31:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:43 crc kubenswrapper[4773]: I0120 18:31:43.442747 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:43 crc kubenswrapper[4773]: I0120 18:31:43.442829 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:43 crc kubenswrapper[4773]: I0120 18:31:43.442850 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:43 crc kubenswrapper[4773]: I0120 18:31:43.442876 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:43 crc kubenswrapper[4773]: I0120 18:31:43.442893 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:43Z","lastTransitionTime":"2026-01-20T18:31:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:43 crc kubenswrapper[4773]: I0120 18:31:43.447192 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:31:43 crc kubenswrapper[4773]: E0120 18:31:43.447410 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 18:31:43 crc kubenswrapper[4773]: I0120 18:31:43.452479 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-05 18:49:05.329682131 +0000 UTC Jan 20 18:31:43 crc kubenswrapper[4773]: I0120 18:31:43.546976 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:43 crc kubenswrapper[4773]: I0120 18:31:43.547047 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:43 crc kubenswrapper[4773]: I0120 18:31:43.547061 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:43 crc kubenswrapper[4773]: I0120 18:31:43.547085 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:43 crc kubenswrapper[4773]: I0120 18:31:43.547100 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:43Z","lastTransitionTime":"2026-01-20T18:31:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:43 crc kubenswrapper[4773]: I0120 18:31:43.650576 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:43 crc kubenswrapper[4773]: I0120 18:31:43.650637 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:43 crc kubenswrapper[4773]: I0120 18:31:43.650655 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:43 crc kubenswrapper[4773]: I0120 18:31:43.650681 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:43 crc kubenswrapper[4773]: I0120 18:31:43.650700 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:43Z","lastTransitionTime":"2026-01-20T18:31:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:43 crc kubenswrapper[4773]: I0120 18:31:43.753674 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:43 crc kubenswrapper[4773]: I0120 18:31:43.753737 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:43 crc kubenswrapper[4773]: I0120 18:31:43.753752 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:43 crc kubenswrapper[4773]: I0120 18:31:43.753771 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:43 crc kubenswrapper[4773]: I0120 18:31:43.753784 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:43Z","lastTransitionTime":"2026-01-20T18:31:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:43 crc kubenswrapper[4773]: I0120 18:31:43.857592 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:43 crc kubenswrapper[4773]: I0120 18:31:43.857650 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:43 crc kubenswrapper[4773]: I0120 18:31:43.857667 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:43 crc kubenswrapper[4773]: I0120 18:31:43.857691 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:43 crc kubenswrapper[4773]: I0120 18:31:43.857712 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:43Z","lastTransitionTime":"2026-01-20T18:31:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:43 crc kubenswrapper[4773]: I0120 18:31:43.959890 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:43 crc kubenswrapper[4773]: I0120 18:31:43.959958 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:43 crc kubenswrapper[4773]: I0120 18:31:43.959970 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:43 crc kubenswrapper[4773]: I0120 18:31:43.959990 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:43 crc kubenswrapper[4773]: I0120 18:31:43.960003 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:43Z","lastTransitionTime":"2026-01-20T18:31:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:44 crc kubenswrapper[4773]: I0120 18:31:44.062614 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:44 crc kubenswrapper[4773]: I0120 18:31:44.062658 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:44 crc kubenswrapper[4773]: I0120 18:31:44.062671 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:44 crc kubenswrapper[4773]: I0120 18:31:44.062688 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:44 crc kubenswrapper[4773]: I0120 18:31:44.062700 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:44Z","lastTransitionTime":"2026-01-20T18:31:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:44 crc kubenswrapper[4773]: I0120 18:31:44.164669 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:44 crc kubenswrapper[4773]: I0120 18:31:44.164712 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:44 crc kubenswrapper[4773]: I0120 18:31:44.164722 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:44 crc kubenswrapper[4773]: I0120 18:31:44.164737 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:44 crc kubenswrapper[4773]: I0120 18:31:44.164748 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:44Z","lastTransitionTime":"2026-01-20T18:31:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:44 crc kubenswrapper[4773]: I0120 18:31:44.268230 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:44 crc kubenswrapper[4773]: I0120 18:31:44.268334 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:44 crc kubenswrapper[4773]: I0120 18:31:44.268357 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:44 crc kubenswrapper[4773]: I0120 18:31:44.268387 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:44 crc kubenswrapper[4773]: I0120 18:31:44.268406 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:44Z","lastTransitionTime":"2026-01-20T18:31:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:44 crc kubenswrapper[4773]: I0120 18:31:44.372368 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:44 crc kubenswrapper[4773]: I0120 18:31:44.372440 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:44 crc kubenswrapper[4773]: I0120 18:31:44.372457 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:44 crc kubenswrapper[4773]: I0120 18:31:44.372486 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:44 crc kubenswrapper[4773]: I0120 18:31:44.372506 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:44Z","lastTransitionTime":"2026-01-20T18:31:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:44 crc kubenswrapper[4773]: I0120 18:31:44.447189 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:31:44 crc kubenswrapper[4773]: E0120 18:31:44.447433 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 18:31:44 crc kubenswrapper[4773]: I0120 18:31:44.447773 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:31:44 crc kubenswrapper[4773]: E0120 18:31:44.447894 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jpbd" podUID="3791c4b7-dcef-470d-a67e-a2c0bb004436" Jan 20 18:31:44 crc kubenswrapper[4773]: I0120 18:31:44.448169 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:31:44 crc kubenswrapper[4773]: E0120 18:31:44.448280 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 18:31:44 crc kubenswrapper[4773]: I0120 18:31:44.453491 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-27 19:21:16.739862731 +0000 UTC Jan 20 18:31:44 crc kubenswrapper[4773]: I0120 18:31:44.475966 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:44 crc kubenswrapper[4773]: I0120 18:31:44.476030 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:44 crc kubenswrapper[4773]: I0120 18:31:44.476047 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:44 crc kubenswrapper[4773]: I0120 18:31:44.476071 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:44 crc kubenswrapper[4773]: I0120 18:31:44.476090 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:44Z","lastTransitionTime":"2026-01-20T18:31:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:44 crc kubenswrapper[4773]: I0120 18:31:44.578876 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:44 crc kubenswrapper[4773]: I0120 18:31:44.579369 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:44 crc kubenswrapper[4773]: I0120 18:31:44.579533 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:44 crc kubenswrapper[4773]: I0120 18:31:44.579679 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:44 crc kubenswrapper[4773]: I0120 18:31:44.579815 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:44Z","lastTransitionTime":"2026-01-20T18:31:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:44 crc kubenswrapper[4773]: I0120 18:31:44.682183 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:44 crc kubenswrapper[4773]: I0120 18:31:44.682224 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:44 crc kubenswrapper[4773]: I0120 18:31:44.682234 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:44 crc kubenswrapper[4773]: I0120 18:31:44.682250 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:44 crc kubenswrapper[4773]: I0120 18:31:44.682262 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:44Z","lastTransitionTime":"2026-01-20T18:31:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:44 crc kubenswrapper[4773]: I0120 18:31:44.784567 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:44 crc kubenswrapper[4773]: I0120 18:31:44.784616 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:44 crc kubenswrapper[4773]: I0120 18:31:44.784630 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:44 crc kubenswrapper[4773]: I0120 18:31:44.784649 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:44 crc kubenswrapper[4773]: I0120 18:31:44.784667 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:44Z","lastTransitionTime":"2026-01-20T18:31:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:44 crc kubenswrapper[4773]: I0120 18:31:44.868598 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3791c4b7-dcef-470d-a67e-a2c0bb004436-metrics-certs\") pod \"network-metrics-daemon-4jpbd\" (UID: \"3791c4b7-dcef-470d-a67e-a2c0bb004436\") " pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:31:44 crc kubenswrapper[4773]: E0120 18:31:44.868915 4773 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 20 18:31:44 crc kubenswrapper[4773]: E0120 18:31:44.869082 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3791c4b7-dcef-470d-a67e-a2c0bb004436-metrics-certs podName:3791c4b7-dcef-470d-a67e-a2c0bb004436 nodeName:}" failed. No retries permitted until 2026-01-20 18:32:48.869050448 +0000 UTC m=+161.790863512 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/3791c4b7-dcef-470d-a67e-a2c0bb004436-metrics-certs") pod "network-metrics-daemon-4jpbd" (UID: "3791c4b7-dcef-470d-a67e-a2c0bb004436") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 20 18:31:44 crc kubenswrapper[4773]: I0120 18:31:44.888038 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:44 crc kubenswrapper[4773]: I0120 18:31:44.888112 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:44 crc kubenswrapper[4773]: I0120 18:31:44.888132 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:44 crc kubenswrapper[4773]: I0120 18:31:44.888158 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:44 crc kubenswrapper[4773]: I0120 18:31:44.888178 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:44Z","lastTransitionTime":"2026-01-20T18:31:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:44 crc kubenswrapper[4773]: I0120 18:31:44.991637 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:44 crc kubenswrapper[4773]: I0120 18:31:44.991703 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:44 crc kubenswrapper[4773]: I0120 18:31:44.991721 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:44 crc kubenswrapper[4773]: I0120 18:31:44.991745 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:44 crc kubenswrapper[4773]: I0120 18:31:44.991765 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:44Z","lastTransitionTime":"2026-01-20T18:31:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:45 crc kubenswrapper[4773]: I0120 18:31:45.094748 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:45 crc kubenswrapper[4773]: I0120 18:31:45.094839 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:45 crc kubenswrapper[4773]: I0120 18:31:45.094866 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:45 crc kubenswrapper[4773]: I0120 18:31:45.094901 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:45 crc kubenswrapper[4773]: I0120 18:31:45.094975 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:45Z","lastTransitionTime":"2026-01-20T18:31:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:45 crc kubenswrapper[4773]: I0120 18:31:45.197985 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:45 crc kubenswrapper[4773]: I0120 18:31:45.198093 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:45 crc kubenswrapper[4773]: I0120 18:31:45.198113 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:45 crc kubenswrapper[4773]: I0120 18:31:45.198140 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:45 crc kubenswrapper[4773]: I0120 18:31:45.198158 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:45Z","lastTransitionTime":"2026-01-20T18:31:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:45 crc kubenswrapper[4773]: I0120 18:31:45.301684 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:45 crc kubenswrapper[4773]: I0120 18:31:45.301769 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:45 crc kubenswrapper[4773]: I0120 18:31:45.301795 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:45 crc kubenswrapper[4773]: I0120 18:31:45.301825 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:45 crc kubenswrapper[4773]: I0120 18:31:45.301848 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:45Z","lastTransitionTime":"2026-01-20T18:31:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 20 18:31:45 crc kubenswrapper[4773]: I0120 18:31:45.340065 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 20 18:31:45 crc kubenswrapper[4773]: I0120 18:31:45.340147 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 20 18:31:45 crc kubenswrapper[4773]: I0120 18:31:45.340171 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 20 18:31:45 crc kubenswrapper[4773]: I0120 18:31:45.340206 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 20 18:31:45 crc kubenswrapper[4773]: I0120 18:31:45.340231 4773 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-20T18:31:45Z","lastTransitionTime":"2026-01-20T18:31:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 20 18:31:45 crc kubenswrapper[4773]: I0120 18:31:45.399141 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-d9pzc"] Jan 20 18:31:45 crc kubenswrapper[4773]: I0120 18:31:45.399492 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-d9pzc" Jan 20 18:31:45 crc kubenswrapper[4773]: I0120 18:31:45.402575 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Jan 20 18:31:45 crc kubenswrapper[4773]: I0120 18:31:45.402985 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Jan 20 18:31:45 crc kubenswrapper[4773]: I0120 18:31:45.403240 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Jan 20 18:31:45 crc kubenswrapper[4773]: I0120 18:31:45.403439 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Jan 20 18:31:45 crc kubenswrapper[4773]: I0120 18:31:45.436161 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=79.436142342 podStartE2EDuration="1m19.436142342s" podCreationTimestamp="2026-01-20 18:30:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:31:45.435959428 +0000 UTC m=+98.357772452" watchObservedRunningTime="2026-01-20 18:31:45.436142342 +0000 UTC m=+98.357955376" Jan 20 18:31:45 crc kubenswrapper[4773]: I0120 18:31:45.446656 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:31:45 crc kubenswrapper[4773]: E0120 18:31:45.446788 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 18:31:45 crc kubenswrapper[4773]: I0120 18:31:45.453658 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-18 17:37:23.611407843 +0000 UTC Jan 20 18:31:45 crc kubenswrapper[4773]: I0120 18:31:45.453711 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Jan 20 18:31:45 crc kubenswrapper[4773]: I0120 18:31:45.459665 4773 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Jan 20 18:31:45 crc kubenswrapper[4773]: I0120 18:31:45.462700 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-gczfj" podStartSLOduration=79.462686612 podStartE2EDuration="1m19.462686612s" podCreationTimestamp="2026-01-20 18:30:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:31:45.462250922 +0000 UTC m=+98.384063946" watchObservedRunningTime="2026-01-20 18:31:45.462686612 +0000 UTC m=+98.384499636" Jan 20 18:31:45 crc kubenswrapper[4773]: I0120 18:31:45.473952 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-5sv79" podStartSLOduration=79.473907789 podStartE2EDuration="1m19.473907789s" podCreationTimestamp="2026-01-20 18:30:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:31:45.473059129 +0000 UTC m=+98.394872143" watchObservedRunningTime="2026-01-20 18:31:45.473907789 +0000 UTC m=+98.395720823" Jan 20 18:31:45 crc kubenswrapper[4773]: I0120 18:31:45.476545 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/25126370-c138-4fa2-af29-896492cb6a1c-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-d9pzc\" (UID: \"25126370-c138-4fa2-af29-896492cb6a1c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-d9pzc" Jan 20 18:31:45 crc kubenswrapper[4773]: I0120 18:31:45.476596 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/25126370-c138-4fa2-af29-896492cb6a1c-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-d9pzc\" (UID: \"25126370-c138-4fa2-af29-896492cb6a1c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-d9pzc" Jan 20 18:31:45 crc kubenswrapper[4773]: I0120 18:31:45.476633 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/25126370-c138-4fa2-af29-896492cb6a1c-service-ca\") pod \"cluster-version-operator-5c965bbfc6-d9pzc\" (UID: \"25126370-c138-4fa2-af29-896492cb6a1c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-d9pzc" Jan 20 18:31:45 crc kubenswrapper[4773]: I0120 18:31:45.476664 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25126370-c138-4fa2-af29-896492cb6a1c-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-d9pzc\" (UID: \"25126370-c138-4fa2-af29-896492cb6a1c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-d9pzc" Jan 20 18:31:45 crc kubenswrapper[4773]: I0120 18:31:45.476682 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/25126370-c138-4fa2-af29-896492cb6a1c-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-d9pzc\" (UID: 
\"25126370-c138-4fa2-af29-896492cb6a1c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-d9pzc" Jan 20 18:31:45 crc kubenswrapper[4773]: I0120 18:31:45.497556 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=76.49753396 podStartE2EDuration="1m16.49753396s" podCreationTimestamp="2026-01-20 18:30:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:31:45.495511792 +0000 UTC m=+98.417324816" watchObservedRunningTime="2026-01-20 18:31:45.49753396 +0000 UTC m=+98.419347004" Jan 20 18:31:45 crc kubenswrapper[4773]: I0120 18:31:45.578110 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/25126370-c138-4fa2-af29-896492cb6a1c-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-d9pzc\" (UID: \"25126370-c138-4fa2-af29-896492cb6a1c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-d9pzc" Jan 20 18:31:45 crc kubenswrapper[4773]: I0120 18:31:45.578212 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/25126370-c138-4fa2-af29-896492cb6a1c-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-d9pzc\" (UID: \"25126370-c138-4fa2-af29-896492cb6a1c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-d9pzc" Jan 20 18:31:45 crc kubenswrapper[4773]: I0120 18:31:45.578288 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/25126370-c138-4fa2-af29-896492cb6a1c-service-ca\") pod \"cluster-version-operator-5c965bbfc6-d9pzc\" (UID: \"25126370-c138-4fa2-af29-896492cb6a1c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-d9pzc" Jan 20 18:31:45 crc kubenswrapper[4773]: 
I0120 18:31:45.578346 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25126370-c138-4fa2-af29-896492cb6a1c-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-d9pzc\" (UID: \"25126370-c138-4fa2-af29-896492cb6a1c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-d9pzc" Jan 20 18:31:45 crc kubenswrapper[4773]: I0120 18:31:45.578383 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/25126370-c138-4fa2-af29-896492cb6a1c-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-d9pzc\" (UID: \"25126370-c138-4fa2-af29-896492cb6a1c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-d9pzc" Jan 20 18:31:45 crc kubenswrapper[4773]: I0120 18:31:45.578447 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/25126370-c138-4fa2-af29-896492cb6a1c-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-d9pzc\" (UID: \"25126370-c138-4fa2-af29-896492cb6a1c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-d9pzc" Jan 20 18:31:45 crc kubenswrapper[4773]: I0120 18:31:45.578494 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/25126370-c138-4fa2-af29-896492cb6a1c-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-d9pzc\" (UID: \"25126370-c138-4fa2-af29-896492cb6a1c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-d9pzc" Jan 20 18:31:45 crc kubenswrapper[4773]: I0120 18:31:45.579926 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/25126370-c138-4fa2-af29-896492cb6a1c-service-ca\") pod \"cluster-version-operator-5c965bbfc6-d9pzc\" (UID: 
\"25126370-c138-4fa2-af29-896492cb6a1c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-d9pzc" Jan 20 18:31:45 crc kubenswrapper[4773]: I0120 18:31:45.590366 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25126370-c138-4fa2-af29-896492cb6a1c-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-d9pzc\" (UID: \"25126370-c138-4fa2-af29-896492cb6a1c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-d9pzc" Jan 20 18:31:45 crc kubenswrapper[4773]: I0120 18:31:45.609503 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/25126370-c138-4fa2-af29-896492cb6a1c-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-d9pzc\" (UID: \"25126370-c138-4fa2-af29-896492cb6a1c\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-d9pzc" Jan 20 18:31:45 crc kubenswrapper[4773]: I0120 18:31:45.722404 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-d9pzc" Jan 20 18:31:45 crc kubenswrapper[4773]: W0120 18:31:45.747306 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25126370_c138_4fa2_af29_896492cb6a1c.slice/crio-6bfda720e6a595b582b541f7e1ba7824f12f3238e68a7facdb00e2b9cf006583 WatchSource:0}: Error finding container 6bfda720e6a595b582b541f7e1ba7824f12f3238e68a7facdb00e2b9cf006583: Status 404 returned error can't find the container with id 6bfda720e6a595b582b541f7e1ba7824f12f3238e68a7facdb00e2b9cf006583 Jan 20 18:31:46 crc kubenswrapper[4773]: I0120 18:31:46.161060 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-d9pzc" event={"ID":"25126370-c138-4fa2-af29-896492cb6a1c","Type":"ContainerStarted","Data":"3c2764b6b0fd4f90ece02472679aeef92be73c90d7df4455940bfa1908f4245d"} Jan 20 18:31:46 crc kubenswrapper[4773]: I0120 18:31:46.161107 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-d9pzc" event={"ID":"25126370-c138-4fa2-af29-896492cb6a1c","Type":"ContainerStarted","Data":"6bfda720e6a595b582b541f7e1ba7824f12f3238e68a7facdb00e2b9cf006583"} Jan 20 18:31:46 crc kubenswrapper[4773]: I0120 18:31:46.178875 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-d9pzc" podStartSLOduration=80.178841016 podStartE2EDuration="1m20.178841016s" podCreationTimestamp="2026-01-20 18:30:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:31:46.177459813 +0000 UTC m=+99.099272837" watchObservedRunningTime="2026-01-20 18:31:46.178841016 +0000 UTC m=+99.100654060" Jan 20 18:31:46 crc kubenswrapper[4773]: I0120 18:31:46.446374 4773 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:31:46 crc kubenswrapper[4773]: I0120 18:31:46.446436 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:31:46 crc kubenswrapper[4773]: I0120 18:31:46.446570 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:31:46 crc kubenswrapper[4773]: E0120 18:31:46.446665 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jpbd" podUID="3791c4b7-dcef-470d-a67e-a2c0bb004436" Jan 20 18:31:46 crc kubenswrapper[4773]: E0120 18:31:46.446823 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 18:31:46 crc kubenswrapper[4773]: E0120 18:31:46.446889 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 18:31:47 crc kubenswrapper[4773]: I0120 18:31:47.446261 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:31:47 crc kubenswrapper[4773]: E0120 18:31:47.448351 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 18:31:47 crc kubenswrapper[4773]: I0120 18:31:47.449909 4773 scope.go:117] "RemoveContainer" containerID="2769f74bbb44f1c101c7e8101d7d9653de865bbf128573473d1780c9571bee67" Jan 20 18:31:47 crc kubenswrapper[4773]: E0120 18:31:47.450418 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-qt89w_openshift-ovn-kubernetes(f354424d-7f22-42d6-8bd9-00e32e78c3d3)\"" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" podUID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" Jan 20 18:31:48 crc kubenswrapper[4773]: I0120 18:31:48.446421 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:31:48 crc kubenswrapper[4773]: E0120 18:31:48.446737 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4jpbd" podUID="3791c4b7-dcef-470d-a67e-a2c0bb004436" Jan 20 18:31:48 crc kubenswrapper[4773]: I0120 18:31:48.446910 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:31:48 crc kubenswrapper[4773]: E0120 18:31:48.447170 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 18:31:48 crc kubenswrapper[4773]: I0120 18:31:48.447267 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:31:48 crc kubenswrapper[4773]: E0120 18:31:48.447515 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 18:31:49 crc kubenswrapper[4773]: I0120 18:31:49.446913 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:31:49 crc kubenswrapper[4773]: E0120 18:31:49.447154 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 18:31:50 crc kubenswrapper[4773]: I0120 18:31:50.446091 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:31:50 crc kubenswrapper[4773]: I0120 18:31:50.446092 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:31:50 crc kubenswrapper[4773]: I0120 18:31:50.446171 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:31:50 crc kubenswrapper[4773]: E0120 18:31:50.446898 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 18:31:50 crc kubenswrapper[4773]: E0120 18:31:50.447107 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jpbd" podUID="3791c4b7-dcef-470d-a67e-a2c0bb004436" Jan 20 18:31:50 crc kubenswrapper[4773]: E0120 18:31:50.447106 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 18:31:51 crc kubenswrapper[4773]: I0120 18:31:51.446206 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:31:51 crc kubenswrapper[4773]: E0120 18:31:51.446475 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 18:31:52 crc kubenswrapper[4773]: I0120 18:31:52.447102 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:31:52 crc kubenswrapper[4773]: I0120 18:31:52.447102 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:31:52 crc kubenswrapper[4773]: E0120 18:31:52.447301 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 18:31:52 crc kubenswrapper[4773]: E0120 18:31:52.447466 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 18:31:52 crc kubenswrapper[4773]: I0120 18:31:52.447117 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:31:52 crc kubenswrapper[4773]: E0120 18:31:52.447778 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jpbd" podUID="3791c4b7-dcef-470d-a67e-a2c0bb004436" Jan 20 18:31:53 crc kubenswrapper[4773]: I0120 18:31:53.446555 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:31:53 crc kubenswrapper[4773]: E0120 18:31:53.446840 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 18:31:54 crc kubenswrapper[4773]: I0120 18:31:54.446041 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:31:54 crc kubenswrapper[4773]: I0120 18:31:54.446186 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:31:54 crc kubenswrapper[4773]: E0120 18:31:54.446327 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 18:31:54 crc kubenswrapper[4773]: E0120 18:31:54.446322 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4jpbd" podUID="3791c4b7-dcef-470d-a67e-a2c0bb004436" Jan 20 18:31:54 crc kubenswrapper[4773]: I0120 18:31:54.446445 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:31:54 crc kubenswrapper[4773]: E0120 18:31:54.446775 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 18:31:55 crc kubenswrapper[4773]: I0120 18:31:55.446899 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:31:55 crc kubenswrapper[4773]: E0120 18:31:55.447062 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 18:31:56 crc kubenswrapper[4773]: I0120 18:31:56.447245 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:31:56 crc kubenswrapper[4773]: I0120 18:31:56.447297 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:31:56 crc kubenswrapper[4773]: I0120 18:31:56.447254 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:31:56 crc kubenswrapper[4773]: E0120 18:31:56.447479 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 18:31:56 crc kubenswrapper[4773]: E0120 18:31:56.447579 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jpbd" podUID="3791c4b7-dcef-470d-a67e-a2c0bb004436" Jan 20 18:31:56 crc kubenswrapper[4773]: E0120 18:31:56.447684 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 18:31:57 crc kubenswrapper[4773]: I0120 18:31:57.446497 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:31:57 crc kubenswrapper[4773]: E0120 18:31:57.448520 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 18:31:58 crc kubenswrapper[4773]: I0120 18:31:58.446191 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:31:58 crc kubenswrapper[4773]: I0120 18:31:58.446192 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:31:58 crc kubenswrapper[4773]: I0120 18:31:58.446267 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:31:58 crc kubenswrapper[4773]: E0120 18:31:58.446955 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jpbd" podUID="3791c4b7-dcef-470d-a67e-a2c0bb004436" Jan 20 18:31:58 crc kubenswrapper[4773]: E0120 18:31:58.447559 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 18:31:58 crc kubenswrapper[4773]: I0120 18:31:58.447808 4773 scope.go:117] "RemoveContainer" containerID="2769f74bbb44f1c101c7e8101d7d9653de865bbf128573473d1780c9571bee67" Jan 20 18:31:58 crc kubenswrapper[4773]: E0120 18:31:58.447971 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-qt89w_openshift-ovn-kubernetes(f354424d-7f22-42d6-8bd9-00e32e78c3d3)\"" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" podUID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" Jan 20 18:31:58 crc kubenswrapper[4773]: E0120 18:31:58.448075 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 18:31:59 crc kubenswrapper[4773]: I0120 18:31:59.446312 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:31:59 crc kubenswrapper[4773]: E0120 18:31:59.446480 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 18:32:00 crc kubenswrapper[4773]: I0120 18:32:00.207306 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bccxn_061a607e-1868-4fcf-b3ea-d51157511d41/kube-multus/1.log" Jan 20 18:32:00 crc kubenswrapper[4773]: I0120 18:32:00.208106 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bccxn_061a607e-1868-4fcf-b3ea-d51157511d41/kube-multus/0.log" Jan 20 18:32:00 crc kubenswrapper[4773]: I0120 18:32:00.208195 4773 generic.go:334] "Generic (PLEG): container finished" podID="061a607e-1868-4fcf-b3ea-d51157511d41" containerID="dc586816975c68b2e0607a33f40a1ef6b74f4a1267fb305584da3158ea91bdc7" exitCode=1 Jan 20 18:32:00 crc kubenswrapper[4773]: I0120 18:32:00.208246 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bccxn" event={"ID":"061a607e-1868-4fcf-b3ea-d51157511d41","Type":"ContainerDied","Data":"dc586816975c68b2e0607a33f40a1ef6b74f4a1267fb305584da3158ea91bdc7"} Jan 20 18:32:00 crc kubenswrapper[4773]: I0120 18:32:00.208305 4773 scope.go:117] "RemoveContainer" containerID="5980e4a9cdd846b335d853a6c7c4fa7c9862c37c4858418919eae03855d094f5" Jan 20 18:32:00 crc kubenswrapper[4773]: I0120 18:32:00.208724 4773 scope.go:117] "RemoveContainer" containerID="dc586816975c68b2e0607a33f40a1ef6b74f4a1267fb305584da3158ea91bdc7" Jan 20 18:32:00 crc kubenswrapper[4773]: E0120 18:32:00.209009 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-bccxn_openshift-multus(061a607e-1868-4fcf-b3ea-d51157511d41)\"" pod="openshift-multus/multus-bccxn" podUID="061a607e-1868-4fcf-b3ea-d51157511d41" Jan 20 18:32:00 crc kubenswrapper[4773]: I0120 18:32:00.446706 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:32:00 crc kubenswrapper[4773]: I0120 18:32:00.446807 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:32:00 crc kubenswrapper[4773]: I0120 18:32:00.446885 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:32:00 crc kubenswrapper[4773]: E0120 18:32:00.446875 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jpbd" podUID="3791c4b7-dcef-470d-a67e-a2c0bb004436" Jan 20 18:32:00 crc kubenswrapper[4773]: E0120 18:32:00.447051 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 18:32:00 crc kubenswrapper[4773]: E0120 18:32:00.447154 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 18:32:01 crc kubenswrapper[4773]: I0120 18:32:01.214625 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bccxn_061a607e-1868-4fcf-b3ea-d51157511d41/kube-multus/1.log" Jan 20 18:32:01 crc kubenswrapper[4773]: I0120 18:32:01.447110 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:32:01 crc kubenswrapper[4773]: E0120 18:32:01.447247 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 18:32:02 crc kubenswrapper[4773]: I0120 18:32:02.446659 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:32:02 crc kubenswrapper[4773]: I0120 18:32:02.446692 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:32:02 crc kubenswrapper[4773]: I0120 18:32:02.447601 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:32:02 crc kubenswrapper[4773]: E0120 18:32:02.447800 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-4jpbd" podUID="3791c4b7-dcef-470d-a67e-a2c0bb004436" Jan 20 18:32:02 crc kubenswrapper[4773]: E0120 18:32:02.448152 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 18:32:02 crc kubenswrapper[4773]: E0120 18:32:02.448232 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 18:32:03 crc kubenswrapper[4773]: I0120 18:32:03.447077 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:32:03 crc kubenswrapper[4773]: E0120 18:32:03.447240 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 18:32:04 crc kubenswrapper[4773]: I0120 18:32:04.446416 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:32:04 crc kubenswrapper[4773]: I0120 18:32:04.446489 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:32:04 crc kubenswrapper[4773]: I0120 18:32:04.446416 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:32:04 crc kubenswrapper[4773]: E0120 18:32:04.446578 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 18:32:04 crc kubenswrapper[4773]: E0120 18:32:04.446678 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jpbd" podUID="3791c4b7-dcef-470d-a67e-a2c0bb004436" Jan 20 18:32:04 crc kubenswrapper[4773]: E0120 18:32:04.446801 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 18:32:05 crc kubenswrapper[4773]: I0120 18:32:05.446978 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:32:05 crc kubenswrapper[4773]: E0120 18:32:05.447108 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 18:32:06 crc kubenswrapper[4773]: I0120 18:32:06.446041 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:32:06 crc kubenswrapper[4773]: E0120 18:32:06.446440 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 18:32:06 crc kubenswrapper[4773]: I0120 18:32:06.446189 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:32:06 crc kubenswrapper[4773]: I0120 18:32:06.446159 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:32:06 crc kubenswrapper[4773]: E0120 18:32:06.446703 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 18:32:06 crc kubenswrapper[4773]: E0120 18:32:06.446949 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jpbd" podUID="3791c4b7-dcef-470d-a67e-a2c0bb004436" Jan 20 18:32:07 crc kubenswrapper[4773]: E0120 18:32:07.388589 4773 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Jan 20 18:32:07 crc kubenswrapper[4773]: I0120 18:32:07.446638 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:32:07 crc kubenswrapper[4773]: E0120 18:32:07.448616 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 18:32:07 crc kubenswrapper[4773]: E0120 18:32:07.657858 4773 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Jan 20 18:32:08 crc kubenswrapper[4773]: I0120 18:32:08.446080 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:32:08 crc kubenswrapper[4773]: I0120 18:32:08.446154 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:32:08 crc kubenswrapper[4773]: I0120 18:32:08.446201 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:32:08 crc kubenswrapper[4773]: E0120 18:32:08.446818 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jpbd" podUID="3791c4b7-dcef-470d-a67e-a2c0bb004436" Jan 20 18:32:08 crc kubenswrapper[4773]: E0120 18:32:08.447160 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 18:32:08 crc kubenswrapper[4773]: E0120 18:32:08.447279 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 18:32:09 crc kubenswrapper[4773]: I0120 18:32:09.454197 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:32:09 crc kubenswrapper[4773]: E0120 18:32:09.455373 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 18:32:10 crc kubenswrapper[4773]: I0120 18:32:10.446363 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:32:10 crc kubenswrapper[4773]: I0120 18:32:10.446458 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:32:10 crc kubenswrapper[4773]: E0120 18:32:10.446539 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jpbd" podUID="3791c4b7-dcef-470d-a67e-a2c0bb004436" Jan 20 18:32:10 crc kubenswrapper[4773]: E0120 18:32:10.446688 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 18:32:10 crc kubenswrapper[4773]: I0120 18:32:10.446855 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:32:10 crc kubenswrapper[4773]: E0120 18:32:10.447127 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 18:32:11 crc kubenswrapper[4773]: I0120 18:32:11.446774 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:32:11 crc kubenswrapper[4773]: E0120 18:32:11.446971 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 18:32:12 crc kubenswrapper[4773]: I0120 18:32:12.447023 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:32:12 crc kubenswrapper[4773]: I0120 18:32:12.447066 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:32:12 crc kubenswrapper[4773]: I0120 18:32:12.447115 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:32:12 crc kubenswrapper[4773]: E0120 18:32:12.447210 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jpbd" podUID="3791c4b7-dcef-470d-a67e-a2c0bb004436" Jan 20 18:32:12 crc kubenswrapper[4773]: E0120 18:32:12.447277 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 18:32:12 crc kubenswrapper[4773]: E0120 18:32:12.447377 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 18:32:12 crc kubenswrapper[4773]: E0120 18:32:12.660046 4773 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Jan 20 18:32:13 crc kubenswrapper[4773]: I0120 18:32:13.446922 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:32:13 crc kubenswrapper[4773]: E0120 18:32:13.447079 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 18:32:13 crc kubenswrapper[4773]: I0120 18:32:13.447859 4773 scope.go:117] "RemoveContainer" containerID="2769f74bbb44f1c101c7e8101d7d9653de865bbf128573473d1780c9571bee67" Jan 20 18:32:14 crc kubenswrapper[4773]: I0120 18:32:14.252633 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qt89w_f354424d-7f22-42d6-8bd9-00e32e78c3d3/ovnkube-controller/3.log" Jan 20 18:32:14 crc kubenswrapper[4773]: I0120 18:32:14.254923 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" event={"ID":"f354424d-7f22-42d6-8bd9-00e32e78c3d3","Type":"ContainerStarted","Data":"5d2aab5769291bf8517a6be58643ec33bb4a9c92e32ef5c6a6be6258a94a21e0"} Jan 20 18:32:14 crc kubenswrapper[4773]: I0120 18:32:14.255366 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:32:14 crc kubenswrapper[4773]: I0120 18:32:14.281314 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" podStartSLOduration=108.281296109 podStartE2EDuration="1m48.281296109s" podCreationTimestamp="2026-01-20 18:30:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:32:14.279308361 +0000 UTC m=+127.201121385" watchObservedRunningTime="2026-01-20 18:32:14.281296109 +0000 UTC m=+127.203109133" Jan 20 18:32:14 crc kubenswrapper[4773]: I0120 18:32:14.446544 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:32:14 crc kubenswrapper[4773]: I0120 18:32:14.446608 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:32:14 crc kubenswrapper[4773]: E0120 18:32:14.446710 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jpbd" podUID="3791c4b7-dcef-470d-a67e-a2c0bb004436" Jan 20 18:32:14 crc kubenswrapper[4773]: I0120 18:32:14.446620 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:32:14 crc kubenswrapper[4773]: E0120 18:32:14.446861 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 18:32:14 crc kubenswrapper[4773]: E0120 18:32:14.447091 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 18:32:14 crc kubenswrapper[4773]: I0120 18:32:14.468234 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-4jpbd"] Jan 20 18:32:15 crc kubenswrapper[4773]: I0120 18:32:15.257911 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:32:15 crc kubenswrapper[4773]: E0120 18:32:15.258284 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jpbd" podUID="3791c4b7-dcef-470d-a67e-a2c0bb004436" Jan 20 18:32:15 crc kubenswrapper[4773]: I0120 18:32:15.446619 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:32:15 crc kubenswrapper[4773]: E0120 18:32:15.447072 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 18:32:15 crc kubenswrapper[4773]: I0120 18:32:15.447281 4773 scope.go:117] "RemoveContainer" containerID="dc586816975c68b2e0607a33f40a1ef6b74f4a1267fb305584da3158ea91bdc7" Jan 20 18:32:16 crc kubenswrapper[4773]: I0120 18:32:16.262362 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bccxn_061a607e-1868-4fcf-b3ea-d51157511d41/kube-multus/1.log" Jan 20 18:32:16 crc kubenswrapper[4773]: I0120 18:32:16.263089 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bccxn" event={"ID":"061a607e-1868-4fcf-b3ea-d51157511d41","Type":"ContainerStarted","Data":"12757148bdfa862c997ae9700dd354a77024b7a40e5d5398f8af800d1a220e65"} Jan 20 18:32:16 crc kubenswrapper[4773]: I0120 18:32:16.446919 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:32:16 crc kubenswrapper[4773]: I0120 18:32:16.447097 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:32:16 crc kubenswrapper[4773]: E0120 18:32:16.447103 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-4jpbd" podUID="3791c4b7-dcef-470d-a67e-a2c0bb004436" Jan 20 18:32:16 crc kubenswrapper[4773]: I0120 18:32:16.447142 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:32:16 crc kubenswrapper[4773]: E0120 18:32:16.447228 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 20 18:32:16 crc kubenswrapper[4773]: E0120 18:32:16.447375 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 20 18:32:17 crc kubenswrapper[4773]: I0120 18:32:17.446201 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:32:17 crc kubenswrapper[4773]: E0120 18:32:17.447326 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 20 18:32:18 crc kubenswrapper[4773]: I0120 18:32:18.446370 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:32:18 crc kubenswrapper[4773]: I0120 18:32:18.446414 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:32:18 crc kubenswrapper[4773]: I0120 18:32:18.446515 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:32:18 crc kubenswrapper[4773]: I0120 18:32:18.450796 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Jan 20 18:32:18 crc kubenswrapper[4773]: I0120 18:32:18.450887 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Jan 20 18:32:18 crc kubenswrapper[4773]: I0120 18:32:18.451127 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Jan 20 18:32:18 crc kubenswrapper[4773]: I0120 18:32:18.451167 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Jan 20 18:32:18 crc kubenswrapper[4773]: I0120 18:32:18.451488 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Jan 20 18:32:18 crc kubenswrapper[4773]: I0120 18:32:18.451404 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Jan 20 18:32:19 crc kubenswrapper[4773]: I0120 18:32:19.446961 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:32:20 crc kubenswrapper[4773]: I0120 18:32:20.365125 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.825595 4773 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.875040 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-7zslm"] Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.876007 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-bhrll"] Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.876256 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-7zslm" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.876726 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-bhrll" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.876973 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-bjtnp"] Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.877534 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-bjtnp" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.877826 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-gnvxz"] Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.878345 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gnvxz" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.879689 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-4jhgc"] Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.880045 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-4jhgc" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.881753 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.882037 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.882122 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.884257 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.884341 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.884802 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.885078 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.885102 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Jan 20 18:32:25 crc 
kubenswrapper[4773]: I0120 18:32:25.885693 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.885210 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.885429 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.886427 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.887065 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.887495 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.887845 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.888085 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.888187 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.888240 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-t27wz"] Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.888387 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 20 18:32:25 crc 
kubenswrapper[4773]: I0120 18:32:25.898762 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.897829 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.901919 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-9nh6h"] Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.909805 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-t27wz" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.927169 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.927795 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.928071 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.928439 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.928814 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.929579 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-9nh6h" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.931128 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/49deabd4-ebbe-4c07-bb79-105982db000a-node-pullsecrets\") pod \"apiserver-76f77b778f-7zslm\" (UID: \"49deabd4-ebbe-4c07-bb79-105982db000a\") " pod="openshift-apiserver/apiserver-76f77b778f-7zslm" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.931180 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/49deabd4-ebbe-4c07-bb79-105982db000a-etcd-serving-ca\") pod \"apiserver-76f77b778f-7zslm\" (UID: \"49deabd4-ebbe-4c07-bb79-105982db000a\") " pod="openshift-apiserver/apiserver-76f77b778f-7zslm" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.931211 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d49ef4e-91fb-4b98-89d9-65358c718967-config\") pod \"route-controller-manager-6576b87f9c-gnvxz\" (UID: \"1d49ef4e-91fb-4b98-89d9-65358c718967\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gnvxz" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.931231 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/49deabd4-ebbe-4c07-bb79-105982db000a-audit\") pod \"apiserver-76f77b778f-7zslm\" (UID: \"49deabd4-ebbe-4c07-bb79-105982db000a\") " pod="openshift-apiserver/apiserver-76f77b778f-7zslm" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.931250 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82jpq\" (UniqueName: 
\"kubernetes.io/projected/49deabd4-ebbe-4c07-bb79-105982db000a-kube-api-access-82jpq\") pod \"apiserver-76f77b778f-7zslm\" (UID: \"49deabd4-ebbe-4c07-bb79-105982db000a\") " pod="openshift-apiserver/apiserver-76f77b778f-7zslm" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.931271 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1d49ef4e-91fb-4b98-89d9-65358c718967-client-ca\") pod \"route-controller-manager-6576b87f9c-gnvxz\" (UID: \"1d49ef4e-91fb-4b98-89d9-65358c718967\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gnvxz" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.931288 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/419120bb-3f1b-4f21-adf5-ac057bd5dce6-client-ca\") pod \"controller-manager-879f6c89f-bjtnp\" (UID: \"419120bb-3f1b-4f21-adf5-ac057bd5dce6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-bjtnp" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.931303 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/49deabd4-ebbe-4c07-bb79-105982db000a-serving-cert\") pod \"apiserver-76f77b778f-7zslm\" (UID: \"49deabd4-ebbe-4c07-bb79-105982db000a\") " pod="openshift-apiserver/apiserver-76f77b778f-7zslm" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.931320 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/330a8450-c400-425d-9a46-e868a02fca27-auth-proxy-config\") pod \"machine-approver-56656f9798-t27wz\" (UID: \"330a8450-c400-425d-9a46-e868a02fca27\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-t27wz" Jan 20 18:32:25 crc 
kubenswrapper[4773]: I0120 18:32:25.931338 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d49ef4e-91fb-4b98-89d9-65358c718967-serving-cert\") pod \"route-controller-manager-6576b87f9c-gnvxz\" (UID: \"1d49ef4e-91fb-4b98-89d9-65358c718967\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gnvxz" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.931358 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sl4kb\" (UniqueName: \"kubernetes.io/projected/411d251b-6daa-4c45-9aeb-aa38def60a90-kube-api-access-sl4kb\") pod \"authentication-operator-69f744f599-4jhgc\" (UID: \"411d251b-6daa-4c45-9aeb-aa38def60a90\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4jhgc" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.931377 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49deabd4-ebbe-4c07-bb79-105982db000a-config\") pod \"apiserver-76f77b778f-7zslm\" (UID: \"49deabd4-ebbe-4c07-bb79-105982db000a\") " pod="openshift-apiserver/apiserver-76f77b778f-7zslm" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.931391 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/419120bb-3f1b-4f21-adf5-ac057bd5dce6-serving-cert\") pod \"controller-manager-879f6c89f-bjtnp\" (UID: \"419120bb-3f1b-4f21-adf5-ac057bd5dce6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-bjtnp" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.931409 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/a99225b3-64c7-4b39-807c-c97faa919977-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-bhrll\" (UID: \"a99225b3-64c7-4b39-807c-c97faa919977\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bhrll" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.931425 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/49deabd4-ebbe-4c07-bb79-105982db000a-audit-dir\") pod \"apiserver-76f77b778f-7zslm\" (UID: \"49deabd4-ebbe-4c07-bb79-105982db000a\") " pod="openshift-apiserver/apiserver-76f77b778f-7zslm" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.931439 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/411d251b-6daa-4c45-9aeb-aa38def60a90-serving-cert\") pod \"authentication-operator-69f744f599-4jhgc\" (UID: \"411d251b-6daa-4c45-9aeb-aa38def60a90\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4jhgc" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.931455 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/330a8450-c400-425d-9a46-e868a02fca27-config\") pod \"machine-approver-56656f9798-t27wz\" (UID: \"330a8450-c400-425d-9a46-e868a02fca27\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-t27wz" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.931477 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/49deabd4-ebbe-4c07-bb79-105982db000a-image-import-ca\") pod \"apiserver-76f77b778f-7zslm\" (UID: \"49deabd4-ebbe-4c07-bb79-105982db000a\") " pod="openshift-apiserver/apiserver-76f77b778f-7zslm" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 
18:32:25.931477 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.931495 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/419120bb-3f1b-4f21-adf5-ac057bd5dce6-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-bjtnp\" (UID: \"419120bb-3f1b-4f21-adf5-ac057bd5dce6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-bjtnp" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.931653 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/411d251b-6daa-4c45-9aeb-aa38def60a90-service-ca-bundle\") pod \"authentication-operator-69f744f599-4jhgc\" (UID: \"411d251b-6daa-4c45-9aeb-aa38def60a90\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4jhgc" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.931692 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2mg8\" (UniqueName: \"kubernetes.io/projected/419120bb-3f1b-4f21-adf5-ac057bd5dce6-kube-api-access-l2mg8\") pod \"controller-manager-879f6c89f-bjtnp\" (UID: \"419120bb-3f1b-4f21-adf5-ac057bd5dce6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-bjtnp" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.931708 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a99225b3-64c7-4b39-807c-c97faa919977-config\") pod \"machine-api-operator-5694c8668f-bhrll\" (UID: \"a99225b3-64c7-4b39-807c-c97faa919977\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bhrll" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.931723 4773 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/419120bb-3f1b-4f21-adf5-ac057bd5dce6-config\") pod \"controller-manager-879f6c89f-bjtnp\" (UID: \"419120bb-3f1b-4f21-adf5-ac057bd5dce6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-bjtnp" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.931739 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jz48f\" (UniqueName: \"kubernetes.io/projected/1d49ef4e-91fb-4b98-89d9-65358c718967-kube-api-access-jz48f\") pod \"route-controller-manager-6576b87f9c-gnvxz\" (UID: \"1d49ef4e-91fb-4b98-89d9-65358c718967\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gnvxz" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.931754 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/330a8450-c400-425d-9a46-e868a02fca27-machine-approver-tls\") pod \"machine-approver-56656f9798-t27wz\" (UID: \"330a8450-c400-425d-9a46-e868a02fca27\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-t27wz" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.931769 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a99225b3-64c7-4b39-807c-c97faa919977-images\") pod \"machine-api-operator-5694c8668f-bhrll\" (UID: \"a99225b3-64c7-4b39-807c-c97faa919977\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bhrll" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.931783 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/411d251b-6daa-4c45-9aeb-aa38def60a90-trusted-ca-bundle\") pod 
\"authentication-operator-69f744f599-4jhgc\" (UID: \"411d251b-6daa-4c45-9aeb-aa38def60a90\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4jhgc" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.931800 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49deabd4-ebbe-4c07-bb79-105982db000a-trusted-ca-bundle\") pod \"apiserver-76f77b778f-7zslm\" (UID: \"49deabd4-ebbe-4c07-bb79-105982db000a\") " pod="openshift-apiserver/apiserver-76f77b778f-7zslm" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.931815 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bts76\" (UniqueName: \"kubernetes.io/projected/330a8450-c400-425d-9a46-e868a02fca27-kube-api-access-bts76\") pod \"machine-approver-56656f9798-t27wz\" (UID: \"330a8450-c400-425d-9a46-e868a02fca27\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-t27wz" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.931843 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/49deabd4-ebbe-4c07-bb79-105982db000a-encryption-config\") pod \"apiserver-76f77b778f-7zslm\" (UID: \"49deabd4-ebbe-4c07-bb79-105982db000a\") " pod="openshift-apiserver/apiserver-76f77b778f-7zslm" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.931858 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/411d251b-6daa-4c45-9aeb-aa38def60a90-config\") pod \"authentication-operator-69f744f599-4jhgc\" (UID: \"411d251b-6daa-4c45-9aeb-aa38def60a90\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4jhgc" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.931883 4773 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbtnm\" (UniqueName: \"kubernetes.io/projected/a99225b3-64c7-4b39-807c-c97faa919977-kube-api-access-mbtnm\") pod \"machine-api-operator-5694c8668f-bhrll\" (UID: \"a99225b3-64c7-4b39-807c-c97faa919977\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bhrll" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.931902 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/49deabd4-ebbe-4c07-bb79-105982db000a-etcd-client\") pod \"apiserver-76f77b778f-7zslm\" (UID: \"49deabd4-ebbe-4c07-bb79-105982db000a\") " pod="openshift-apiserver/apiserver-76f77b778f-7zslm" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.933986 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kpq6z"] Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.934553 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-fcrzv"] Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.935103 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-l82hk"] Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.935647 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kpq6z" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.935666 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-l82hk" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.935740 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fcrzv" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.937494 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kw7s6"] Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.938105 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kw7s6" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.938607 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-xwc5v"] Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.939103 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-xwc5v" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.939191 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.939407 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.939488 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.941362 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.946355 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.946836 4773 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication-operator"/"kube-root-ca.crt" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.947118 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.950629 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.950867 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.951045 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.951192 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.951295 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.951403 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.951538 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.951735 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.951905 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.952712 4773 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-authentication/oauth-openshift-558db77b4-xpsls"] Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.953459 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.953626 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-kr4zh"] Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.954001 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-bkbfc"] Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.954222 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-xpsls" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.954591 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-bkbfc" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.955280 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.960150 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.960396 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.960669 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.960907 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.961462 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.961618 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.963046 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.963261 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.963809 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.964691 4773 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-v6wpc"] Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.965138 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.965342 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-v6wpc" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.965422 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.966949 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.967177 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.967422 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.967678 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.967824 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.967953 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.968074 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Jan 20 18:32:25 crc 
kubenswrapper[4773]: I0120 18:32:25.968103 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.968145 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.968252 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.968280 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.968368 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.968435 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.968494 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.968598 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.987407 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.988259 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.989450 4773 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.993146 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.993453 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.993653 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.994240 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.994305 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-k2x4p"] Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.994394 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.994300 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Jan 20 18:32:25 crc kubenswrapper[4773]: I0120 18:32:25.995575 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-n5dfl"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.011258 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.011304 4773 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.011503 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.011598 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.011683 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.011760 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.011798 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-n5dfl" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.011883 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.012509 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.012603 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.012835 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.012924 4773 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.013175 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.013762 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.014004 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k2x4p" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.019384 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6bf74"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.019996 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-2wmrt"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.020359 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-2wmrt" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.020404 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6bf74" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.024530 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.025188 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.027336 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.030367 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-x95ml"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.031029 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-pfxs9"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.031624 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-pfxs9" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.032128 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-x95ml" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.033213 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.034759 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-gnvxz"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.035085 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-c989h"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.036400 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-snlnq"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.041823 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-bhrll"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.038447 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-c989h" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.039165 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.038272 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d63bfb8-8ecc-43f3-8931-cc09c815c580-config\") pod \"openshift-apiserver-operator-796bbdcf4f-kpq6z\" (UID: \"9d63bfb8-8ecc-43f3-8931-cc09c815c580\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kpq6z" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.042995 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ba3736bb-3d36-4a0c-91fa-85f410849312-console-serving-cert\") pod \"console-f9d7485db-9nh6h\" (UID: \"ba3736bb-3d36-4a0c-91fa-85f410849312\") " pod="openshift-console/console-f9d7485db-9nh6h" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.043088 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/411d251b-6daa-4c45-9aeb-aa38def60a90-service-ca-bundle\") pod \"authentication-operator-69f744f599-4jhgc\" (UID: \"411d251b-6daa-4c45-9aeb-aa38def60a90\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4jhgc" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.043170 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vqxj\" (UniqueName: \"kubernetes.io/projected/79162e32-ee8c-4fcc-8911-0f95d41cd110-kube-api-access-9vqxj\") pod \"openshift-controller-manager-operator-756b6f6bc6-n5dfl\" (UID: 
\"79162e32-ee8c-4fcc-8911-0f95d41cd110\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-n5dfl" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.043243 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4p5ft\" (UniqueName: \"kubernetes.io/projected/cf25ec9b-96c5-4129-958f-35acbc34a20d-kube-api-access-4p5ft\") pod \"oauth-openshift-558db77b4-xpsls\" (UID: \"cf25ec9b-96c5-4129-958f-35acbc34a20d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xpsls" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.043319 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6bae1b17-1679-4be9-9717-66c5a80ad425-serving-cert\") pod \"console-operator-58897d9998-xwc5v\" (UID: \"6bae1b17-1679-4be9-9717-66c5a80ad425\") " pod="openshift-console-operator/console-operator-58897d9998-xwc5v" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.043414 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2mg8\" (UniqueName: \"kubernetes.io/projected/419120bb-3f1b-4f21-adf5-ac057bd5dce6-kube-api-access-l2mg8\") pod \"controller-manager-879f6c89f-bjtnp\" (UID: \"419120bb-3f1b-4f21-adf5-ac057bd5dce6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-bjtnp" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.043495 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntrkt\" (UniqueName: \"kubernetes.io/projected/e98bf97b-784a-4a99-8eff-20e6fc687876-kube-api-access-ntrkt\") pod \"migrator-59844c95c7-c989h\" (UID: \"e98bf97b-784a-4a99-8eff-20e6fc687876\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-c989h" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.043573 4773 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a99225b3-64c7-4b39-807c-c97faa919977-config\") pod \"machine-api-operator-5694c8668f-bhrll\" (UID: \"a99225b3-64c7-4b39-807c-c97faa919977\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bhrll" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.043646 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/419120bb-3f1b-4f21-adf5-ac057bd5dce6-config\") pod \"controller-manager-879f6c89f-bjtnp\" (UID: \"419120bb-3f1b-4f21-adf5-ac057bd5dce6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-bjtnp" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.043720 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jz48f\" (UniqueName: \"kubernetes.io/projected/1d49ef4e-91fb-4b98-89d9-65358c718967-kube-api-access-jz48f\") pod \"route-controller-manager-6576b87f9c-gnvxz\" (UID: \"1d49ef4e-91fb-4b98-89d9-65358c718967\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gnvxz" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.043792 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/330a8450-c400-425d-9a46-e868a02fca27-machine-approver-tls\") pod \"machine-approver-56656f9798-t27wz\" (UID: \"330a8450-c400-425d-9a46-e868a02fca27\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-t27wz" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.043865 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-xpsls\" (UID: \"cf25ec9b-96c5-4129-958f-35acbc34a20d\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-xpsls" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.043955 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a99225b3-64c7-4b39-807c-c97faa919977-images\") pod \"machine-api-operator-5694c8668f-bhrll\" (UID: \"a99225b3-64c7-4b39-807c-c97faa919977\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bhrll" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.044035 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/411d251b-6daa-4c45-9aeb-aa38def60a90-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-4jhgc\" (UID: \"411d251b-6daa-4c45-9aeb-aa38def60a90\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4jhgc" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.044113 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-xpsls\" (UID: \"cf25ec9b-96c5-4129-958f-35acbc34a20d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xpsls" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.044192 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49deabd4-ebbe-4c07-bb79-105982db000a-trusted-ca-bundle\") pod \"apiserver-76f77b778f-7zslm\" (UID: \"49deabd4-ebbe-4c07-bb79-105982db000a\") " pod="openshift-apiserver/apiserver-76f77b778f-7zslm" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.044286 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/bbce412e-616a-465b-bb42-da842edb8110-serving-cert\") pod \"etcd-operator-b45778765-2wmrt\" (UID: \"bbce412e-616a-465b-bb42-da842edb8110\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2wmrt" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.044359 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dsp4s\" (UniqueName: \"kubernetes.io/projected/6bae1b17-1679-4be9-9717-66c5a80ad425-kube-api-access-dsp4s\") pod \"console-operator-58897d9998-xwc5v\" (UID: \"6bae1b17-1679-4be9-9717-66c5a80ad425\") " pod="openshift-console-operator/console-operator-58897d9998-xwc5v" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.044436 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bts76\" (UniqueName: \"kubernetes.io/projected/330a8450-c400-425d-9a46-e868a02fca27-kube-api-access-bts76\") pod \"machine-approver-56656f9798-t27wz\" (UID: \"330a8450-c400-425d-9a46-e868a02fca27\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-t27wz" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.044519 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-xpsls\" (UID: \"cf25ec9b-96c5-4129-958f-35acbc34a20d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xpsls" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.044593 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-xpsls\" (UID: \"cf25ec9b-96c5-4129-958f-35acbc34a20d\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-xpsls" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.044662 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f7552ac-b3a0-4bfa-ab3e-34e46ed83cff-config\") pod \"kube-apiserver-operator-766d6c64bb-v6wpc\" (UID: \"9f7552ac-b3a0-4bfa-ab3e-34e46ed83cff\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-v6wpc" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.044729 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63fd2de1-85c4-4f01-8524-7b93c777592d-config\") pod \"kube-controller-manager-operator-78b949d7b-6bf74\" (UID: \"63fd2de1-85c4-4f01-8524-7b93c777592d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6bf74" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.044816 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dzn2\" (UniqueName: \"kubernetes.io/projected/47548d0b-9447-4862-b717-9427ae40c49a-kube-api-access-5dzn2\") pod \"dns-operator-744455d44c-pfxs9\" (UID: \"47548d0b-9447-4862-b717-9427ae40c49a\") " pod="openshift-dns-operator/dns-operator-744455d44c-pfxs9" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.044890 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/49deabd4-ebbe-4c07-bb79-105982db000a-encryption-config\") pod \"apiserver-76f77b778f-7zslm\" (UID: \"49deabd4-ebbe-4c07-bb79-105982db000a\") " pod="openshift-apiserver/apiserver-76f77b778f-7zslm" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.046978 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/411d251b-6daa-4c45-9aeb-aa38def60a90-config\") pod \"authentication-operator-69f744f599-4jhgc\" (UID: \"411d251b-6daa-4c45-9aeb-aa38def60a90\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4jhgc" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.047112 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b3570207-5cb9-4481-a15a-d0bb9312a84b-etcd-client\") pod \"apiserver-7bbb656c7d-k2x4p\" (UID: \"b3570207-5cb9-4481-a15a-d0bb9312a84b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k2x4p" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.047202 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.046380 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-x6fwb"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.042010 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-snlnq" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.047404 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.047519 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.047608 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.040823 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.048598 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.048765 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.048858 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.048775 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.048880 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a99225b3-64c7-4b39-807c-c97faa919977-images\") pod \"machine-api-operator-5694c8668f-bhrll\" (UID: \"a99225b3-64c7-4b39-807c-c97faa919977\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-bhrll" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.047216 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mbtnm\" (UniqueName: \"kubernetes.io/projected/a99225b3-64c7-4b39-807c-c97faa919977-kube-api-access-mbtnm\") pod \"machine-api-operator-5694c8668f-bhrll\" (UID: \"a99225b3-64c7-4b39-807c-c97faa919977\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bhrll" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.048989 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/49deabd4-ebbe-4c07-bb79-105982db000a-etcd-client\") pod \"apiserver-76f77b778f-7zslm\" (UID: \"49deabd4-ebbe-4c07-bb79-105982db000a\") " pod="openshift-apiserver/apiserver-76f77b778f-7zslm" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.049016 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b3570207-5cb9-4481-a15a-d0bb9312a84b-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-k2x4p\" (UID: \"b3570207-5cb9-4481-a15a-d0bb9312a84b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k2x4p" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.049037 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzp5t\" (UniqueName: \"kubernetes.io/projected/65b41ab6-6253-4ee5-87f2-50ed05610e03-kube-api-access-hzp5t\") pod \"cluster-image-registry-operator-dc59b4c8b-kw7s6\" (UID: \"65b41ab6-6253-4ee5-87f2-50ed05610e03\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kw7s6" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.049073 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-xpsls\" (UID: \"cf25ec9b-96c5-4129-958f-35acbc34a20d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xpsls" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.049091 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-4jhgc"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.049092 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/65b41ab6-6253-4ee5-87f2-50ed05610e03-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-kw7s6\" (UID: \"65b41ab6-6253-4ee5-87f2-50ed05610e03\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kw7s6" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.049138 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ba3736bb-3d36-4a0c-91fa-85f410849312-oauth-serving-cert\") pod \"console-f9d7485db-9nh6h\" (UID: \"ba3736bb-3d36-4a0c-91fa-85f410849312\") " pod="openshift-console/console-f9d7485db-9nh6h" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.049167 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/49deabd4-ebbe-4c07-bb79-105982db000a-node-pullsecrets\") pod \"apiserver-76f77b778f-7zslm\" (UID: \"49deabd4-ebbe-4c07-bb79-105982db000a\") " pod="openshift-apiserver/apiserver-76f77b778f-7zslm" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.049177 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-x6fwb" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.049191 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmmk6\" (UniqueName: \"kubernetes.io/projected/c5d6a7d8-1840-4f2c-9fee-694a671f28cd-kube-api-access-dmmk6\") pod \"cluster-samples-operator-665b6dd947-l82hk\" (UID: \"c5d6a7d8-1840-4f2c-9fee-694a671f28cd\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-l82hk" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.049215 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/275484db-b3bc-4027-a1d7-a67ab3c71439-serving-cert\") pod \"openshift-config-operator-7777fb866f-fcrzv\" (UID: \"275484db-b3bc-4027-a1d7-a67ab3c71439\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fcrzv" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.049241 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/49deabd4-ebbe-4c07-bb79-105982db000a-etcd-serving-ca\") pod \"apiserver-76f77b778f-7zslm\" (UID: \"49deabd4-ebbe-4c07-bb79-105982db000a\") " pod="openshift-apiserver/apiserver-76f77b778f-7zslm" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.049261 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b3570207-5cb9-4481-a15a-d0bb9312a84b-encryption-config\") pod \"apiserver-7bbb656c7d-k2x4p\" (UID: \"b3570207-5cb9-4481-a15a-d0bb9312a84b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k2x4p" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.049317 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9f7552ac-b3a0-4bfa-ab3e-34e46ed83cff-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-v6wpc\" (UID: \"9f7552ac-b3a0-4bfa-ab3e-34e46ed83cff\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-v6wpc" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.049345 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d49ef4e-91fb-4b98-89d9-65358c718967-config\") pod \"route-controller-manager-6576b87f9c-gnvxz\" (UID: \"1d49ef4e-91fb-4b98-89d9-65358c718967\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gnvxz" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.049373 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79162e32-ee8c-4fcc-8911-0f95d41cd110-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-n5dfl\" (UID: \"79162e32-ee8c-4fcc-8911-0f95d41cd110\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-n5dfl" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.049377 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/49deabd4-ebbe-4c07-bb79-105982db000a-node-pullsecrets\") pod \"apiserver-76f77b778f-7zslm\" (UID: \"49deabd4-ebbe-4c07-bb79-105982db000a\") " pod="openshift-apiserver/apiserver-76f77b778f-7zslm" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.049392 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cf25ec9b-96c5-4129-958f-35acbc34a20d-audit-dir\") pod \"oauth-openshift-558db77b4-xpsls\" (UID: \"cf25ec9b-96c5-4129-958f-35acbc34a20d\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-xpsls" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.049394 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/411d251b-6daa-4c45-9aeb-aa38def60a90-service-ca-bundle\") pod \"authentication-operator-69f744f599-4jhgc\" (UID: \"411d251b-6daa-4c45-9aeb-aa38def60a90\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4jhgc" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.049430 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b3570207-5cb9-4481-a15a-d0bb9312a84b-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-k2x4p\" (UID: \"b3570207-5cb9-4481-a15a-d0bb9312a84b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k2x4p" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.049557 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pq64n\" (UniqueName: \"kubernetes.io/projected/bbce412e-616a-465b-bb42-da842edb8110-kube-api-access-pq64n\") pod \"etcd-operator-b45778765-2wmrt\" (UID: \"bbce412e-616a-465b-bb42-da842edb8110\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2wmrt" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.049614 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6bae1b17-1679-4be9-9717-66c5a80ad425-config\") pod \"console-operator-58897d9998-xwc5v\" (UID: \"6bae1b17-1679-4be9-9717-66c5a80ad425\") " pod="openshift-console-operator/console-operator-58897d9998-xwc5v" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.049638 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/6bae1b17-1679-4be9-9717-66c5a80ad425-trusted-ca\") pod \"console-operator-58897d9998-xwc5v\" (UID: \"6bae1b17-1679-4be9-9717-66c5a80ad425\") " pod="openshift-console-operator/console-operator-58897d9998-xwc5v" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.049674 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ba3736bb-3d36-4a0c-91fa-85f410849312-service-ca\") pod \"console-f9d7485db-9nh6h\" (UID: \"ba3736bb-3d36-4a0c-91fa-85f410849312\") " pod="openshift-console/console-f9d7485db-9nh6h" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.049694 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/63fd2de1-85c4-4f01-8524-7b93c777592d-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-6bf74\" (UID: \"63fd2de1-85c4-4f01-8524-7b93c777592d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6bf74" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.049717 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/47548d0b-9447-4862-b717-9427ae40c49a-metrics-tls\") pod \"dns-operator-744455d44c-pfxs9\" (UID: \"47548d0b-9447-4862-b717-9427ae40c49a\") " pod="openshift-dns-operator/dns-operator-744455d44c-pfxs9" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.049765 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ddbv\" (UniqueName: \"kubernetes.io/projected/ba3736bb-3d36-4a0c-91fa-85f410849312-kube-api-access-8ddbv\") pod \"console-f9d7485db-9nh6h\" (UID: \"ba3736bb-3d36-4a0c-91fa-85f410849312\") " pod="openshift-console/console-f9d7485db-9nh6h" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 
18:32:26.049803 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/49deabd4-ebbe-4c07-bb79-105982db000a-audit\") pod \"apiserver-76f77b778f-7zslm\" (UID: \"49deabd4-ebbe-4c07-bb79-105982db000a\") " pod="openshift-apiserver/apiserver-76f77b778f-7zslm" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.049845 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82jpq\" (UniqueName: \"kubernetes.io/projected/49deabd4-ebbe-4c07-bb79-105982db000a-kube-api-access-82jpq\") pod \"apiserver-76f77b778f-7zslm\" (UID: \"49deabd4-ebbe-4c07-bb79-105982db000a\") " pod="openshift-apiserver/apiserver-76f77b778f-7zslm" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.049871 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b3570207-5cb9-4481-a15a-d0bb9312a84b-serving-cert\") pod \"apiserver-7bbb656c7d-k2x4p\" (UID: \"b3570207-5cb9-4481-a15a-d0bb9312a84b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k2x4p" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.049883 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/49deabd4-ebbe-4c07-bb79-105982db000a-etcd-serving-ca\") pod \"apiserver-76f77b778f-7zslm\" (UID: \"49deabd4-ebbe-4c07-bb79-105982db000a\") " pod="openshift-apiserver/apiserver-76f77b778f-7zslm" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.049911 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzjhr\" (UniqueName: \"kubernetes.io/projected/fec9cba4-b7cb-46ca-90a4-af0d5114fee8-kube-api-access-pzjhr\") pod \"downloads-7954f5f757-bkbfc\" (UID: \"fec9cba4-b7cb-46ca-90a4-af0d5114fee8\") " pod="openshift-console/downloads-7954f5f757-bkbfc" Jan 20 18:32:26 crc 
kubenswrapper[4773]: I0120 18:32:26.049958 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/65b41ab6-6253-4ee5-87f2-50ed05610e03-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-kw7s6\" (UID: \"65b41ab6-6253-4ee5-87f2-50ed05610e03\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kw7s6" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.049990 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1d49ef4e-91fb-4b98-89d9-65358c718967-client-ca\") pod \"route-controller-manager-6576b87f9c-gnvxz\" (UID: \"1d49ef4e-91fb-4b98-89d9-65358c718967\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gnvxz" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.050025 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-xpsls\" (UID: \"cf25ec9b-96c5-4129-958f-35acbc34a20d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xpsls" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.050061 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ba3736bb-3d36-4a0c-91fa-85f410849312-console-config\") pod \"console-f9d7485db-9nh6h\" (UID: \"ba3736bb-3d36-4a0c-91fa-85f410849312\") " pod="openshift-console/console-f9d7485db-9nh6h" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.050095 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/419120bb-3f1b-4f21-adf5-ac057bd5dce6-client-ca\") pod 
\"controller-manager-879f6c89f-bjtnp\" (UID: \"419120bb-3f1b-4f21-adf5-ac057bd5dce6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-bjtnp" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.050117 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/49deabd4-ebbe-4c07-bb79-105982db000a-serving-cert\") pod \"apiserver-76f77b778f-7zslm\" (UID: \"49deabd4-ebbe-4c07-bb79-105982db000a\") " pod="openshift-apiserver/apiserver-76f77b778f-7zslm" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.050143 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/330a8450-c400-425d-9a46-e868a02fca27-auth-proxy-config\") pod \"machine-approver-56656f9798-t27wz\" (UID: \"330a8450-c400-425d-9a46-e868a02fca27\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-t27wz" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.050163 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/c5d6a7d8-1840-4f2c-9fee-694a671f28cd-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-l82hk\" (UID: \"c5d6a7d8-1840-4f2c-9fee-694a671f28cd\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-l82hk" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.050182 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/63fd2de1-85c4-4f01-8524-7b93c777592d-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-6bf74\" (UID: \"63fd2de1-85c4-4f01-8524-7b93c777592d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6bf74" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.050205 4773 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/bbce412e-616a-465b-bb42-da842edb8110-etcd-client\") pod \"etcd-operator-b45778765-2wmrt\" (UID: \"bbce412e-616a-465b-bb42-da842edb8110\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2wmrt" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.050225 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/275484db-b3bc-4027-a1d7-a67ab3c71439-available-featuregates\") pod \"openshift-config-operator-7777fb866f-fcrzv\" (UID: \"275484db-b3bc-4027-a1d7-a67ab3c71439\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fcrzv" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.050245 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/79162e32-ee8c-4fcc-8911-0f95d41cd110-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-n5dfl\" (UID: \"79162e32-ee8c-4fcc-8911-0f95d41cd110\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-n5dfl" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.050268 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-xpsls\" (UID: \"cf25ec9b-96c5-4129-958f-35acbc34a20d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xpsls" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.050284 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: 
\"kubernetes.io/configmap/b3570207-5cb9-4481-a15a-d0bb9312a84b-audit-policies\") pod \"apiserver-7bbb656c7d-k2x4p\" (UID: \"b3570207-5cb9-4481-a15a-d0bb9312a84b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k2x4p" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.050308 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d49ef4e-91fb-4b98-89d9-65358c718967-serving-cert\") pod \"route-controller-manager-6576b87f9c-gnvxz\" (UID: \"1d49ef4e-91fb-4b98-89d9-65358c718967\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gnvxz" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.050327 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rqrr\" (UniqueName: \"kubernetes.io/projected/275484db-b3bc-4027-a1d7-a67ab3c71439-kube-api-access-4rqrr\") pod \"openshift-config-operator-7777fb866f-fcrzv\" (UID: \"275484db-b3bc-4027-a1d7-a67ab3c71439\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fcrzv" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.050345 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbce412e-616a-465b-bb42-da842edb8110-config\") pod \"etcd-operator-b45778765-2wmrt\" (UID: \"bbce412e-616a-465b-bb42-da842edb8110\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2wmrt" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.050365 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7pps\" (UniqueName: \"kubernetes.io/projected/9d63bfb8-8ecc-43f3-8931-cc09c815c580-kube-api-access-k7pps\") pod \"openshift-apiserver-operator-796bbdcf4f-kpq6z\" (UID: \"9d63bfb8-8ecc-43f3-8931-cc09c815c580\") " 
pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kpq6z" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.050389 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sl4kb\" (UniqueName: \"kubernetes.io/projected/411d251b-6daa-4c45-9aeb-aa38def60a90-kube-api-access-sl4kb\") pod \"authentication-operator-69f744f599-4jhgc\" (UID: \"411d251b-6daa-4c45-9aeb-aa38def60a90\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4jhgc" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.050405 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-xpsls\" (UID: \"cf25ec9b-96c5-4129-958f-35acbc34a20d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xpsls" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.050421 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-xpsls\" (UID: \"cf25ec9b-96c5-4129-958f-35acbc34a20d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xpsls" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.050438 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-xpsls\" (UID: \"cf25ec9b-96c5-4129-958f-35acbc34a20d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xpsls" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 
18:32:26.050455 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b3570207-5cb9-4481-a15a-d0bb9312a84b-audit-dir\") pod \"apiserver-7bbb656c7d-k2x4p\" (UID: \"b3570207-5cb9-4481-a15a-d0bb9312a84b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k2x4p" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.050475 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49deabd4-ebbe-4c07-bb79-105982db000a-config\") pod \"apiserver-76f77b778f-7zslm\" (UID: \"49deabd4-ebbe-4c07-bb79-105982db000a\") " pod="openshift-apiserver/apiserver-76f77b778f-7zslm" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.050493 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/419120bb-3f1b-4f21-adf5-ac057bd5dce6-serving-cert\") pod \"controller-manager-879f6c89f-bjtnp\" (UID: \"419120bb-3f1b-4f21-adf5-ac057bd5dce6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-bjtnp" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.050515 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/bbce412e-616a-465b-bb42-da842edb8110-etcd-ca\") pod \"etcd-operator-b45778765-2wmrt\" (UID: \"bbce412e-616a-465b-bb42-da842edb8110\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2wmrt" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.050530 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ba3736bb-3d36-4a0c-91fa-85f410849312-console-oauth-config\") pod \"console-f9d7485db-9nh6h\" (UID: \"ba3736bb-3d36-4a0c-91fa-85f410849312\") " pod="openshift-console/console-f9d7485db-9nh6h" Jan 20 18:32:26 crc 
kubenswrapper[4773]: I0120 18:32:26.050548 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ba3736bb-3d36-4a0c-91fa-85f410849312-trusted-ca-bundle\") pod \"console-f9d7485db-9nh6h\" (UID: \"ba3736bb-3d36-4a0c-91fa-85f410849312\") " pod="openshift-console/console-f9d7485db-9nh6h" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.050567 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d63bfb8-8ecc-43f3-8931-cc09c815c580-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-kpq6z\" (UID: \"9d63bfb8-8ecc-43f3-8931-cc09c815c580\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kpq6z" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.050584 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-xpsls\" (UID: \"cf25ec9b-96c5-4129-958f-35acbc34a20d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xpsls" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.050602 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrpqv\" (UniqueName: \"kubernetes.io/projected/b3570207-5cb9-4481-a15a-d0bb9312a84b-kube-api-access-xrpqv\") pod \"apiserver-7bbb656c7d-k2x4p\" (UID: \"b3570207-5cb9-4481-a15a-d0bb9312a84b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k2x4p" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.050754 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/bbce412e-616a-465b-bb42-da842edb8110-etcd-service-ca\") pod \"etcd-operator-b45778765-2wmrt\" (UID: \"bbce412e-616a-465b-bb42-da842edb8110\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2wmrt" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.050811 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/a99225b3-64c7-4b39-807c-c97faa919977-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-bhrll\" (UID: \"a99225b3-64c7-4b39-807c-c97faa919977\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bhrll" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.050839 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/49deabd4-ebbe-4c07-bb79-105982db000a-audit-dir\") pod \"apiserver-76f77b778f-7zslm\" (UID: \"49deabd4-ebbe-4c07-bb79-105982db000a\") " pod="openshift-apiserver/apiserver-76f77b778f-7zslm" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.050880 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/411d251b-6daa-4c45-9aeb-aa38def60a90-serving-cert\") pod \"authentication-operator-69f744f599-4jhgc\" (UID: \"411d251b-6daa-4c45-9aeb-aa38def60a90\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4jhgc" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.050903 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/65b41ab6-6253-4ee5-87f2-50ed05610e03-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-kw7s6\" (UID: \"65b41ab6-6253-4ee5-87f2-50ed05610e03\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kw7s6" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.050956 
4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/330a8450-c400-425d-9a46-e868a02fca27-config\") pod \"machine-approver-56656f9798-t27wz\" (UID: \"330a8450-c400-425d-9a46-e868a02fca27\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-t27wz" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.050990 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/49deabd4-ebbe-4c07-bb79-105982db000a-image-import-ca\") pod \"apiserver-76f77b778f-7zslm\" (UID: \"49deabd4-ebbe-4c07-bb79-105982db000a\") " pod="openshift-apiserver/apiserver-76f77b778f-7zslm" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.051012 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/419120bb-3f1b-4f21-adf5-ac057bd5dce6-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-bjtnp\" (UID: \"419120bb-3f1b-4f21-adf5-ac057bd5dce6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-bjtnp" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.051034 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/cf25ec9b-96c5-4129-958f-35acbc34a20d-audit-policies\") pod \"oauth-openshift-558db77b4-xpsls\" (UID: \"cf25ec9b-96c5-4129-958f-35acbc34a20d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xpsls" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.051055 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9f7552ac-b3a0-4bfa-ab3e-34e46ed83cff-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-v6wpc\" (UID: \"9f7552ac-b3a0-4bfa-ab3e-34e46ed83cff\") " 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-v6wpc" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.051086 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1d49ef4e-91fb-4b98-89d9-65358c718967-client-ca\") pod \"route-controller-manager-6576b87f9c-gnvxz\" (UID: \"1d49ef4e-91fb-4b98-89d9-65358c718967\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gnvxz" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.051177 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/411d251b-6daa-4c45-9aeb-aa38def60a90-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-4jhgc\" (UID: \"411d251b-6daa-4c45-9aeb-aa38def60a90\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4jhgc" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.051206 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/49deabd4-ebbe-4c07-bb79-105982db000a-config\") pod \"apiserver-76f77b778f-7zslm\" (UID: \"49deabd4-ebbe-4c07-bb79-105982db000a\") " pod="openshift-apiserver/apiserver-76f77b778f-7zslm" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.052242 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/49deabd4-ebbe-4c07-bb79-105982db000a-encryption-config\") pod \"apiserver-76f77b778f-7zslm\" (UID: \"49deabd4-ebbe-4c07-bb79-105982db000a\") " pod="openshift-apiserver/apiserver-76f77b778f-7zslm" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.052441 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/411d251b-6daa-4c45-9aeb-aa38def60a90-config\") pod \"authentication-operator-69f744f599-4jhgc\" (UID: 
\"411d251b-6daa-4c45-9aeb-aa38def60a90\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4jhgc" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.052807 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/419120bb-3f1b-4f21-adf5-ac057bd5dce6-client-ca\") pod \"controller-manager-879f6c89f-bjtnp\" (UID: \"419120bb-3f1b-4f21-adf5-ac057bd5dce6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-bjtnp" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.053032 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/49deabd4-ebbe-4c07-bb79-105982db000a-audit-dir\") pod \"apiserver-76f77b778f-7zslm\" (UID: \"49deabd4-ebbe-4c07-bb79-105982db000a\") " pod="openshift-apiserver/apiserver-76f77b778f-7zslm" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.053652 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/419120bb-3f1b-4f21-adf5-ac057bd5dce6-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-bjtnp\" (UID: \"419120bb-3f1b-4f21-adf5-ac057bd5dce6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-bjtnp" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.054059 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/49deabd4-ebbe-4c07-bb79-105982db000a-serving-cert\") pod \"apiserver-76f77b778f-7zslm\" (UID: \"49deabd4-ebbe-4c07-bb79-105982db000a\") " pod="openshift-apiserver/apiserver-76f77b778f-7zslm" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.054151 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/49deabd4-ebbe-4c07-bb79-105982db000a-image-import-ca\") pod \"apiserver-76f77b778f-7zslm\" (UID: 
\"49deabd4-ebbe-4c07-bb79-105982db000a\") " pod="openshift-apiserver/apiserver-76f77b778f-7zslm" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.055082 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/49deabd4-ebbe-4c07-bb79-105982db000a-audit\") pod \"apiserver-76f77b778f-7zslm\" (UID: \"49deabd4-ebbe-4c07-bb79-105982db000a\") " pod="openshift-apiserver/apiserver-76f77b778f-7zslm" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.055149 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d49ef4e-91fb-4b98-89d9-65358c718967-config\") pod \"route-controller-manager-6576b87f9c-gnvxz\" (UID: \"1d49ef4e-91fb-4b98-89d9-65358c718967\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gnvxz" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.055396 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/411d251b-6daa-4c45-9aeb-aa38def60a90-serving-cert\") pod \"authentication-operator-69f744f599-4jhgc\" (UID: \"411d251b-6daa-4c45-9aeb-aa38def60a90\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4jhgc" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.055782 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49deabd4-ebbe-4c07-bb79-105982db000a-trusted-ca-bundle\") pod \"apiserver-76f77b778f-7zslm\" (UID: \"49deabd4-ebbe-4c07-bb79-105982db000a\") " pod="openshift-apiserver/apiserver-76f77b778f-7zslm" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.055889 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a99225b3-64c7-4b39-807c-c97faa919977-config\") pod \"machine-api-operator-5694c8668f-bhrll\" (UID: 
\"a99225b3-64c7-4b39-807c-c97faa919977\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bhrll" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.055915 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/419120bb-3f1b-4f21-adf5-ac057bd5dce6-serving-cert\") pod \"controller-manager-879f6c89f-bjtnp\" (UID: \"419120bb-3f1b-4f21-adf5-ac057bd5dce6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-bjtnp" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.055946 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/330a8450-c400-425d-9a46-e868a02fca27-config\") pod \"machine-approver-56656f9798-t27wz\" (UID: \"330a8450-c400-425d-9a46-e868a02fca27\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-t27wz" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.056496 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/419120bb-3f1b-4f21-adf5-ac057bd5dce6-config\") pod \"controller-manager-879f6c89f-bjtnp\" (UID: \"419120bb-3f1b-4f21-adf5-ac057bd5dce6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-bjtnp" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.056992 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/330a8450-c400-425d-9a46-e868a02fca27-auth-proxy-config\") pod \"machine-approver-56656f9798-t27wz\" (UID: \"330a8450-c400-425d-9a46-e868a02fca27\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-t27wz" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.057107 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/330a8450-c400-425d-9a46-e868a02fca27-machine-approver-tls\") pod 
\"machine-approver-56656f9798-t27wz\" (UID: \"330a8450-c400-425d-9a46-e868a02fca27\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-t27wz" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.057471 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/a99225b3-64c7-4b39-807c-c97faa919977-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-bhrll\" (UID: \"a99225b3-64c7-4b39-807c-c97faa919977\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bhrll" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.058578 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.058860 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d49ef4e-91fb-4b98-89d9-65358c718967-serving-cert\") pod \"route-controller-manager-6576b87f9c-gnvxz\" (UID: \"1d49ef4e-91fb-4b98-89d9-65358c718967\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gnvxz" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.060967 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-ff9dd"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.063306 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-ff9dd" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.064738 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-857hw"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.067904 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-87l5s"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.069616 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-87l5s" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.069897 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-857hw" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.073329 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29482230-pqppn"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.074268 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-7zslm"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.074380 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29482230-pqppn" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.075951 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lqvwq"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.078208 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.078494 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lqvwq" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.078502 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xwzh9"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.080261 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xwzh9" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.088830 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mrfcm"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.089481 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/49deabd4-ebbe-4c07-bb79-105982db000a-etcd-client\") pod \"apiserver-76f77b778f-7zslm\" (UID: \"49deabd4-ebbe-4c07-bb79-105982db000a\") " pod="openshift-apiserver/apiserver-76f77b778f-7zslm" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.090047 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mrfcm" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.092018 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-425qm"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.093681 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cfslv"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.094003 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-425qm" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.094328 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cfslv" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.096371 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-6qw48"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.098116 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.098829 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-dfqb5"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.099170 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6qw48" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.099486 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-dfqb5" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.103430 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-cv7zc"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.103873 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-cv7zc" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.107163 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-llx65"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.107846 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-llx65" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.109620 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-ks6ps"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.110504 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-ks6ps" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.111034 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-bjtnp"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.112464 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-w8vpz"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.113811 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-l82hk"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.113924 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-w8vpz" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.114598 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-fcrzv"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.118549 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.118599 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-xpsls"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.118622 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-2wmrt"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.119407 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kw7s6"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.120586 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-v6wpc"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.121655 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-n5dfl"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.122775 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kpq6z"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.123835 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-pfxs9"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.124970 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-bkbfc"] Jan 20 18:32:26 crc 
kubenswrapper[4773]: I0120 18:32:26.126306 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-xwc5v"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.127319 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6bf74"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.128353 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-9nh6h"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.129411 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-kr4zh"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.130451 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-k2x4p"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.131435 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-857hw"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.132440 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-snlnq"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.133420 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-5t8h7"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.134622 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-7j8nw"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.134747 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-5t8h7" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.135342 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-7j8nw" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.135632 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-c989h"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.138109 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mrfcm"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.139054 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.139186 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lqvwq"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.140159 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-6qw48"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.141200 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29482230-pqppn"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.142206 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-87l5s"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.143201 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xwzh9"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.144219 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-ff9dd"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.145250 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-multus/multus-admission-controller-857f4d67dd-x6fwb"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.146258 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-ks6ps"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.147256 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-dfqb5"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.148334 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-llx65"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.149398 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-w8vpz"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.150646 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cfslv"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.151530 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-425qm"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.152493 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b3570207-5cb9-4481-a15a-d0bb9312a84b-etcd-client\") pod \"apiserver-7bbb656c7d-k2x4p\" (UID: \"b3570207-5cb9-4481-a15a-d0bb9312a84b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k2x4p" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.152571 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b3570207-5cb9-4481-a15a-d0bb9312a84b-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-k2x4p\" (UID: \"b3570207-5cb9-4481-a15a-d0bb9312a84b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k2x4p" Jan 20 
18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.152607 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hzp5t\" (UniqueName: \"kubernetes.io/projected/65b41ab6-6253-4ee5-87f2-50ed05610e03-kube-api-access-hzp5t\") pod \"cluster-image-registry-operator-dc59b4c8b-kw7s6\" (UID: \"65b41ab6-6253-4ee5-87f2-50ed05610e03\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kw7s6" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.152647 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-xpsls\" (UID: \"cf25ec9b-96c5-4129-958f-35acbc34a20d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xpsls" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.152669 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/65b41ab6-6253-4ee5-87f2-50ed05610e03-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-kw7s6\" (UID: \"65b41ab6-6253-4ee5-87f2-50ed05610e03\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kw7s6" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.152700 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ba3736bb-3d36-4a0c-91fa-85f410849312-oauth-serving-cert\") pod \"console-f9d7485db-9nh6h\" (UID: \"ba3736bb-3d36-4a0c-91fa-85f410849312\") " pod="openshift-console/console-f9d7485db-9nh6h" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.152791 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dmmk6\" (UniqueName: 
\"kubernetes.io/projected/c5d6a7d8-1840-4f2c-9fee-694a671f28cd-kube-api-access-dmmk6\") pod \"cluster-samples-operator-665b6dd947-l82hk\" (UID: \"c5d6a7d8-1840-4f2c-9fee-694a671f28cd\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-l82hk" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.152822 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/275484db-b3bc-4027-a1d7-a67ab3c71439-serving-cert\") pod \"openshift-config-operator-7777fb866f-fcrzv\" (UID: \"275484db-b3bc-4027-a1d7-a67ab3c71439\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fcrzv" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.152851 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b3570207-5cb9-4481-a15a-d0bb9312a84b-encryption-config\") pod \"apiserver-7bbb656c7d-k2x4p\" (UID: \"b3570207-5cb9-4481-a15a-d0bb9312a84b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k2x4p" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.152899 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9f7552ac-b3a0-4bfa-ab3e-34e46ed83cff-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-v6wpc\" (UID: \"9f7552ac-b3a0-4bfa-ab3e-34e46ed83cff\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-v6wpc" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.152950 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79162e32-ee8c-4fcc-8911-0f95d41cd110-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-n5dfl\" (UID: \"79162e32-ee8c-4fcc-8911-0f95d41cd110\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-n5dfl" Jan 20 
18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.152976 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cf25ec9b-96c5-4129-958f-35acbc34a20d-audit-dir\") pod \"oauth-openshift-558db77b4-xpsls\" (UID: \"cf25ec9b-96c5-4129-958f-35acbc34a20d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xpsls" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.153001 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b3570207-5cb9-4481-a15a-d0bb9312a84b-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-k2x4p\" (UID: \"b3570207-5cb9-4481-a15a-d0bb9312a84b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k2x4p" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.153038 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pq64n\" (UniqueName: \"kubernetes.io/projected/bbce412e-616a-465b-bb42-da842edb8110-kube-api-access-pq64n\") pod \"etcd-operator-b45778765-2wmrt\" (UID: \"bbce412e-616a-465b-bb42-da842edb8110\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2wmrt" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.153057 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6bae1b17-1679-4be9-9717-66c5a80ad425-config\") pod \"console-operator-58897d9998-xwc5v\" (UID: \"6bae1b17-1679-4be9-9717-66c5a80ad425\") " pod="openshift-console-operator/console-operator-58897d9998-xwc5v" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.153075 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6bae1b17-1679-4be9-9717-66c5a80ad425-trusted-ca\") pod \"console-operator-58897d9998-xwc5v\" (UID: \"6bae1b17-1679-4be9-9717-66c5a80ad425\") " 
pod="openshift-console-operator/console-operator-58897d9998-xwc5v" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.153095 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ba3736bb-3d36-4a0c-91fa-85f410849312-service-ca\") pod \"console-f9d7485db-9nh6h\" (UID: \"ba3736bb-3d36-4a0c-91fa-85f410849312\") " pod="openshift-console/console-f9d7485db-9nh6h" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.153113 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/63fd2de1-85c4-4f01-8524-7b93c777592d-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-6bf74\" (UID: \"63fd2de1-85c4-4f01-8524-7b93c777592d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6bf74" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.153136 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cf25ec9b-96c5-4129-958f-35acbc34a20d-audit-dir\") pod \"oauth-openshift-558db77b4-xpsls\" (UID: \"cf25ec9b-96c5-4129-958f-35acbc34a20d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xpsls" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.153139 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/47548d0b-9447-4862-b717-9427ae40c49a-metrics-tls\") pod \"dns-operator-744455d44c-pfxs9\" (UID: \"47548d0b-9447-4862-b717-9427ae40c49a\") " pod="openshift-dns-operator/dns-operator-744455d44c-pfxs9" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.153212 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ddbv\" (UniqueName: \"kubernetes.io/projected/ba3736bb-3d36-4a0c-91fa-85f410849312-kube-api-access-8ddbv\") pod \"console-f9d7485db-9nh6h\" (UID: 
\"ba3736bb-3d36-4a0c-91fa-85f410849312\") " pod="openshift-console/console-f9d7485db-9nh6h" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.153249 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b3570207-5cb9-4481-a15a-d0bb9312a84b-serving-cert\") pod \"apiserver-7bbb656c7d-k2x4p\" (UID: \"b3570207-5cb9-4481-a15a-d0bb9312a84b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k2x4p" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.153271 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pzjhr\" (UniqueName: \"kubernetes.io/projected/fec9cba4-b7cb-46ca-90a4-af0d5114fee8-kube-api-access-pzjhr\") pod \"downloads-7954f5f757-bkbfc\" (UID: \"fec9cba4-b7cb-46ca-90a4-af0d5114fee8\") " pod="openshift-console/downloads-7954f5f757-bkbfc" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.153292 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/65b41ab6-6253-4ee5-87f2-50ed05610e03-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-kw7s6\" (UID: \"65b41ab6-6253-4ee5-87f2-50ed05610e03\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kw7s6" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.153320 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-xpsls\" (UID: \"cf25ec9b-96c5-4129-958f-35acbc34a20d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xpsls" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.153343 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/ba3736bb-3d36-4a0c-91fa-85f410849312-console-config\") pod \"console-f9d7485db-9nh6h\" (UID: \"ba3736bb-3d36-4a0c-91fa-85f410849312\") " pod="openshift-console/console-f9d7485db-9nh6h" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.153371 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/c5d6a7d8-1840-4f2c-9fee-694a671f28cd-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-l82hk\" (UID: \"c5d6a7d8-1840-4f2c-9fee-694a671f28cd\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-l82hk" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.153392 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/63fd2de1-85c4-4f01-8524-7b93c777592d-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-6bf74\" (UID: \"63fd2de1-85c4-4f01-8524-7b93c777592d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6bf74" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.153411 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/bbce412e-616a-465b-bb42-da842edb8110-etcd-client\") pod \"etcd-operator-b45778765-2wmrt\" (UID: \"bbce412e-616a-465b-bb42-da842edb8110\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2wmrt" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.153430 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/275484db-b3bc-4027-a1d7-a67ab3c71439-available-featuregates\") pod \"openshift-config-operator-7777fb866f-fcrzv\" (UID: \"275484db-b3bc-4027-a1d7-a67ab3c71439\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fcrzv" Jan 20 18:32:26 crc 
kubenswrapper[4773]: I0120 18:32:26.153449 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/79162e32-ee8c-4fcc-8911-0f95d41cd110-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-n5dfl\" (UID: \"79162e32-ee8c-4fcc-8911-0f95d41cd110\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-n5dfl" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.153471 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-xpsls\" (UID: \"cf25ec9b-96c5-4129-958f-35acbc34a20d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xpsls" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.153488 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b3570207-5cb9-4481-a15a-d0bb9312a84b-audit-policies\") pod \"apiserver-7bbb656c7d-k2x4p\" (UID: \"b3570207-5cb9-4481-a15a-d0bb9312a84b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k2x4p" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.153513 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rqrr\" (UniqueName: \"kubernetes.io/projected/275484db-b3bc-4027-a1d7-a67ab3c71439-kube-api-access-4rqrr\") pod \"openshift-config-operator-7777fb866f-fcrzv\" (UID: \"275484db-b3bc-4027-a1d7-a67ab3c71439\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fcrzv" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.153532 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbce412e-616a-465b-bb42-da842edb8110-config\") pod 
\"etcd-operator-b45778765-2wmrt\" (UID: \"bbce412e-616a-465b-bb42-da842edb8110\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2wmrt" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.153558 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7pps\" (UniqueName: \"kubernetes.io/projected/9d63bfb8-8ecc-43f3-8931-cc09c815c580-kube-api-access-k7pps\") pod \"openshift-apiserver-operator-796bbdcf4f-kpq6z\" (UID: \"9d63bfb8-8ecc-43f3-8931-cc09c815c580\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kpq6z" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.153584 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-xpsls\" (UID: \"cf25ec9b-96c5-4129-958f-35acbc34a20d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xpsls" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.153602 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-xpsls\" (UID: \"cf25ec9b-96c5-4129-958f-35acbc34a20d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xpsls" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.153623 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-xpsls\" (UID: \"cf25ec9b-96c5-4129-958f-35acbc34a20d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xpsls" Jan 20 18:32:26 crc kubenswrapper[4773]: 
I0120 18:32:26.153640 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b3570207-5cb9-4481-a15a-d0bb9312a84b-audit-dir\") pod \"apiserver-7bbb656c7d-k2x4p\" (UID: \"b3570207-5cb9-4481-a15a-d0bb9312a84b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k2x4p" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.153662 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/bbce412e-616a-465b-bb42-da842edb8110-etcd-ca\") pod \"etcd-operator-b45778765-2wmrt\" (UID: \"bbce412e-616a-465b-bb42-da842edb8110\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2wmrt" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.153688 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ba3736bb-3d36-4a0c-91fa-85f410849312-console-oauth-config\") pod \"console-f9d7485db-9nh6h\" (UID: \"ba3736bb-3d36-4a0c-91fa-85f410849312\") " pod="openshift-console/console-f9d7485db-9nh6h" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.153950 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b3570207-5cb9-4481-a15a-d0bb9312a84b-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-k2x4p\" (UID: \"b3570207-5cb9-4481-a15a-d0bb9312a84b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k2x4p" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.153703 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ba3736bb-3d36-4a0c-91fa-85f410849312-trusted-ca-bundle\") pod \"console-f9d7485db-9nh6h\" (UID: \"ba3736bb-3d36-4a0c-91fa-85f410849312\") " pod="openshift-console/console-f9d7485db-9nh6h" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.154210 4773 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d63bfb8-8ecc-43f3-8931-cc09c815c580-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-kpq6z\" (UID: \"9d63bfb8-8ecc-43f3-8931-cc09c815c580\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kpq6z" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.154238 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-xpsls\" (UID: \"cf25ec9b-96c5-4129-958f-35acbc34a20d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xpsls" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.154260 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xrpqv\" (UniqueName: \"kubernetes.io/projected/b3570207-5cb9-4481-a15a-d0bb9312a84b-kube-api-access-xrpqv\") pod \"apiserver-7bbb656c7d-k2x4p\" (UID: \"b3570207-5cb9-4481-a15a-d0bb9312a84b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k2x4p" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.154279 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/bbce412e-616a-465b-bb42-da842edb8110-etcd-service-ca\") pod \"etcd-operator-b45778765-2wmrt\" (UID: \"bbce412e-616a-465b-bb42-da842edb8110\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2wmrt" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.154308 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/65b41ab6-6253-4ee5-87f2-50ed05610e03-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-kw7s6\" (UID: \"65b41ab6-6253-4ee5-87f2-50ed05610e03\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kw7s6" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.154335 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/cf25ec9b-96c5-4129-958f-35acbc34a20d-audit-policies\") pod \"oauth-openshift-558db77b4-xpsls\" (UID: \"cf25ec9b-96c5-4129-958f-35acbc34a20d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xpsls" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.154358 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9f7552ac-b3a0-4bfa-ab3e-34e46ed83cff-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-v6wpc\" (UID: \"9f7552ac-b3a0-4bfa-ab3e-34e46ed83cff\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-v6wpc" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.154376 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d63bfb8-8ecc-43f3-8931-cc09c815c580-config\") pod \"openshift-apiserver-operator-796bbdcf4f-kpq6z\" (UID: \"9d63bfb8-8ecc-43f3-8931-cc09c815c580\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kpq6z" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.154393 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ba3736bb-3d36-4a0c-91fa-85f410849312-console-serving-cert\") pod \"console-f9d7485db-9nh6h\" (UID: \"ba3736bb-3d36-4a0c-91fa-85f410849312\") " pod="openshift-console/console-f9d7485db-9nh6h" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.154448 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vqxj\" (UniqueName: 
\"kubernetes.io/projected/79162e32-ee8c-4fcc-8911-0f95d41cd110-kube-api-access-9vqxj\") pod \"openshift-controller-manager-operator-756b6f6bc6-n5dfl\" (UID: \"79162e32-ee8c-4fcc-8911-0f95d41cd110\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-n5dfl" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.154469 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4p5ft\" (UniqueName: \"kubernetes.io/projected/cf25ec9b-96c5-4129-958f-35acbc34a20d-kube-api-access-4p5ft\") pod \"oauth-openshift-558db77b4-xpsls\" (UID: \"cf25ec9b-96c5-4129-958f-35acbc34a20d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xpsls" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.154492 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6bae1b17-1679-4be9-9717-66c5a80ad425-serving-cert\") pod \"console-operator-58897d9998-xwc5v\" (UID: \"6bae1b17-1679-4be9-9717-66c5a80ad425\") " pod="openshift-console-operator/console-operator-58897d9998-xwc5v" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.154547 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ntrkt\" (UniqueName: \"kubernetes.io/projected/e98bf97b-784a-4a99-8eff-20e6fc687876-kube-api-access-ntrkt\") pod \"migrator-59844c95c7-c989h\" (UID: \"e98bf97b-784a-4a99-8eff-20e6fc687876\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-c989h" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.154554 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b3570207-5cb9-4481-a15a-d0bb9312a84b-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-k2x4p\" (UID: \"b3570207-5cb9-4481-a15a-d0bb9312a84b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k2x4p" Jan 20 18:32:26 crc kubenswrapper[4773]: 
I0120 18:32:26.154586 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-xpsls\" (UID: \"cf25ec9b-96c5-4129-958f-35acbc34a20d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xpsls" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.154616 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-xpsls\" (UID: \"cf25ec9b-96c5-4129-958f-35acbc34a20d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xpsls" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.154643 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bbce412e-616a-465b-bb42-da842edb8110-serving-cert\") pod \"etcd-operator-b45778765-2wmrt\" (UID: \"bbce412e-616a-465b-bb42-da842edb8110\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2wmrt" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.154669 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dsp4s\" (UniqueName: \"kubernetes.io/projected/6bae1b17-1679-4be9-9717-66c5a80ad425-kube-api-access-dsp4s\") pod \"console-operator-58897d9998-xwc5v\" (UID: \"6bae1b17-1679-4be9-9717-66c5a80ad425\") " pod="openshift-console-operator/console-operator-58897d9998-xwc5v" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.154703 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-xpsls\" (UID: 
\"cf25ec9b-96c5-4129-958f-35acbc34a20d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xpsls" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.154728 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-xpsls\" (UID: \"cf25ec9b-96c5-4129-958f-35acbc34a20d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xpsls" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.154751 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f7552ac-b3a0-4bfa-ab3e-34e46ed83cff-config\") pod \"kube-apiserver-operator-766d6c64bb-v6wpc\" (UID: \"9f7552ac-b3a0-4bfa-ab3e-34e46ed83cff\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-v6wpc" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.154771 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63fd2de1-85c4-4f01-8524-7b93c777592d-config\") pod \"kube-controller-manager-operator-78b949d7b-6bf74\" (UID: \"63fd2de1-85c4-4f01-8524-7b93c777592d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6bf74" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.154807 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dzn2\" (UniqueName: \"kubernetes.io/projected/47548d0b-9447-4862-b717-9427ae40c49a-kube-api-access-5dzn2\") pod \"dns-operator-744455d44c-pfxs9\" (UID: \"47548d0b-9447-4862-b717-9427ae40c49a\") " pod="openshift-dns-operator/dns-operator-744455d44c-pfxs9" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.156712 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ba3736bb-3d36-4a0c-91fa-85f410849312-oauth-serving-cert\") pod \"console-f9d7485db-9nh6h\" (UID: \"ba3736bb-3d36-4a0c-91fa-85f410849312\") " pod="openshift-console/console-f9d7485db-9nh6h" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.156684 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ba3736bb-3d36-4a0c-91fa-85f410849312-service-ca\") pod \"console-f9d7485db-9nh6h\" (UID: \"ba3736bb-3d36-4a0c-91fa-85f410849312\") " pod="openshift-console/console-f9d7485db-9nh6h" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.156887 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d63bfb8-8ecc-43f3-8931-cc09c815c580-config\") pod \"openshift-apiserver-operator-796bbdcf4f-kpq6z\" (UID: \"9d63bfb8-8ecc-43f3-8931-cc09c815c580\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kpq6z" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.157057 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-xpsls\" (UID: \"cf25ec9b-96c5-4129-958f-35acbc34a20d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xpsls" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.157373 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/6bae1b17-1679-4be9-9717-66c5a80ad425-trusted-ca\") pod \"console-operator-58897d9998-xwc5v\" (UID: \"6bae1b17-1679-4be9-9717-66c5a80ad425\") " pod="openshift-console-operator/console-operator-58897d9998-xwc5v" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.157862 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/6bae1b17-1679-4be9-9717-66c5a80ad425-config\") pod \"console-operator-58897d9998-xwc5v\" (UID: \"6bae1b17-1679-4be9-9717-66c5a80ad425\") " pod="openshift-console-operator/console-operator-58897d9998-xwc5v" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.158443 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-xpsls\" (UID: \"cf25ec9b-96c5-4129-958f-35acbc34a20d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xpsls" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.159305 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f7552ac-b3a0-4bfa-ab3e-34e46ed83cff-config\") pod \"kube-apiserver-operator-766d6c64bb-v6wpc\" (UID: \"9f7552ac-b3a0-4bfa-ab3e-34e46ed83cff\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-v6wpc" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.159579 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/65b41ab6-6253-4ee5-87f2-50ed05610e03-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-kw7s6\" (UID: \"65b41ab6-6253-4ee5-87f2-50ed05610e03\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kw7s6" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.160469 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.160650 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/79162e32-ee8c-4fcc-8911-0f95d41cd110-config\") pod 
\"openshift-controller-manager-operator-756b6f6bc6-n5dfl\" (UID: \"79162e32-ee8c-4fcc-8911-0f95d41cd110\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-n5dfl" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.161023 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-xpsls\" (UID: \"cf25ec9b-96c5-4129-958f-35acbc34a20d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xpsls" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.161109 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b3570207-5cb9-4481-a15a-d0bb9312a84b-serving-cert\") pod \"apiserver-7bbb656c7d-k2x4p\" (UID: \"b3570207-5cb9-4481-a15a-d0bb9312a84b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k2x4p" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.161525 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-xpsls\" (UID: \"cf25ec9b-96c5-4129-958f-35acbc34a20d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xpsls" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.161759 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ba3736bb-3d36-4a0c-91fa-85f410849312-console-config\") pod \"console-f9d7485db-9nh6h\" (UID: \"ba3736bb-3d36-4a0c-91fa-85f410849312\") " pod="openshift-console/console-f9d7485db-9nh6h" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.162178 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/275484db-b3bc-4027-a1d7-a67ab3c71439-available-featuregates\") pod \"openshift-config-operator-7777fb866f-fcrzv\" (UID: \"275484db-b3bc-4027-a1d7-a67ab3c71439\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fcrzv" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.162761 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b3570207-5cb9-4481-a15a-d0bb9312a84b-audit-policies\") pod \"apiserver-7bbb656c7d-k2x4p\" (UID: \"b3570207-5cb9-4481-a15a-d0bb9312a84b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k2x4p" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.163142 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b3570207-5cb9-4481-a15a-d0bb9312a84b-audit-dir\") pod \"apiserver-7bbb656c7d-k2x4p\" (UID: \"b3570207-5cb9-4481-a15a-d0bb9312a84b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k2x4p" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.163636 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/c5d6a7d8-1840-4f2c-9fee-694a671f28cd-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-l82hk\" (UID: \"c5d6a7d8-1840-4f2c-9fee-694a671f28cd\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-l82hk" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.164092 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ba3736bb-3d36-4a0c-91fa-85f410849312-console-serving-cert\") pod \"console-f9d7485db-9nh6h\" (UID: \"ba3736bb-3d36-4a0c-91fa-85f410849312\") " pod="openshift-console/console-f9d7485db-9nh6h" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.164489 4773 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-xpsls\" (UID: \"cf25ec9b-96c5-4129-958f-35acbc34a20d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xpsls" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.164688 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d63bfb8-8ecc-43f3-8931-cc09c815c580-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-kpq6z\" (UID: \"9d63bfb8-8ecc-43f3-8931-cc09c815c580\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kpq6z" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.164982 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/79162e32-ee8c-4fcc-8911-0f95d41cd110-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-n5dfl\" (UID: \"79162e32-ee8c-4fcc-8911-0f95d41cd110\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-n5dfl" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.165448 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b3570207-5cb9-4481-a15a-d0bb9312a84b-etcd-client\") pod \"apiserver-7bbb656c7d-k2x4p\" (UID: \"b3570207-5cb9-4481-a15a-d0bb9312a84b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k2x4p" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.167140 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-xpsls\" (UID: \"cf25ec9b-96c5-4129-958f-35acbc34a20d\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-xpsls" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.168482 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-xpsls\" (UID: \"cf25ec9b-96c5-4129-958f-35acbc34a20d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xpsls" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.168540 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/cf25ec9b-96c5-4129-958f-35acbc34a20d-audit-policies\") pod \"oauth-openshift-558db77b4-xpsls\" (UID: \"cf25ec9b-96c5-4129-958f-35acbc34a20d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xpsls" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.169148 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-7j8nw"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.169545 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ba3736bb-3d36-4a0c-91fa-85f410849312-trusted-ca-bundle\") pod \"console-f9d7485db-9nh6h\" (UID: \"ba3736bb-3d36-4a0c-91fa-85f410849312\") " pod="openshift-console/console-f9d7485db-9nh6h" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.173113 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-5t8h7"] Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.174516 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/65b41ab6-6253-4ee5-87f2-50ed05610e03-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-kw7s6\" (UID: \"65b41ab6-6253-4ee5-87f2-50ed05610e03\") " 
pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kw7s6" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.175707 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-xpsls\" (UID: \"cf25ec9b-96c5-4129-958f-35acbc34a20d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xpsls" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.177773 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-xpsls\" (UID: \"cf25ec9b-96c5-4129-958f-35acbc34a20d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xpsls" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.178044 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ba3736bb-3d36-4a0c-91fa-85f410849312-console-oauth-config\") pod \"console-f9d7485db-9nh6h\" (UID: \"ba3736bb-3d36-4a0c-91fa-85f410849312\") " pod="openshift-console/console-f9d7485db-9nh6h" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.178078 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/275484db-b3bc-4027-a1d7-a67ab3c71439-serving-cert\") pod \"openshift-config-operator-7777fb866f-fcrzv\" (UID: \"275484db-b3bc-4027-a1d7-a67ab3c71439\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fcrzv" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.178747 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6bae1b17-1679-4be9-9717-66c5a80ad425-serving-cert\") pod 
\"console-operator-58897d9998-xwc5v\" (UID: \"6bae1b17-1679-4be9-9717-66c5a80ad425\") " pod="openshift-console-operator/console-operator-58897d9998-xwc5v" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.179253 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.179373 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-xpsls\" (UID: \"cf25ec9b-96c5-4129-958f-35acbc34a20d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xpsls" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.186290 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-xpsls\" (UID: \"cf25ec9b-96c5-4129-958f-35acbc34a20d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xpsls" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.190597 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b3570207-5cb9-4481-a15a-d0bb9312a84b-encryption-config\") pod \"apiserver-7bbb656c7d-k2x4p\" (UID: \"b3570207-5cb9-4481-a15a-d0bb9312a84b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k2x4p" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.190710 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9f7552ac-b3a0-4bfa-ab3e-34e46ed83cff-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-v6wpc\" (UID: \"9f7552ac-b3a0-4bfa-ab3e-34e46ed83cff\") " 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-v6wpc" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.214700 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.222326 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.240654 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.249781 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bbce412e-616a-465b-bb42-da842edb8110-serving-cert\") pod \"etcd-operator-b45778765-2wmrt\" (UID: \"bbce412e-616a-465b-bb42-da842edb8110\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2wmrt" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.257984 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.263695 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/bbce412e-616a-465b-bb42-da842edb8110-etcd-client\") pod \"etcd-operator-b45778765-2wmrt\" (UID: \"bbce412e-616a-465b-bb42-da842edb8110\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2wmrt" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.278090 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.298793 4773 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-etcd-operator"/"etcd-operator-config" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.303265 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbce412e-616a-465b-bb42-da842edb8110-config\") pod \"etcd-operator-b45778765-2wmrt\" (UID: \"bbce412e-616a-465b-bb42-da842edb8110\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2wmrt" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.319154 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.328697 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/63fd2de1-85c4-4f01-8524-7b93c777592d-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-6bf74\" (UID: \"63fd2de1-85c4-4f01-8524-7b93c777592d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6bf74" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.338918 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.344859 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/bbce412e-616a-465b-bb42-da842edb8110-etcd-ca\") pod \"etcd-operator-b45778765-2wmrt\" (UID: \"bbce412e-616a-465b-bb42-da842edb8110\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2wmrt" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.358648 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.365001 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/bbce412e-616a-465b-bb42-da842edb8110-etcd-service-ca\") pod \"etcd-operator-b45778765-2wmrt\" (UID: \"bbce412e-616a-465b-bb42-da842edb8110\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2wmrt" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.378690 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.388525 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63fd2de1-85c4-4f01-8524-7b93c777592d-config\") pod \"kube-controller-manager-operator-78b949d7b-6bf74\" (UID: \"63fd2de1-85c4-4f01-8524-7b93c777592d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6bf74" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.398728 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.419155 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.438274 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.458633 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.479116 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.486371 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/47548d0b-9447-4862-b717-9427ae40c49a-metrics-tls\") pod \"dns-operator-744455d44c-pfxs9\" (UID: \"47548d0b-9447-4862-b717-9427ae40c49a\") " pod="openshift-dns-operator/dns-operator-744455d44c-pfxs9" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.498528 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.519163 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.538950 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.558668 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.579448 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.599228 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.618825 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.678015 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.698271 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.718864 4773 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.753669 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbtnm\" (UniqueName: \"kubernetes.io/projected/a99225b3-64c7-4b39-807c-c97faa919977-kube-api-access-mbtnm\") pod \"machine-api-operator-5694c8668f-bhrll\" (UID: \"a99225b3-64c7-4b39-807c-c97faa919977\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-bhrll" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.771007 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bts76\" (UniqueName: \"kubernetes.io/projected/330a8450-c400-425d-9a46-e868a02fca27-kube-api-access-bts76\") pod \"machine-approver-56656f9798-t27wz\" (UID: \"330a8450-c400-425d-9a46-e868a02fca27\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-t27wz" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.779285 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.798511 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.833888 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2mg8\" (UniqueName: \"kubernetes.io/projected/419120bb-3f1b-4f21-adf5-ac057bd5dce6-kube-api-access-l2mg8\") pod \"controller-manager-879f6c89f-bjtnp\" (UID: \"419120bb-3f1b-4f21-adf5-ac057bd5dce6\") " pod="openshift-controller-manager/controller-manager-879f6c89f-bjtnp" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.839013 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 
18:32:26.848512 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-bhrll" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.860154 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-bjtnp" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.880690 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.886873 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jz48f\" (UniqueName: \"kubernetes.io/projected/1d49ef4e-91fb-4b98-89d9-65358c718967-kube-api-access-jz48f\") pod \"route-controller-manager-6576b87f9c-gnvxz\" (UID: \"1d49ef4e-91fb-4b98-89d9-65358c718967\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gnvxz" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.899820 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.918075 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gnvxz" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.944901 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82jpq\" (UniqueName: \"kubernetes.io/projected/49deabd4-ebbe-4c07-bb79-105982db000a-kube-api-access-82jpq\") pod \"apiserver-76f77b778f-7zslm\" (UID: \"49deabd4-ebbe-4c07-bb79-105982db000a\") " pod="openshift-apiserver/apiserver-76f77b778f-7zslm" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.954396 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sl4kb\" (UniqueName: \"kubernetes.io/projected/411d251b-6daa-4c45-9aeb-aa38def60a90-kube-api-access-sl4kb\") pod \"authentication-operator-69f744f599-4jhgc\" (UID: \"411d251b-6daa-4c45-9aeb-aa38def60a90\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4jhgc" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.956300 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-t27wz" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.961276 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.979588 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Jan 20 18:32:26 crc kubenswrapper[4773]: I0120 18:32:26.999360 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.025459 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.038453 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-bhrll"] Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.039275 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Jan 20 18:32:27 crc kubenswrapper[4773]: W0120 18:32:27.049676 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda99225b3_64c7_4b39_807c_c97faa919977.slice/crio-f3edff698e8920d644223729add474573aeda03b3fe9957c13f4beef2c8951cf WatchSource:0}: Error finding container f3edff698e8920d644223729add474573aeda03b3fe9957c13f4beef2c8951cf: Status 404 returned error can't find the container with id f3edff698e8920d644223729add474573aeda03b3fe9957c13f4beef2c8951cf Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.059480 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Jan 20 18:32:27 crc kubenswrapper[4773]: 
I0120 18:32:27.078344 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-bjtnp"] Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.078438 4773 request.go:700] Waited for 1.008339815s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-api/secrets?fieldSelector=metadata.name%3Dcontrol-plane-machine-set-operator-tls&limit=500&resourceVersion=0 Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.080466 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.099018 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.120843 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.133102 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-gnvxz"] Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.138376 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.140670 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-7zslm" Jan 20 18:32:27 crc kubenswrapper[4773]: W0120 18:32:27.149454 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d49ef4e_91fb_4b98_89d9_65358c718967.slice/crio-3d79ab25a80551aca70e1318ac4565dde71b20dd660b9be7dfd85ec349787632 WatchSource:0}: Error finding container 3d79ab25a80551aca70e1318ac4565dde71b20dd660b9be7dfd85ec349787632: Status 404 returned error can't find the container with id 3d79ab25a80551aca70e1318ac4565dde71b20dd660b9be7dfd85ec349787632 Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.158715 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.178812 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.199395 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.219753 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.227752 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-4jhgc" Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.239229 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.259815 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.279228 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.298222 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.302222 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-t27wz" event={"ID":"330a8450-c400-425d-9a46-e868a02fca27","Type":"ContainerStarted","Data":"b0881d57fbbc77ac5ee40bd606c8fb06314b176343932d84de3ec1f7a4c35da9"} Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.302856 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-bhrll" event={"ID":"a99225b3-64c7-4b39-807c-c97faa919977","Type":"ContainerStarted","Data":"f3edff698e8920d644223729add474573aeda03b3fe9957c13f4beef2c8951cf"} Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.303583 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gnvxz" event={"ID":"1d49ef4e-91fb-4b98-89d9-65358c718967","Type":"ContainerStarted","Data":"3d79ab25a80551aca70e1318ac4565dde71b20dd660b9be7dfd85ec349787632"} Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.304317 4773 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-bjtnp" event={"ID":"419120bb-3f1b-4f21-adf5-ac057bd5dce6","Type":"ContainerStarted","Data":"0be8a6611d182b19e3c280dba8bf32b32d7fa146a5cc6f6279d1419ba167e1bf"} Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.314061 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-7zslm"] Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.318582 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Jan 20 18:32:27 crc kubenswrapper[4773]: W0120 18:32:27.328748 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod49deabd4_ebbe_4c07_bb79_105982db000a.slice/crio-96b4a9067e19b88b1b4f6c350554a1d910356eacf6ea77476af029d104e51a28 WatchSource:0}: Error finding container 96b4a9067e19b88b1b4f6c350554a1d910356eacf6ea77476af029d104e51a28: Status 404 returned error can't find the container with id 96b4a9067e19b88b1b4f6c350554a1d910356eacf6ea77476af029d104e51a28 Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.338511 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.357783 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.378620 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.394367 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-4jhgc"] Jan 20 18:32:27 crc 
kubenswrapper[4773]: I0120 18:32:27.399590 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.418385 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.438549 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.459641 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.479710 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.498757 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.518714 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.539296 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.558350 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.578316 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 
18:32:27.599280 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.619005 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.638852 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.658875 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.678919 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.710136 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.719130 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.739853 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.758961 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.779826 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.799549 4773 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-service-ca"/"signing-cabundle" Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.819111 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.842079 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Jan 20 18:32:27 crc kubenswrapper[4773]: W0120 18:32:27.846616 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod411d251b_6daa_4c45_9aeb_aa38def60a90.slice/crio-4c54c299e0c6f6b716ff8374168304277c59e4843dd32e5109db1548d60a299e WatchSource:0}: Error finding container 4c54c299e0c6f6b716ff8374168304277c59e4843dd32e5109db1548d60a299e: Status 404 returned error can't find the container with id 4c54c299e0c6f6b716ff8374168304277c59e4843dd32e5109db1548d60a299e Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.877593 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.878374 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.898384 4773 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.919372 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.939220 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.958626 4773 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ingress-canary"/"openshift-service-ca.crt" Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.979724 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Jan 20 18:32:27 crc kubenswrapper[4773]: I0120 18:32:27.999459 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Jan 20 18:32:28 crc kubenswrapper[4773]: I0120 18:32:28.019644 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Jan 20 18:32:28 crc kubenswrapper[4773]: I0120 18:32:28.038741 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Jan 20 18:32:28 crc kubenswrapper[4773]: I0120 18:32:28.059029 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Jan 20 18:32:28 crc kubenswrapper[4773]: I0120 18:32:28.097180 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzp5t\" (UniqueName: \"kubernetes.io/projected/65b41ab6-6253-4ee5-87f2-50ed05610e03-kube-api-access-hzp5t\") pod \"cluster-image-registry-operator-dc59b4c8b-kw7s6\" (UID: \"65b41ab6-6253-4ee5-87f2-50ed05610e03\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kw7s6" Jan 20 18:32:28 crc kubenswrapper[4773]: I0120 18:32:28.097321 4773 request.go:700] Waited for 1.943418555s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-cluster-samples-operator/serviceaccounts/cluster-samples-operator/token Jan 20 18:32:28 crc kubenswrapper[4773]: I0120 18:32:28.117760 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmmk6\" (UniqueName: \"kubernetes.io/projected/c5d6a7d8-1840-4f2c-9fee-694a671f28cd-kube-api-access-dmmk6\") pod \"cluster-samples-operator-665b6dd947-l82hk\" 
(UID: \"c5d6a7d8-1840-4f2c-9fee-694a671f28cd\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-l82hk" Jan 20 18:32:28 crc kubenswrapper[4773]: I0120 18:32:28.133351 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dzn2\" (UniqueName: \"kubernetes.io/projected/47548d0b-9447-4862-b717-9427ae40c49a-kube-api-access-5dzn2\") pod \"dns-operator-744455d44c-pfxs9\" (UID: \"47548d0b-9447-4862-b717-9427ae40c49a\") " pod="openshift-dns-operator/dns-operator-744455d44c-pfxs9" Jan 20 18:32:28 crc kubenswrapper[4773]: I0120 18:32:28.158103 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dsp4s\" (UniqueName: \"kubernetes.io/projected/6bae1b17-1679-4be9-9717-66c5a80ad425-kube-api-access-dsp4s\") pod \"console-operator-58897d9998-xwc5v\" (UID: \"6bae1b17-1679-4be9-9717-66c5a80ad425\") " pod="openshift-console-operator/console-operator-58897d9998-xwc5v" Jan 20 18:32:28 crc kubenswrapper[4773]: I0120 18:32:28.170803 4773 patch_prober.go:28] interesting pod/machine-config-daemon-sq4x7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 18:32:28 crc kubenswrapper[4773]: I0120 18:32:28.170868 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 18:32:28 crc kubenswrapper[4773]: I0120 18:32:28.176048 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pq64n\" (UniqueName: \"kubernetes.io/projected/bbce412e-616a-465b-bb42-da842edb8110-kube-api-access-pq64n\") pod 
\"etcd-operator-b45778765-2wmrt\" (UID: \"bbce412e-616a-465b-bb42-da842edb8110\") " pod="openshift-etcd-operator/etcd-operator-b45778765-2wmrt" Jan 20 18:32:28 crc kubenswrapper[4773]: I0120 18:32:28.192243 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4p5ft\" (UniqueName: \"kubernetes.io/projected/cf25ec9b-96c5-4129-958f-35acbc34a20d-kube-api-access-4p5ft\") pod \"oauth-openshift-558db77b4-xpsls\" (UID: \"cf25ec9b-96c5-4129-958f-35acbc34a20d\") " pod="openshift-authentication/oauth-openshift-558db77b4-xpsls" Jan 20 18:32:28 crc kubenswrapper[4773]: I0120 18:32:28.211088 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vqxj\" (UniqueName: \"kubernetes.io/projected/79162e32-ee8c-4fcc-8911-0f95d41cd110-kube-api-access-9vqxj\") pod \"openshift-controller-manager-operator-756b6f6bc6-n5dfl\" (UID: \"79162e32-ee8c-4fcc-8911-0f95d41cd110\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-n5dfl" Jan 20 18:32:28 crc kubenswrapper[4773]: I0120 18:32:28.222369 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-l82hk" Jan 20 18:32:28 crc kubenswrapper[4773]: I0120 18:32:28.230058 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntrkt\" (UniqueName: \"kubernetes.io/projected/e98bf97b-784a-4a99-8eff-20e6fc687876-kube-api-access-ntrkt\") pod \"migrator-59844c95c7-c989h\" (UID: \"e98bf97b-784a-4a99-8eff-20e6fc687876\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-c989h" Jan 20 18:32:28 crc kubenswrapper[4773]: I0120 18:32:28.238051 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-xwc5v" Jan 20 18:32:28 crc kubenswrapper[4773]: I0120 18:32:28.245398 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-xpsls" Jan 20 18:32:28 crc kubenswrapper[4773]: I0120 18:32:28.253791 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzjhr\" (UniqueName: \"kubernetes.io/projected/fec9cba4-b7cb-46ca-90a4-af0d5114fee8-kube-api-access-pzjhr\") pod \"downloads-7954f5f757-bkbfc\" (UID: \"fec9cba4-b7cb-46ca-90a4-af0d5114fee8\") " pod="openshift-console/downloads-7954f5f757-bkbfc" Jan 20 18:32:28 crc kubenswrapper[4773]: I0120 18:32:28.265978 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-bkbfc" Jan 20 18:32:28 crc kubenswrapper[4773]: I0120 18:32:28.275678 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/65b41ab6-6253-4ee5-87f2-50ed05610e03-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-kw7s6\" (UID: \"65b41ab6-6253-4ee5-87f2-50ed05610e03\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kw7s6" Jan 20 18:32:28 crc kubenswrapper[4773]: I0120 18:32:28.295728 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9f7552ac-b3a0-4bfa-ab3e-34e46ed83cff-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-v6wpc\" (UID: \"9f7552ac-b3a0-4bfa-ab3e-34e46ed83cff\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-v6wpc" Jan 20 18:32:28 crc kubenswrapper[4773]: I0120 18:32:28.317506 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-t27wz" event={"ID":"330a8450-c400-425d-9a46-e868a02fca27","Type":"ContainerStarted","Data":"d980837bf111ca1f394543e6e39fde1c816899c4e72eae4a1d98c93ac335fb11"} Jan 20 18:32:28 crc kubenswrapper[4773]: I0120 18:32:28.319400 4773 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-bhrll" event={"ID":"a99225b3-64c7-4b39-807c-c97faa919977","Type":"ContainerStarted","Data":"79654ac68476f4932f494a5ac7dc8a6128258838b2997ca43f88dc63b1ac8fc0"} Jan 20 18:32:28 crc kubenswrapper[4773]: I0120 18:32:28.321090 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-7zslm" event={"ID":"49deabd4-ebbe-4c07-bb79-105982db000a","Type":"ContainerStarted","Data":"96b4a9067e19b88b1b4f6c350554a1d910356eacf6ea77476af029d104e51a28"} Jan 20 18:32:28 crc kubenswrapper[4773]: I0120 18:32:28.322974 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gnvxz" event={"ID":"1d49ef4e-91fb-4b98-89d9-65358c718967","Type":"ContainerStarted","Data":"bdc540ba6443004408ae797654a9bce71e44b1154765597934dc3ae831e5cade"} Jan 20 18:32:28 crc kubenswrapper[4773]: I0120 18:32:28.323202 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gnvxz" Jan 20 18:32:28 crc kubenswrapper[4773]: I0120 18:32:28.328106 4773 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-gnvxz container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Jan 20 18:32:28 crc kubenswrapper[4773]: I0120 18:32:28.328163 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gnvxz" podUID="1d49ef4e-91fb-4b98-89d9-65358c718967" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" Jan 20 18:32:28 crc kubenswrapper[4773]: I0120 18:32:28.329490 4773 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ddbv\" (UniqueName: \"kubernetes.io/projected/ba3736bb-3d36-4a0c-91fa-85f410849312-kube-api-access-8ddbv\") pod \"console-f9d7485db-9nh6h\" (UID: \"ba3736bb-3d36-4a0c-91fa-85f410849312\") " pod="openshift-console/console-f9d7485db-9nh6h" Jan 20 18:32:28 crc kubenswrapper[4773]: I0120 18:32:28.331543 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-bjtnp" event={"ID":"419120bb-3f1b-4f21-adf5-ac057bd5dce6","Type":"ContainerStarted","Data":"0a049498de4a817a942a2b87af4cddb85094398aca02919519cc51c9033fb122"} Jan 20 18:32:28 crc kubenswrapper[4773]: I0120 18:32:28.332707 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-bjtnp" Jan 20 18:32:28 crc kubenswrapper[4773]: I0120 18:32:28.333564 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-4jhgc" event={"ID":"411d251b-6daa-4c45-9aeb-aa38def60a90","Type":"ContainerStarted","Data":"4c54c299e0c6f6b716ff8374168304277c59e4843dd32e5109db1548d60a299e"} Jan 20 18:32:28 crc kubenswrapper[4773]: I0120 18:32:28.334829 4773 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-bjtnp container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Jan 20 18:32:28 crc kubenswrapper[4773]: I0120 18:32:28.334868 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-bjtnp" podUID="419120bb-3f1b-4f21-adf5-ac057bd5dce6" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" Jan 20 18:32:28 crc 
kubenswrapper[4773]: I0120 18:32:28.335115 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-v6wpc" Jan 20 18:32:28 crc kubenswrapper[4773]: I0120 18:32:28.347696 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/63fd2de1-85c4-4f01-8524-7b93c777592d-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-6bf74\" (UID: \"63fd2de1-85c4-4f01-8524-7b93c777592d\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6bf74" Jan 20 18:32:28 crc kubenswrapper[4773]: I0120 18:32:28.348132 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-n5dfl" Jan 20 18:32:28 crc kubenswrapper[4773]: I0120 18:32:28.353908 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rqrr\" (UniqueName: \"kubernetes.io/projected/275484db-b3bc-4027-a1d7-a67ab3c71439-kube-api-access-4rqrr\") pod \"openshift-config-operator-7777fb866f-fcrzv\" (UID: \"275484db-b3bc-4027-a1d7-a67ab3c71439\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fcrzv" Jan 20 18:32:28 crc kubenswrapper[4773]: I0120 18:32:28.365258 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-2wmrt" Jan 20 18:32:28 crc kubenswrapper[4773]: I0120 18:32:28.374036 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrpqv\" (UniqueName: \"kubernetes.io/projected/b3570207-5cb9-4481-a15a-d0bb9312a84b-kube-api-access-xrpqv\") pod \"apiserver-7bbb656c7d-k2x4p\" (UID: \"b3570207-5cb9-4481-a15a-d0bb9312a84b\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k2x4p" Jan 20 18:32:28 crc kubenswrapper[4773]: I0120 18:32:28.375555 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6bf74" Jan 20 18:32:28 crc kubenswrapper[4773]: I0120 18:32:28.384921 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-pfxs9" Jan 20 18:32:28 crc kubenswrapper[4773]: I0120 18:32:28.404167 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-c989h" Jan 20 18:32:28 crc kubenswrapper[4773]: I0120 18:32:28.404679 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7pps\" (UniqueName: \"kubernetes.io/projected/9d63bfb8-8ecc-43f3-8931-cc09c815c580-kube-api-access-k7pps\") pod \"openshift-apiserver-operator-796bbdcf4f-kpq6z\" (UID: \"9d63bfb8-8ecc-43f3-8931-cc09c815c580\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kpq6z" Jan 20 18:32:28 crc kubenswrapper[4773]: I0120 18:32:28.477238 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-l82hk"] Jan 20 18:32:28 crc kubenswrapper[4773]: I0120 18:32:28.492059 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-xpsls"] Jan 20 18:32:28 crc kubenswrapper[4773]: I0120 18:32:28.537530 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-xwc5v"] Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.430386 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-9nh6h" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.431067 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fcrzv" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.431206 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kw7s6" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.431421 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kpq6z" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.434361 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k2x4p" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.439218 4773 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-gnvxz container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.439268 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gnvxz" podUID="1d49ef4e-91fb-4b98-89d9-65358c718967" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.440717 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.440826 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f751520b-bf3d-4226-8850-4b3346c43a6f-ca-trust-extracted\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 
18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.440870 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f751520b-bf3d-4226-8850-4b3346c43a6f-registry-certificates\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.441108 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f751520b-bf3d-4226-8850-4b3346c43a6f-registry-tls\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.442657 4773 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-bjtnp container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.442682 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-bjtnp" podUID="419120bb-3f1b-4f21-adf5-ac057bd5dce6" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" Jan 20 18:32:29 crc kubenswrapper[4773]: E0120 18:32:29.452227 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 18:32:29.952195177 +0000 UTC m=+142.874008211 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kr4zh" (UID: "f751520b-bf3d-4226-8850-4b3346c43a6f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.539364 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gnvxz" podStartSLOduration=122.539333255 podStartE2EDuration="2m2.539333255s" podCreationTimestamp="2026-01-20 18:30:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:32:29.532193639 +0000 UTC m=+142.454006703" watchObservedRunningTime="2026-01-20 18:32:29.539333255 +0000 UTC m=+142.461146289" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.543498 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.543718 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cs785\" (UniqueName: \"kubernetes.io/projected/00a9d467-1154-4eae-b1e5-19dfbb214a80-kube-api-access-cs785\") pod \"router-default-5444994796-x95ml\" (UID: \"00a9d467-1154-4eae-b1e5-19dfbb214a80\") " pod="openshift-ingress/router-default-5444994796-x95ml" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.543775 4773 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/00a9d467-1154-4eae-b1e5-19dfbb214a80-metrics-certs\") pod \"router-default-5444994796-x95ml\" (UID: \"00a9d467-1154-4eae-b1e5-19dfbb214a80\") " pod="openshift-ingress/router-default-5444994796-x95ml" Jan 20 18:32:29 crc kubenswrapper[4773]: E0120 18:32:29.543860 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 18:32:30.043828576 +0000 UTC m=+142.965641610 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.543905 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/00a9d467-1154-4eae-b1e5-19dfbb214a80-service-ca-bundle\") pod \"router-default-5444994796-x95ml\" (UID: \"00a9d467-1154-4eae-b1e5-19dfbb214a80\") " pod="openshift-ingress/router-default-5444994796-x95ml" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.543968 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f751520b-bf3d-4226-8850-4b3346c43a6f-trusted-ca\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.543994 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/00a9d467-1154-4eae-b1e5-19dfbb214a80-stats-auth\") pod \"router-default-5444994796-x95ml\" (UID: \"00a9d467-1154-4eae-b1e5-19dfbb214a80\") " pod="openshift-ingress/router-default-5444994796-x95ml" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.544029 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.544071 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f751520b-bf3d-4226-8850-4b3346c43a6f-ca-trust-extracted\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.544129 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f751520b-bf3d-4226-8850-4b3346c43a6f-registry-certificates\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.544185 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/f751520b-bf3d-4226-8850-4b3346c43a6f-bound-sa-token\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.544212 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f751520b-bf3d-4226-8850-4b3346c43a6f-registry-tls\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.544238 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f751520b-bf3d-4226-8850-4b3346c43a6f-installation-pull-secrets\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.544260 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ss7d9\" (UniqueName: \"kubernetes.io/projected/f751520b-bf3d-4226-8850-4b3346c43a6f-kube-api-access-ss7d9\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.544311 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/00a9d467-1154-4eae-b1e5-19dfbb214a80-default-certificate\") pod \"router-default-5444994796-x95ml\" (UID: \"00a9d467-1154-4eae-b1e5-19dfbb214a80\") " pod="openshift-ingress/router-default-5444994796-x95ml" Jan 20 18:32:29 crc 
kubenswrapper[4773]: I0120 18:32:29.544765 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f751520b-bf3d-4226-8850-4b3346c43a6f-ca-trust-extracted\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:32:29 crc kubenswrapper[4773]: E0120 18:32:29.545676 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 18:32:30.045653861 +0000 UTC m=+142.967466895 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kr4zh" (UID: "f751520b-bf3d-4226-8850-4b3346c43a6f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.546078 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f751520b-bf3d-4226-8850-4b3346c43a6f-registry-certificates\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.553851 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f751520b-bf3d-4226-8850-4b3346c43a6f-registry-tls\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.649189 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.649491 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/00a9d467-1154-4eae-b1e5-19dfbb214a80-metrics-certs\") pod \"router-default-5444994796-x95ml\" (UID: \"00a9d467-1154-4eae-b1e5-19dfbb214a80\") " pod="openshift-ingress/router-default-5444994796-x95ml" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.649561 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2b703025-44fd-42d1-81fa-27ef31c9d2fb-cert\") pod \"ingress-canary-5t8h7\" (UID: \"2b703025-44fd-42d1-81fa-27ef31c9d2fb\") " pod="openshift-ingress-canary/ingress-canary-5t8h7" Jan 20 18:32:29 crc kubenswrapper[4773]: E0120 18:32:29.650151 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 18:32:30.150127657 +0000 UTC m=+143.071940691 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.650708 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/aab7784b-df99-4fd2-b2ea-3d2f6cdb098c-metrics-tls\") pod \"ingress-operator-5b745b69d9-llx65\" (UID: \"aab7784b-df99-4fd2-b2ea-3d2f6cdb098c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-llx65" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.650735 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/00a9d467-1154-4eae-b1e5-19dfbb214a80-service-ca-bundle\") pod \"router-default-5444994796-x95ml\" (UID: \"00a9d467-1154-4eae-b1e5-19dfbb214a80\") " pod="openshift-ingress/router-default-5444994796-x95ml" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.650753 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/12a1e676-da4c-46d2-a8f6-11dedde983fc-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-425qm\" (UID: \"12a1e676-da4c-46d2-a8f6-11dedde983fc\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-425qm" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.650785 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: 
\"kubernetes.io/secret/027ba59d-f4ba-430f-af60-a7f293dd2052-signing-key\") pod \"service-ca-9c57cc56f-ks6ps\" (UID: \"027ba59d-f4ba-430f-af60-a7f293dd2052\") " pod="openshift-service-ca/service-ca-9c57cc56f-ks6ps" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.650824 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/00a9d467-1154-4eae-b1e5-19dfbb214a80-stats-auth\") pod \"router-default-5444994796-x95ml\" (UID: \"00a9d467-1154-4eae-b1e5-19dfbb214a80\") " pod="openshift-ingress/router-default-5444994796-x95ml" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.650847 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/90ec3f02-fbee-4465-b262-28b2b475e2b9-srv-cert\") pod \"olm-operator-6b444d44fb-lqvwq\" (UID: \"90ec3f02-fbee-4465-b262-28b2b475e2b9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lqvwq" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.650865 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/c5d6700e-54f1-4f09-83d7-e85f66af8c85-mountpoint-dir\") pod \"csi-hostpathplugin-w8vpz\" (UID: \"c5d6700e-54f1-4f09-83d7-e85f66af8c85\") " pod="hostpath-provisioner/csi-hostpathplugin-w8vpz" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.650921 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.651028 4773 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/9cee99f1-8905-4089-be36-90af1426d834-images\") pod \"machine-config-operator-74547568cd-snlnq\" (UID: \"9cee99f1-8905-4089-be36-90af1426d834\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-snlnq" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.651069 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbbfh\" (UniqueName: \"kubernetes.io/projected/22f987ee-958e-41a1-8cf4-ef0da8212364-kube-api-access-gbbfh\") pod \"catalog-operator-68c6474976-mrfcm\" (UID: \"22f987ee-958e-41a1-8cf4-ef0da8212364\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mrfcm" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.651090 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvzks\" (UniqueName: \"kubernetes.io/projected/1e5ac136-d46c-45e3-9a5f-548ac22fac5c-kube-api-access-pvzks\") pod \"control-plane-machine-set-operator-78cbb6b69f-857hw\" (UID: \"1e5ac136-d46c-45e3-9a5f-548ac22fac5c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-857hw" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.651106 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4cpld\" (UniqueName: \"kubernetes.io/projected/90ec3f02-fbee-4465-b262-28b2b475e2b9-kube-api-access-4cpld\") pod \"olm-operator-6b444d44fb-lqvwq\" (UID: \"90ec3f02-fbee-4465-b262-28b2b475e2b9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lqvwq" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.651125 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/86f0fde2-da58-4350-ad67-cb29a2684875-config-volume\") pod \"dns-default-7j8nw\" (UID: \"86f0fde2-da58-4350-ad67-cb29a2684875\") " pod="openshift-dns/dns-default-7j8nw" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.651144 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1181b3a9-8cf9-46ad-9b41-62d32ffe7a85-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-ff9dd\" (UID: \"1181b3a9-8cf9-46ad-9b41-62d32ffe7a85\") " pod="openshift-marketplace/marketplace-operator-79b997595-ff9dd" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.651160 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvmf4\" (UniqueName: \"kubernetes.io/projected/aab7784b-df99-4fd2-b2ea-3d2f6cdb098c-kube-api-access-gvmf4\") pod \"ingress-operator-5b745b69d9-llx65\" (UID: \"aab7784b-df99-4fd2-b2ea-3d2f6cdb098c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-llx65" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.651186 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nx698\" (UniqueName: \"kubernetes.io/projected/2e18f8ef-154a-4077-9f3c-a6979d9cbe0b-kube-api-access-nx698\") pod \"machine-config-server-cv7zc\" (UID: \"2e18f8ef-154a-4077-9f3c-a6979d9cbe0b\") " pod="openshift-machine-config-operator/machine-config-server-cv7zc" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.651204 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wx9mw\" (UniqueName: \"kubernetes.io/projected/19da63c0-7e43-4bb0-a8fb-590722ea7cf2-kube-api-access-wx9mw\") pod \"kube-storage-version-migrator-operator-b67b599dd-87l5s\" (UID: \"19da63c0-7e43-4bb0-a8fb-590722ea7cf2\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-87l5s" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.651238 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9cee99f1-8905-4089-be36-90af1426d834-auth-proxy-config\") pod \"machine-config-operator-74547568cd-snlnq\" (UID: \"9cee99f1-8905-4089-be36-90af1426d834\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-snlnq" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.651289 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71dd00dd-f11c-43a8-b7a2-2416a1761d94-config\") pod \"service-ca-operator-777779d784-dfqb5\" (UID: \"71dd00dd-f11c-43a8-b7a2-2416a1761d94\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dfqb5" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.651305 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/12a1e676-da4c-46d2-a8f6-11dedde983fc-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-425qm\" (UID: \"12a1e676-da4c-46d2-a8f6-11dedde983fc\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-425qm" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.651347 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4n8m\" (UniqueName: \"kubernetes.io/projected/71dd00dd-f11c-43a8-b7a2-2416a1761d94-kube-api-access-w4n8m\") pod \"service-ca-operator-777779d784-dfqb5\" (UID: \"71dd00dd-f11c-43a8-b7a2-2416a1761d94\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dfqb5" Jan 20 18:32:29 crc kubenswrapper[4773]: E0120 18:32:29.651811 4773 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 18:32:30.151791957 +0000 UTC m=+143.073604981 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kr4zh" (UID: "f751520b-bf3d-4226-8850-4b3346c43a6f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.652538 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/c5d6700e-54f1-4f09-83d7-e85f66af8c85-csi-data-dir\") pod \"csi-hostpathplugin-w8vpz\" (UID: \"c5d6700e-54f1-4f09-83d7-e85f66af8c85\") " pod="hostpath-provisioner/csi-hostpathplugin-w8vpz" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.652579 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/2e18f8ef-154a-4077-9f3c-a6979d9cbe0b-certs\") pod \"machine-config-server-cv7zc\" (UID: \"2e18f8ef-154a-4077-9f3c-a6979d9cbe0b\") " pod="openshift-machine-config-operator/machine-config-server-cv7zc" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.652606 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/a27e80d9-dea2-4e87-90c8-1c69288cfa55-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-xwzh9\" (UID: \"a27e80d9-dea2-4e87-90c8-1c69288cfa55\") " 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xwzh9" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.652655 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12a1e676-da4c-46d2-a8f6-11dedde983fc-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-425qm\" (UID: \"12a1e676-da4c-46d2-a8f6-11dedde983fc\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-425qm" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.652683 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/86f0fde2-da58-4350-ad67-cb29a2684875-metrics-tls\") pod \"dns-default-7j8nw\" (UID: \"86f0fde2-da58-4350-ad67-cb29a2684875\") " pod="openshift-dns/dns-default-7j8nw" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.652796 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f751520b-bf3d-4226-8850-4b3346c43a6f-installation-pull-secrets\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.652873 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/00a9d467-1154-4eae-b1e5-19dfbb214a80-default-certificate\") pod \"router-default-5444994796-x95ml\" (UID: \"00a9d467-1154-4eae-b1e5-19dfbb214a80\") " pod="openshift-ingress/router-default-5444994796-x95ml" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.652910 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/90ec3f02-fbee-4465-b262-28b2b475e2b9-profile-collector-cert\") pod \"olm-operator-6b444d44fb-lqvwq\" (UID: \"90ec3f02-fbee-4465-b262-28b2b475e2b9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lqvwq" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.652953 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/14f80aee-1c1f-4beb-a280-3ac021e920c9-webhook-cert\") pod \"packageserver-d55dfcdfc-cfslv\" (UID: \"14f80aee-1c1f-4beb-a280-3ac021e920c9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cfslv" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.652999 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/22f987ee-958e-41a1-8cf4-ef0da8212364-profile-collector-cert\") pod \"catalog-operator-68c6474976-mrfcm\" (UID: \"22f987ee-958e-41a1-8cf4-ef0da8212364\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mrfcm" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.653020 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/c5d6700e-54f1-4f09-83d7-e85f66af8c85-registration-dir\") pod \"csi-hostpathplugin-w8vpz\" (UID: \"c5d6700e-54f1-4f09-83d7-e85f66af8c85\") " pod="hostpath-provisioner/csi-hostpathplugin-w8vpz" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.653787 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/c5d6700e-54f1-4f09-83d7-e85f66af8c85-socket-dir\") pod \"csi-hostpathplugin-w8vpz\" (UID: \"c5d6700e-54f1-4f09-83d7-e85f66af8c85\") " pod="hostpath-provisioner/csi-hostpathplugin-w8vpz" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 
18:32:29.653833 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/deccf4fe-9230-4e96-b16c-a2ed0d2235a7-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-x6fwb\" (UID: \"deccf4fe-9230-4e96-b16c-a2ed0d2235a7\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-x6fwb" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.653867 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/aab7784b-df99-4fd2-b2ea-3d2f6cdb098c-bound-sa-token\") pod \"ingress-operator-5b745b69d9-llx65\" (UID: \"aab7784b-df99-4fd2-b2ea-3d2f6cdb098c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-llx65" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.654021 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/14f80aee-1c1f-4beb-a280-3ac021e920c9-tmpfs\") pod \"packageserver-d55dfcdfc-cfslv\" (UID: \"14f80aee-1c1f-4beb-a280-3ac021e920c9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cfslv" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.654042 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/14f80aee-1c1f-4beb-a280-3ac021e920c9-apiservice-cert\") pod \"packageserver-d55dfcdfc-cfslv\" (UID: \"14f80aee-1c1f-4beb-a280-3ac021e920c9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cfslv" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.654087 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f751520b-bf3d-4226-8850-4b3346c43a6f-trusted-ca\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: 
\"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.654118 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/22f987ee-958e-41a1-8cf4-ef0da8212364-srv-cert\") pod \"catalog-operator-68c6474976-mrfcm\" (UID: \"22f987ee-958e-41a1-8cf4-ef0da8212364\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mrfcm" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.654142 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpbxx\" (UniqueName: \"kubernetes.io/projected/deccf4fe-9230-4e96-b16c-a2ed0d2235a7-kube-api-access-kpbxx\") pod \"multus-admission-controller-857f4d67dd-x6fwb\" (UID: \"deccf4fe-9230-4e96-b16c-a2ed0d2235a7\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-x6fwb" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.656426 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f751520b-bf3d-4226-8850-4b3346c43a6f-installation-pull-secrets\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.656888 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/2e18f8ef-154a-4077-9f3c-a6979d9cbe0b-node-bootstrap-token\") pod \"machine-config-server-cv7zc\" (UID: \"2e18f8ef-154a-4077-9f3c-a6979d9cbe0b\") " pod="openshift-machine-config-operator/machine-config-server-cv7zc" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.657618 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f751520b-bf3d-4226-8850-4b3346c43a6f-trusted-ca\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.657975 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/007a1e5a-0e90-44d1-b19d-e92154fb6a3d-secret-volume\") pod \"collect-profiles-29482230-pqppn\" (UID: \"007a1e5a-0e90-44d1-b19d-e92154fb6a3d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482230-pqppn" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.658011 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfrvq\" (UniqueName: \"kubernetes.io/projected/1181b3a9-8cf9-46ad-9b41-62d32ffe7a85-kube-api-access-kfrvq\") pod \"marketplace-operator-79b997595-ff9dd\" (UID: \"1181b3a9-8cf9-46ad-9b41-62d32ffe7a85\") " pod="openshift-marketplace/marketplace-operator-79b997595-ff9dd" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.658040 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/c5d6700e-54f1-4f09-83d7-e85f66af8c85-plugins-dir\") pod \"csi-hostpathplugin-w8vpz\" (UID: \"c5d6700e-54f1-4f09-83d7-e85f66af8c85\") " pod="hostpath-provisioner/csi-hostpathplugin-w8vpz" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.659078 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxslf\" (UniqueName: \"kubernetes.io/projected/007a1e5a-0e90-44d1-b19d-e92154fb6a3d-kube-api-access-zxslf\") pod \"collect-profiles-29482230-pqppn\" (UID: \"007a1e5a-0e90-44d1-b19d-e92154fb6a3d\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29482230-pqppn" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.659200 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fcmz\" (UniqueName: \"kubernetes.io/projected/70a421fb-8ee0-4365-b4d3-c8ddd6f6c01d-kube-api-access-2fcmz\") pod \"machine-config-controller-84d6567774-6qw48\" (UID: \"70a421fb-8ee0-4365-b4d3-c8ddd6f6c01d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6qw48" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.659250 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1181b3a9-8cf9-46ad-9b41-62d32ffe7a85-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-ff9dd\" (UID: \"1181b3a9-8cf9-46ad-9b41-62d32ffe7a85\") " pod="openshift-marketplace/marketplace-operator-79b997595-ff9dd" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.659289 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/007a1e5a-0e90-44d1-b19d-e92154fb6a3d-config-volume\") pod \"collect-profiles-29482230-pqppn\" (UID: \"007a1e5a-0e90-44d1-b19d-e92154fb6a3d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482230-pqppn" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.659478 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvzx2\" (UniqueName: \"kubernetes.io/projected/14f80aee-1c1f-4beb-a280-3ac021e920c9-kube-api-access-gvzx2\") pod \"packageserver-d55dfcdfc-cfslv\" (UID: \"14f80aee-1c1f-4beb-a280-3ac021e920c9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cfslv" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.659649 4773 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfqbt\" (UniqueName: \"kubernetes.io/projected/027ba59d-f4ba-430f-af60-a7f293dd2052-kube-api-access-wfqbt\") pod \"service-ca-9c57cc56f-ks6ps\" (UID: \"027ba59d-f4ba-430f-af60-a7f293dd2052\") " pod="openshift-service-ca/service-ca-9c57cc56f-ks6ps" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.659891 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t86dq\" (UniqueName: \"kubernetes.io/projected/c5d6700e-54f1-4f09-83d7-e85f66af8c85-kube-api-access-t86dq\") pod \"csi-hostpathplugin-w8vpz\" (UID: \"c5d6700e-54f1-4f09-83d7-e85f66af8c85\") " pod="hostpath-provisioner/csi-hostpathplugin-w8vpz" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.660366 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f751520b-bf3d-4226-8850-4b3346c43a6f-bound-sa-token\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.662330 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-56md4\" (UniqueName: \"kubernetes.io/projected/a27e80d9-dea2-4e87-90c8-1c69288cfa55-kube-api-access-56md4\") pod \"package-server-manager-789f6589d5-xwzh9\" (UID: \"a27e80d9-dea2-4e87-90c8-1c69288cfa55\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xwzh9" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.664017 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/aab7784b-df99-4fd2-b2ea-3d2f6cdb098c-trusted-ca\") pod \"ingress-operator-5b745b69d9-llx65\" (UID: \"aab7784b-df99-4fd2-b2ea-3d2f6cdb098c\") 
" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-llx65" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.664460 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4r74z\" (UniqueName: \"kubernetes.io/projected/86f0fde2-da58-4350-ad67-cb29a2684875-kube-api-access-4r74z\") pod \"dns-default-7j8nw\" (UID: \"86f0fde2-da58-4350-ad67-cb29a2684875\") " pod="openshift-dns/dns-default-7j8nw" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.664554 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/71dd00dd-f11c-43a8-b7a2-2416a1761d94-serving-cert\") pod \"service-ca-operator-777779d784-dfqb5\" (UID: \"71dd00dd-f11c-43a8-b7a2-2416a1761d94\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dfqb5" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.664702 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/19da63c0-7e43-4bb0-a8fb-590722ea7cf2-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-87l5s\" (UID: \"19da63c0-7e43-4bb0-a8fb-590722ea7cf2\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-87l5s" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.665188 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ss7d9\" (UniqueName: \"kubernetes.io/projected/f751520b-bf3d-4226-8850-4b3346c43a6f-kube-api-access-ss7d9\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.665237 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/1e5ac136-d46c-45e3-9a5f-548ac22fac5c-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-857hw\" (UID: \"1e5ac136-d46c-45e3-9a5f-548ac22fac5c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-857hw" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.665273 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fc8nd\" (UniqueName: \"kubernetes.io/projected/9cee99f1-8905-4089-be36-90af1426d834-kube-api-access-fc8nd\") pod \"machine-config-operator-74547568cd-snlnq\" (UID: \"9cee99f1-8905-4089-be36-90af1426d834\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-snlnq" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.665309 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/70a421fb-8ee0-4365-b4d3-c8ddd6f6c01d-proxy-tls\") pod \"machine-config-controller-84d6567774-6qw48\" (UID: \"70a421fb-8ee0-4365-b4d3-c8ddd6f6c01d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6qw48" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.665337 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92kdg\" (UniqueName: \"kubernetes.io/projected/2b703025-44fd-42d1-81fa-27ef31c9d2fb-kube-api-access-92kdg\") pod \"ingress-canary-5t8h7\" (UID: \"2b703025-44fd-42d1-81fa-27ef31c9d2fb\") " pod="openshift-ingress-canary/ingress-canary-5t8h7" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.665389 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/027ba59d-f4ba-430f-af60-a7f293dd2052-signing-cabundle\") pod 
\"service-ca-9c57cc56f-ks6ps\" (UID: \"027ba59d-f4ba-430f-af60-a7f293dd2052\") " pod="openshift-service-ca/service-ca-9c57cc56f-ks6ps" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.665468 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9cee99f1-8905-4089-be36-90af1426d834-proxy-tls\") pod \"machine-config-operator-74547568cd-snlnq\" (UID: \"9cee99f1-8905-4089-be36-90af1426d834\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-snlnq" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.665528 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19da63c0-7e43-4bb0-a8fb-590722ea7cf2-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-87l5s\" (UID: \"19da63c0-7e43-4bb0-a8fb-590722ea7cf2\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-87l5s" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.665639 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/70a421fb-8ee0-4365-b4d3-c8ddd6f6c01d-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-6qw48\" (UID: \"70a421fb-8ee0-4365-b4d3-c8ddd6f6c01d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6qw48" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.666497 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cs785\" (UniqueName: \"kubernetes.io/projected/00a9d467-1154-4eae-b1e5-19dfbb214a80-kube-api-access-cs785\") pod \"router-default-5444994796-x95ml\" (UID: \"00a9d467-1154-4eae-b1e5-19dfbb214a80\") " pod="openshift-ingress/router-default-5444994796-x95ml" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 
18:32:29.678154 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f751520b-bf3d-4226-8850-4b3346c43a6f-bound-sa-token\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.695994 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/00a9d467-1154-4eae-b1e5-19dfbb214a80-service-ca-bundle\") pod \"router-default-5444994796-x95ml\" (UID: \"00a9d467-1154-4eae-b1e5-19dfbb214a80\") " pod="openshift-ingress/router-default-5444994796-x95ml" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.696974 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/00a9d467-1154-4eae-b1e5-19dfbb214a80-metrics-certs\") pod \"router-default-5444994796-x95ml\" (UID: \"00a9d467-1154-4eae-b1e5-19dfbb214a80\") " pod="openshift-ingress/router-default-5444994796-x95ml" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.698020 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ss7d9\" (UniqueName: \"kubernetes.io/projected/f751520b-bf3d-4226-8850-4b3346c43a6f-kube-api-access-ss7d9\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.698400 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/00a9d467-1154-4eae-b1e5-19dfbb214a80-stats-auth\") pod \"router-default-5444994796-x95ml\" (UID: \"00a9d467-1154-4eae-b1e5-19dfbb214a80\") " pod="openshift-ingress/router-default-5444994796-x95ml" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 
18:32:29.698659 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/00a9d467-1154-4eae-b1e5-19dfbb214a80-default-certificate\") pod \"router-default-5444994796-x95ml\" (UID: \"00a9d467-1154-4eae-b1e5-19dfbb214a80\") " pod="openshift-ingress/router-default-5444994796-x95ml" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.701219 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cs785\" (UniqueName: \"kubernetes.io/projected/00a9d467-1154-4eae-b1e5-19dfbb214a80-kube-api-access-cs785\") pod \"router-default-5444994796-x95ml\" (UID: \"00a9d467-1154-4eae-b1e5-19dfbb214a80\") " pod="openshift-ingress/router-default-5444994796-x95ml" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.708917 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-bjtnp" podStartSLOduration=123.708834413 podStartE2EDuration="2m3.708834413s" podCreationTimestamp="2026-01-20 18:30:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:32:29.706451715 +0000 UTC m=+142.628264779" watchObservedRunningTime="2026-01-20 18:32:29.708834413 +0000 UTC m=+142.630647457" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.772392 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 18:32:29 crc kubenswrapper[4773]: E0120 18:32:29.772850 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 18:32:30.27282677 +0000 UTC m=+143.194639794 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.772960 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/90ec3f02-fbee-4465-b262-28b2b475e2b9-srv-cert\") pod \"olm-operator-6b444d44fb-lqvwq\" (UID: \"90ec3f02-fbee-4465-b262-28b2b475e2b9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lqvwq" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.773020 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/c5d6700e-54f1-4f09-83d7-e85f66af8c85-mountpoint-dir\") pod \"csi-hostpathplugin-w8vpz\" (UID: \"c5d6700e-54f1-4f09-83d7-e85f66af8c85\") " pod="hostpath-provisioner/csi-hostpathplugin-w8vpz" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.773054 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.773090 4773 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/9cee99f1-8905-4089-be36-90af1426d834-images\") pod \"machine-config-operator-74547568cd-snlnq\" (UID: \"9cee99f1-8905-4089-be36-90af1426d834\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-snlnq" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.773113 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gbbfh\" (UniqueName: \"kubernetes.io/projected/22f987ee-958e-41a1-8cf4-ef0da8212364-kube-api-access-gbbfh\") pod \"catalog-operator-68c6474976-mrfcm\" (UID: \"22f987ee-958e-41a1-8cf4-ef0da8212364\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mrfcm" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.773141 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/86f0fde2-da58-4350-ad67-cb29a2684875-config-volume\") pod \"dns-default-7j8nw\" (UID: \"86f0fde2-da58-4350-ad67-cb29a2684875\") " pod="openshift-dns/dns-default-7j8nw" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.773173 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvzks\" (UniqueName: \"kubernetes.io/projected/1e5ac136-d46c-45e3-9a5f-548ac22fac5c-kube-api-access-pvzks\") pod \"control-plane-machine-set-operator-78cbb6b69f-857hw\" (UID: \"1e5ac136-d46c-45e3-9a5f-548ac22fac5c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-857hw" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.773214 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4cpld\" (UniqueName: \"kubernetes.io/projected/90ec3f02-fbee-4465-b262-28b2b475e2b9-kube-api-access-4cpld\") pod \"olm-operator-6b444d44fb-lqvwq\" (UID: \"90ec3f02-fbee-4465-b262-28b2b475e2b9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lqvwq" 
Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.773238 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1181b3a9-8cf9-46ad-9b41-62d32ffe7a85-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-ff9dd\" (UID: \"1181b3a9-8cf9-46ad-9b41-62d32ffe7a85\") " pod="openshift-marketplace/marketplace-operator-79b997595-ff9dd" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.773254 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvmf4\" (UniqueName: \"kubernetes.io/projected/aab7784b-df99-4fd2-b2ea-3d2f6cdb098c-kube-api-access-gvmf4\") pod \"ingress-operator-5b745b69d9-llx65\" (UID: \"aab7784b-df99-4fd2-b2ea-3d2f6cdb098c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-llx65" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.773271 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wx9mw\" (UniqueName: \"kubernetes.io/projected/19da63c0-7e43-4bb0-a8fb-590722ea7cf2-kube-api-access-wx9mw\") pod \"kube-storage-version-migrator-operator-b67b599dd-87l5s\" (UID: \"19da63c0-7e43-4bb0-a8fb-590722ea7cf2\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-87l5s" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.773288 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nx698\" (UniqueName: \"kubernetes.io/projected/2e18f8ef-154a-4077-9f3c-a6979d9cbe0b-kube-api-access-nx698\") pod \"machine-config-server-cv7zc\" (UID: \"2e18f8ef-154a-4077-9f3c-a6979d9cbe0b\") " pod="openshift-machine-config-operator/machine-config-server-cv7zc" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.773307 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/9cee99f1-8905-4089-be36-90af1426d834-auth-proxy-config\") pod \"machine-config-operator-74547568cd-snlnq\" (UID: \"9cee99f1-8905-4089-be36-90af1426d834\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-snlnq" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.773327 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71dd00dd-f11c-43a8-b7a2-2416a1761d94-config\") pod \"service-ca-operator-777779d784-dfqb5\" (UID: \"71dd00dd-f11c-43a8-b7a2-2416a1761d94\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dfqb5" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.773347 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/12a1e676-da4c-46d2-a8f6-11dedde983fc-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-425qm\" (UID: \"12a1e676-da4c-46d2-a8f6-11dedde983fc\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-425qm" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.773379 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4n8m\" (UniqueName: \"kubernetes.io/projected/71dd00dd-f11c-43a8-b7a2-2416a1761d94-kube-api-access-w4n8m\") pod \"service-ca-operator-777779d784-dfqb5\" (UID: \"71dd00dd-f11c-43a8-b7a2-2416a1761d94\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dfqb5" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.773405 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/c5d6700e-54f1-4f09-83d7-e85f66af8c85-csi-data-dir\") pod \"csi-hostpathplugin-w8vpz\" (UID: \"c5d6700e-54f1-4f09-83d7-e85f66af8c85\") " pod="hostpath-provisioner/csi-hostpathplugin-w8vpz" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 
18:32:29.773429 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12a1e676-da4c-46d2-a8f6-11dedde983fc-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-425qm\" (UID: \"12a1e676-da4c-46d2-a8f6-11dedde983fc\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-425qm" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.773446 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/86f0fde2-da58-4350-ad67-cb29a2684875-metrics-tls\") pod \"dns-default-7j8nw\" (UID: \"86f0fde2-da58-4350-ad67-cb29a2684875\") " pod="openshift-dns/dns-default-7j8nw" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.773464 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/2e18f8ef-154a-4077-9f3c-a6979d9cbe0b-certs\") pod \"machine-config-server-cv7zc\" (UID: \"2e18f8ef-154a-4077-9f3c-a6979d9cbe0b\") " pod="openshift-machine-config-operator/machine-config-server-cv7zc" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.773484 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/a27e80d9-dea2-4e87-90c8-1c69288cfa55-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-xwzh9\" (UID: \"a27e80d9-dea2-4e87-90c8-1c69288cfa55\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xwzh9" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.773511 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/90ec3f02-fbee-4465-b262-28b2b475e2b9-profile-collector-cert\") pod \"olm-operator-6b444d44fb-lqvwq\" (UID: \"90ec3f02-fbee-4465-b262-28b2b475e2b9\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lqvwq" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.773529 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/14f80aee-1c1f-4beb-a280-3ac021e920c9-webhook-cert\") pod \"packageserver-d55dfcdfc-cfslv\" (UID: \"14f80aee-1c1f-4beb-a280-3ac021e920c9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cfslv" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.773546 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/c5d6700e-54f1-4f09-83d7-e85f66af8c85-registration-dir\") pod \"csi-hostpathplugin-w8vpz\" (UID: \"c5d6700e-54f1-4f09-83d7-e85f66af8c85\") " pod="hostpath-provisioner/csi-hostpathplugin-w8vpz" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.773561 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/22f987ee-958e-41a1-8cf4-ef0da8212364-profile-collector-cert\") pod \"catalog-operator-68c6474976-mrfcm\" (UID: \"22f987ee-958e-41a1-8cf4-ef0da8212364\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mrfcm" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.773577 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/c5d6700e-54f1-4f09-83d7-e85f66af8c85-socket-dir\") pod \"csi-hostpathplugin-w8vpz\" (UID: \"c5d6700e-54f1-4f09-83d7-e85f66af8c85\") " pod="hostpath-provisioner/csi-hostpathplugin-w8vpz" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.773591 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/deccf4fe-9230-4e96-b16c-a2ed0d2235a7-webhook-certs\") pod 
\"multus-admission-controller-857f4d67dd-x6fwb\" (UID: \"deccf4fe-9230-4e96-b16c-a2ed0d2235a7\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-x6fwb" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.773607 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/aab7784b-df99-4fd2-b2ea-3d2f6cdb098c-bound-sa-token\") pod \"ingress-operator-5b745b69d9-llx65\" (UID: \"aab7784b-df99-4fd2-b2ea-3d2f6cdb098c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-llx65" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.773642 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/14f80aee-1c1f-4beb-a280-3ac021e920c9-apiservice-cert\") pod \"packageserver-d55dfcdfc-cfslv\" (UID: \"14f80aee-1c1f-4beb-a280-3ac021e920c9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cfslv" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.773662 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/14f80aee-1c1f-4beb-a280-3ac021e920c9-tmpfs\") pod \"packageserver-d55dfcdfc-cfslv\" (UID: \"14f80aee-1c1f-4beb-a280-3ac021e920c9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cfslv" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.773683 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kpbxx\" (UniqueName: \"kubernetes.io/projected/deccf4fe-9230-4e96-b16c-a2ed0d2235a7-kube-api-access-kpbxx\") pod \"multus-admission-controller-857f4d67dd-x6fwb\" (UID: \"deccf4fe-9230-4e96-b16c-a2ed0d2235a7\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-x6fwb" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.773701 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" 
(UniqueName: \"kubernetes.io/secret/22f987ee-958e-41a1-8cf4-ef0da8212364-srv-cert\") pod \"catalog-operator-68c6474976-mrfcm\" (UID: \"22f987ee-958e-41a1-8cf4-ef0da8212364\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mrfcm" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.773719 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/2e18f8ef-154a-4077-9f3c-a6979d9cbe0b-node-bootstrap-token\") pod \"machine-config-server-cv7zc\" (UID: \"2e18f8ef-154a-4077-9f3c-a6979d9cbe0b\") " pod="openshift-machine-config-operator/machine-config-server-cv7zc" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.773741 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfrvq\" (UniqueName: \"kubernetes.io/projected/1181b3a9-8cf9-46ad-9b41-62d32ffe7a85-kube-api-access-kfrvq\") pod \"marketplace-operator-79b997595-ff9dd\" (UID: \"1181b3a9-8cf9-46ad-9b41-62d32ffe7a85\") " pod="openshift-marketplace/marketplace-operator-79b997595-ff9dd" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.773759 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/c5d6700e-54f1-4f09-83d7-e85f66af8c85-plugins-dir\") pod \"csi-hostpathplugin-w8vpz\" (UID: \"c5d6700e-54f1-4f09-83d7-e85f66af8c85\") " pod="hostpath-provisioner/csi-hostpathplugin-w8vpz" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.773776 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/007a1e5a-0e90-44d1-b19d-e92154fb6a3d-secret-volume\") pod \"collect-profiles-29482230-pqppn\" (UID: \"007a1e5a-0e90-44d1-b19d-e92154fb6a3d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482230-pqppn" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.773797 4773 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zxslf\" (UniqueName: \"kubernetes.io/projected/007a1e5a-0e90-44d1-b19d-e92154fb6a3d-kube-api-access-zxslf\") pod \"collect-profiles-29482230-pqppn\" (UID: \"007a1e5a-0e90-44d1-b19d-e92154fb6a3d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482230-pqppn" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.773819 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1181b3a9-8cf9-46ad-9b41-62d32ffe7a85-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-ff9dd\" (UID: \"1181b3a9-8cf9-46ad-9b41-62d32ffe7a85\") " pod="openshift-marketplace/marketplace-operator-79b997595-ff9dd" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.773838 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2fcmz\" (UniqueName: \"kubernetes.io/projected/70a421fb-8ee0-4365-b4d3-c8ddd6f6c01d-kube-api-access-2fcmz\") pod \"machine-config-controller-84d6567774-6qw48\" (UID: \"70a421fb-8ee0-4365-b4d3-c8ddd6f6c01d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6qw48" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.773865 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/007a1e5a-0e90-44d1-b19d-e92154fb6a3d-config-volume\") pod \"collect-profiles-29482230-pqppn\" (UID: \"007a1e5a-0e90-44d1-b19d-e92154fb6a3d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482230-pqppn" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.773879 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvzx2\" (UniqueName: \"kubernetes.io/projected/14f80aee-1c1f-4beb-a280-3ac021e920c9-kube-api-access-gvzx2\") pod \"packageserver-d55dfcdfc-cfslv\" (UID: 
\"14f80aee-1c1f-4beb-a280-3ac021e920c9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cfslv" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.773902 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wfqbt\" (UniqueName: \"kubernetes.io/projected/027ba59d-f4ba-430f-af60-a7f293dd2052-kube-api-access-wfqbt\") pod \"service-ca-9c57cc56f-ks6ps\" (UID: \"027ba59d-f4ba-430f-af60-a7f293dd2052\") " pod="openshift-service-ca/service-ca-9c57cc56f-ks6ps" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.773917 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t86dq\" (UniqueName: \"kubernetes.io/projected/c5d6700e-54f1-4f09-83d7-e85f66af8c85-kube-api-access-t86dq\") pod \"csi-hostpathplugin-w8vpz\" (UID: \"c5d6700e-54f1-4f09-83d7-e85f66af8c85\") " pod="hostpath-provisioner/csi-hostpathplugin-w8vpz" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.773952 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-56md4\" (UniqueName: \"kubernetes.io/projected/a27e80d9-dea2-4e87-90c8-1c69288cfa55-kube-api-access-56md4\") pod \"package-server-manager-789f6589d5-xwzh9\" (UID: \"a27e80d9-dea2-4e87-90c8-1c69288cfa55\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xwzh9" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.773967 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/aab7784b-df99-4fd2-b2ea-3d2f6cdb098c-trusted-ca\") pod \"ingress-operator-5b745b69d9-llx65\" (UID: \"aab7784b-df99-4fd2-b2ea-3d2f6cdb098c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-llx65" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.773981 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4r74z\" (UniqueName: 
\"kubernetes.io/projected/86f0fde2-da58-4350-ad67-cb29a2684875-kube-api-access-4r74z\") pod \"dns-default-7j8nw\" (UID: \"86f0fde2-da58-4350-ad67-cb29a2684875\") " pod="openshift-dns/dns-default-7j8nw" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.773997 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/71dd00dd-f11c-43a8-b7a2-2416a1761d94-serving-cert\") pod \"service-ca-operator-777779d784-dfqb5\" (UID: \"71dd00dd-f11c-43a8-b7a2-2416a1761d94\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dfqb5" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.774013 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/19da63c0-7e43-4bb0-a8fb-590722ea7cf2-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-87l5s\" (UID: \"19da63c0-7e43-4bb0-a8fb-590722ea7cf2\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-87l5s" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.774030 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/70a421fb-8ee0-4365-b4d3-c8ddd6f6c01d-proxy-tls\") pod \"machine-config-controller-84d6567774-6qw48\" (UID: \"70a421fb-8ee0-4365-b4d3-c8ddd6f6c01d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6qw48" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.774045 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92kdg\" (UniqueName: \"kubernetes.io/projected/2b703025-44fd-42d1-81fa-27ef31c9d2fb-kube-api-access-92kdg\") pod \"ingress-canary-5t8h7\" (UID: \"2b703025-44fd-42d1-81fa-27ef31c9d2fb\") " pod="openshift-ingress-canary/ingress-canary-5t8h7" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.774065 4773 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/1e5ac136-d46c-45e3-9a5f-548ac22fac5c-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-857hw\" (UID: \"1e5ac136-d46c-45e3-9a5f-548ac22fac5c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-857hw" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.774085 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fc8nd\" (UniqueName: \"kubernetes.io/projected/9cee99f1-8905-4089-be36-90af1426d834-kube-api-access-fc8nd\") pod \"machine-config-operator-74547568cd-snlnq\" (UID: \"9cee99f1-8905-4089-be36-90af1426d834\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-snlnq" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.774104 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/027ba59d-f4ba-430f-af60-a7f293dd2052-signing-cabundle\") pod \"service-ca-9c57cc56f-ks6ps\" (UID: \"027ba59d-f4ba-430f-af60-a7f293dd2052\") " pod="openshift-service-ca/service-ca-9c57cc56f-ks6ps" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.774119 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9cee99f1-8905-4089-be36-90af1426d834-proxy-tls\") pod \"machine-config-operator-74547568cd-snlnq\" (UID: \"9cee99f1-8905-4089-be36-90af1426d834\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-snlnq" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.774133 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19da63c0-7e43-4bb0-a8fb-590722ea7cf2-config\") pod 
\"kube-storage-version-migrator-operator-b67b599dd-87l5s\" (UID: \"19da63c0-7e43-4bb0-a8fb-590722ea7cf2\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-87l5s" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.774150 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/70a421fb-8ee0-4365-b4d3-c8ddd6f6c01d-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-6qw48\" (UID: \"70a421fb-8ee0-4365-b4d3-c8ddd6f6c01d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6qw48" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.774190 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/aab7784b-df99-4fd2-b2ea-3d2f6cdb098c-metrics-tls\") pod \"ingress-operator-5b745b69d9-llx65\" (UID: \"aab7784b-df99-4fd2-b2ea-3d2f6cdb098c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-llx65" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.774205 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2b703025-44fd-42d1-81fa-27ef31c9d2fb-cert\") pod \"ingress-canary-5t8h7\" (UID: \"2b703025-44fd-42d1-81fa-27ef31c9d2fb\") " pod="openshift-ingress-canary/ingress-canary-5t8h7" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.774227 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/12a1e676-da4c-46d2-a8f6-11dedde983fc-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-425qm\" (UID: \"12a1e676-da4c-46d2-a8f6-11dedde983fc\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-425qm" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.774244 4773 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/027ba59d-f4ba-430f-af60-a7f293dd2052-signing-key\") pod \"service-ca-9c57cc56f-ks6ps\" (UID: \"027ba59d-f4ba-430f-af60-a7f293dd2052\") " pod="openshift-service-ca/service-ca-9c57cc56f-ks6ps" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.774727 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/c5d6700e-54f1-4f09-83d7-e85f66af8c85-mountpoint-dir\") pod \"csi-hostpathplugin-w8vpz\" (UID: \"c5d6700e-54f1-4f09-83d7-e85f66af8c85\") " pod="hostpath-provisioner/csi-hostpathplugin-w8vpz" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.775265 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/9cee99f1-8905-4089-be36-90af1426d834-images\") pod \"machine-config-operator-74547568cd-snlnq\" (UID: \"9cee99f1-8905-4089-be36-90af1426d834\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-snlnq" Jan 20 18:32:29 crc kubenswrapper[4773]: E0120 18:32:29.775495 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 18:32:30.275486346 +0000 UTC m=+143.197299360 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kr4zh" (UID: "f751520b-bf3d-4226-8850-4b3346c43a6f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.776434 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9cee99f1-8905-4089-be36-90af1426d834-auth-proxy-config\") pod \"machine-config-operator-74547568cd-snlnq\" (UID: \"9cee99f1-8905-4089-be36-90af1426d834\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-snlnq" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.777035 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/c5d6700e-54f1-4f09-83d7-e85f66af8c85-socket-dir\") pod \"csi-hostpathplugin-w8vpz\" (UID: \"c5d6700e-54f1-4f09-83d7-e85f66af8c85\") " pod="hostpath-provisioner/csi-hostpathplugin-w8vpz" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.777118 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/c5d6700e-54f1-4f09-83d7-e85f66af8c85-registration-dir\") pod \"csi-hostpathplugin-w8vpz\" (UID: \"c5d6700e-54f1-4f09-83d7-e85f66af8c85\") " pod="hostpath-provisioner/csi-hostpathplugin-w8vpz" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.777784 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/86f0fde2-da58-4350-ad67-cb29a2684875-config-volume\") pod \"dns-default-7j8nw\" (UID: \"86f0fde2-da58-4350-ad67-cb29a2684875\") " 
pod="openshift-dns/dns-default-7j8nw" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.778167 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/c5d6700e-54f1-4f09-83d7-e85f66af8c85-plugins-dir\") pod \"csi-hostpathplugin-w8vpz\" (UID: \"c5d6700e-54f1-4f09-83d7-e85f66af8c85\") " pod="hostpath-provisioner/csi-hostpathplugin-w8vpz" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.785670 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1181b3a9-8cf9-46ad-9b41-62d32ffe7a85-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-ff9dd\" (UID: \"1181b3a9-8cf9-46ad-9b41-62d32ffe7a85\") " pod="openshift-marketplace/marketplace-operator-79b997595-ff9dd" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.787517 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/c5d6700e-54f1-4f09-83d7-e85f66af8c85-csi-data-dir\") pod \"csi-hostpathplugin-w8vpz\" (UID: \"c5d6700e-54f1-4f09-83d7-e85f66af8c85\") " pod="hostpath-provisioner/csi-hostpathplugin-w8vpz" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.787687 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/2e18f8ef-154a-4077-9f3c-a6979d9cbe0b-node-bootstrap-token\") pod \"machine-config-server-cv7zc\" (UID: \"2e18f8ef-154a-4077-9f3c-a6979d9cbe0b\") " pod="openshift-machine-config-operator/machine-config-server-cv7zc" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.788985 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/027ba59d-f4ba-430f-af60-a7f293dd2052-signing-cabundle\") pod \"service-ca-9c57cc56f-ks6ps\" (UID: \"027ba59d-f4ba-430f-af60-a7f293dd2052\") " 
pod="openshift-service-ca/service-ca-9c57cc56f-ks6ps" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.791649 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71dd00dd-f11c-43a8-b7a2-2416a1761d94-config\") pod \"service-ca-operator-777779d784-dfqb5\" (UID: \"71dd00dd-f11c-43a8-b7a2-2416a1761d94\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dfqb5" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.792110 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/14f80aee-1c1f-4beb-a280-3ac021e920c9-webhook-cert\") pod \"packageserver-d55dfcdfc-cfslv\" (UID: \"14f80aee-1c1f-4beb-a280-3ac021e920c9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cfslv" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.792673 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/007a1e5a-0e90-44d1-b19d-e92154fb6a3d-config-volume\") pod \"collect-profiles-29482230-pqppn\" (UID: \"007a1e5a-0e90-44d1-b19d-e92154fb6a3d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482230-pqppn" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.793188 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12a1e676-da4c-46d2-a8f6-11dedde983fc-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-425qm\" (UID: \"12a1e676-da4c-46d2-a8f6-11dedde983fc\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-425qm" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.793807 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/a27e80d9-dea2-4e87-90c8-1c69288cfa55-package-server-manager-serving-cert\") pod 
\"package-server-manager-789f6589d5-xwzh9\" (UID: \"a27e80d9-dea2-4e87-90c8-1c69288cfa55\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xwzh9" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.794625 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/90ec3f02-fbee-4465-b262-28b2b475e2b9-srv-cert\") pod \"olm-operator-6b444d44fb-lqvwq\" (UID: \"90ec3f02-fbee-4465-b262-28b2b475e2b9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lqvwq" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.795309 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/deccf4fe-9230-4e96-b16c-a2ed0d2235a7-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-x6fwb\" (UID: \"deccf4fe-9230-4e96-b16c-a2ed0d2235a7\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-x6fwb" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.795717 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/19da63c0-7e43-4bb0-a8fb-590722ea7cf2-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-87l5s\" (UID: \"19da63c0-7e43-4bb0-a8fb-590722ea7cf2\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-87l5s" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.795989 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/86f0fde2-da58-4350-ad67-cb29a2684875-metrics-tls\") pod \"dns-default-7j8nw\" (UID: \"86f0fde2-da58-4350-ad67-cb29a2684875\") " pod="openshift-dns/dns-default-7j8nw" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.806690 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/aab7784b-df99-4fd2-b2ea-3d2f6cdb098c-trusted-ca\") pod \"ingress-operator-5b745b69d9-llx65\" (UID: \"aab7784b-df99-4fd2-b2ea-3d2f6cdb098c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-llx65" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.810558 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1181b3a9-8cf9-46ad-9b41-62d32ffe7a85-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-ff9dd\" (UID: \"1181b3a9-8cf9-46ad-9b41-62d32ffe7a85\") " pod="openshift-marketplace/marketplace-operator-79b997595-ff9dd" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.811143 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/70a421fb-8ee0-4365-b4d3-c8ddd6f6c01d-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-6qw48\" (UID: \"70a421fb-8ee0-4365-b4d3-c8ddd6f6c01d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6qw48" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.811576 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/19da63c0-7e43-4bb0-a8fb-590722ea7cf2-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-87l5s\" (UID: \"19da63c0-7e43-4bb0-a8fb-590722ea7cf2\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-87l5s" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.811600 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/12a1e676-da4c-46d2-a8f6-11dedde983fc-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-425qm\" (UID: \"12a1e676-da4c-46d2-a8f6-11dedde983fc\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-425qm" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.812126 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/90ec3f02-fbee-4465-b262-28b2b475e2b9-profile-collector-cert\") pod \"olm-operator-6b444d44fb-lqvwq\" (UID: \"90ec3f02-fbee-4465-b262-28b2b475e2b9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lqvwq" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.812221 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/14f80aee-1c1f-4beb-a280-3ac021e920c9-apiservice-cert\") pod \"packageserver-d55dfcdfc-cfslv\" (UID: \"14f80aee-1c1f-4beb-a280-3ac021e920c9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cfslv" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.816269 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvzks\" (UniqueName: \"kubernetes.io/projected/1e5ac136-d46c-45e3-9a5f-548ac22fac5c-kube-api-access-pvzks\") pod \"control-plane-machine-set-operator-78cbb6b69f-857hw\" (UID: \"1e5ac136-d46c-45e3-9a5f-548ac22fac5c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-857hw" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.818716 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/71dd00dd-f11c-43a8-b7a2-2416a1761d94-serving-cert\") pod \"service-ca-operator-777779d784-dfqb5\" (UID: \"71dd00dd-f11c-43a8-b7a2-2416a1761d94\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dfqb5" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.819148 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: 
\"kubernetes.io/secret/027ba59d-f4ba-430f-af60-a7f293dd2052-signing-key\") pod \"service-ca-9c57cc56f-ks6ps\" (UID: \"027ba59d-f4ba-430f-af60-a7f293dd2052\") " pod="openshift-service-ca/service-ca-9c57cc56f-ks6ps" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.821951 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/007a1e5a-0e90-44d1-b19d-e92154fb6a3d-secret-volume\") pod \"collect-profiles-29482230-pqppn\" (UID: \"007a1e5a-0e90-44d1-b19d-e92154fb6a3d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482230-pqppn" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.822906 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/22f987ee-958e-41a1-8cf4-ef0da8212364-profile-collector-cert\") pod \"catalog-operator-68c6474976-mrfcm\" (UID: \"22f987ee-958e-41a1-8cf4-ef0da8212364\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mrfcm" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.827706 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/22f987ee-958e-41a1-8cf4-ef0da8212364-srv-cert\") pod \"catalog-operator-68c6474976-mrfcm\" (UID: \"22f987ee-958e-41a1-8cf4-ef0da8212364\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mrfcm" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.828984 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/2e18f8ef-154a-4077-9f3c-a6979d9cbe0b-certs\") pod \"machine-config-server-cv7zc\" (UID: \"2e18f8ef-154a-4077-9f3c-a6979d9cbe0b\") " pod="openshift-machine-config-operator/machine-config-server-cv7zc" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.820235 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: 
\"kubernetes.io/empty-dir/14f80aee-1c1f-4beb-a280-3ac021e920c9-tmpfs\") pod \"packageserver-d55dfcdfc-cfslv\" (UID: \"14f80aee-1c1f-4beb-a280-3ac021e920c9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cfslv" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.834129 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/1e5ac136-d46c-45e3-9a5f-548ac22fac5c-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-857hw\" (UID: \"1e5ac136-d46c-45e3-9a5f-548ac22fac5c\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-857hw" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.834142 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nx698\" (UniqueName: \"kubernetes.io/projected/2e18f8ef-154a-4077-9f3c-a6979d9cbe0b-kube-api-access-nx698\") pod \"machine-config-server-cv7zc\" (UID: \"2e18f8ef-154a-4077-9f3c-a6979d9cbe0b\") " pod="openshift-machine-config-operator/machine-config-server-cv7zc" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.835651 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9cee99f1-8905-4089-be36-90af1426d834-proxy-tls\") pod \"machine-config-operator-74547568cd-snlnq\" (UID: \"9cee99f1-8905-4089-be36-90af1426d834\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-snlnq" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.836235 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wx9mw\" (UniqueName: \"kubernetes.io/projected/19da63c0-7e43-4bb0-a8fb-590722ea7cf2-kube-api-access-wx9mw\") pod \"kube-storage-version-migrator-operator-b67b599dd-87l5s\" (UID: \"19da63c0-7e43-4bb0-a8fb-590722ea7cf2\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-87l5s" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.843259 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2b703025-44fd-42d1-81fa-27ef31c9d2fb-cert\") pod \"ingress-canary-5t8h7\" (UID: \"2b703025-44fd-42d1-81fa-27ef31c9d2fb\") " pod="openshift-ingress-canary/ingress-canary-5t8h7" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.844966 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/70a421fb-8ee0-4365-b4d3-c8ddd6f6c01d-proxy-tls\") pod \"machine-config-controller-84d6567774-6qw48\" (UID: \"70a421fb-8ee0-4365-b4d3-c8ddd6f6c01d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6qw48" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.856200 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbbfh\" (UniqueName: \"kubernetes.io/projected/22f987ee-958e-41a1-8cf4-ef0da8212364-kube-api-access-gbbfh\") pod \"catalog-operator-68c6474976-mrfcm\" (UID: \"22f987ee-958e-41a1-8cf4-ef0da8212364\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mrfcm" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.856655 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/aab7784b-df99-4fd2-b2ea-3d2f6cdb098c-metrics-tls\") pod \"ingress-operator-5b745b69d9-llx65\" (UID: \"aab7784b-df99-4fd2-b2ea-3d2f6cdb098c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-llx65" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.859724 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4cpld\" (UniqueName: \"kubernetes.io/projected/90ec3f02-fbee-4465-b262-28b2b475e2b9-kube-api-access-4cpld\") pod 
\"olm-operator-6b444d44fb-lqvwq\" (UID: \"90ec3f02-fbee-4465-b262-28b2b475e2b9\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lqvwq" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.874844 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 18:32:29 crc kubenswrapper[4773]: E0120 18:32:29.876594 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 18:32:30.376572578 +0000 UTC m=+143.298385602 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.885876 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4r74z\" (UniqueName: \"kubernetes.io/projected/86f0fde2-da58-4350-ad67-cb29a2684875-kube-api-access-4r74z\") pod \"dns-default-7j8nw\" (UID: \"86f0fde2-da58-4350-ad67-cb29a2684875\") " pod="openshift-dns/dns-default-7j8nw" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.894940 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-x95ml" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.895162 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfrvq\" (UniqueName: \"kubernetes.io/projected/1181b3a9-8cf9-46ad-9b41-62d32ffe7a85-kube-api-access-kfrvq\") pod \"marketplace-operator-79b997595-ff9dd\" (UID: \"1181b3a9-8cf9-46ad-9b41-62d32ffe7a85\") " pod="openshift-marketplace/marketplace-operator-79b997595-ff9dd" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.930222 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-ff9dd" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.943326 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-87l5s" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.953065 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-857hw" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.956702 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvmf4\" (UniqueName: \"kubernetes.io/projected/aab7784b-df99-4fd2-b2ea-3d2f6cdb098c-kube-api-access-gvmf4\") pod \"ingress-operator-5b745b69d9-llx65\" (UID: \"aab7784b-df99-4fd2-b2ea-3d2f6cdb098c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-llx65" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.957211 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxslf\" (UniqueName: \"kubernetes.io/projected/007a1e5a-0e90-44d1-b19d-e92154fb6a3d-kube-api-access-zxslf\") pod \"collect-profiles-29482230-pqppn\" (UID: \"007a1e5a-0e90-44d1-b19d-e92154fb6a3d\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482230-pqppn" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.965347 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29482230-pqppn" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.967191 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-9nh6h"] Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.970186 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/aab7784b-df99-4fd2-b2ea-3d2f6cdb098c-bound-sa-token\") pod \"ingress-operator-5b745b69d9-llx65\" (UID: \"aab7784b-df99-4fd2-b2ea-3d2f6cdb098c\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-llx65" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.973153 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lqvwq" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.992865 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mrfcm" Jan 20 18:32:29 crc kubenswrapper[4773]: I0120 18:32:29.993250 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:32:29 crc kubenswrapper[4773]: E0120 18:32:29.993650 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 18:32:30.493636674 +0000 UTC m=+143.415449698 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kr4zh" (UID: "f751520b-bf3d-4226-8850-4b3346c43a6f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:30 crc kubenswrapper[4773]: I0120 18:32:30.043190 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-cv7zc" Jan 20 18:32:30 crc kubenswrapper[4773]: I0120 18:32:30.050324 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-llx65" Jan 20 18:32:30 crc kubenswrapper[4773]: I0120 18:32:30.054986 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpbxx\" (UniqueName: \"kubernetes.io/projected/deccf4fe-9230-4e96-b16c-a2ed0d2235a7-kube-api-access-kpbxx\") pod \"multus-admission-controller-857f4d67dd-x6fwb\" (UID: \"deccf4fe-9230-4e96-b16c-a2ed0d2235a7\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-x6fwb" Jan 20 18:32:30 crc kubenswrapper[4773]: I0120 18:32:30.054920 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fcmz\" (UniqueName: \"kubernetes.io/projected/70a421fb-8ee0-4365-b4d3-c8ddd6f6c01d-kube-api-access-2fcmz\") pod \"machine-config-controller-84d6567774-6qw48\" (UID: \"70a421fb-8ee0-4365-b4d3-c8ddd6f6c01d\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6qw48" Jan 20 18:32:30 crc kubenswrapper[4773]: I0120 18:32:30.094452 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 18:32:30 crc kubenswrapper[4773]: E0120 18:32:30.095049 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 18:32:30.595027922 +0000 UTC m=+143.516840946 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:30 crc kubenswrapper[4773]: I0120 18:32:30.109095 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4n8m\" (UniqueName: \"kubernetes.io/projected/71dd00dd-f11c-43a8-b7a2-2416a1761d94-kube-api-access-w4n8m\") pod \"service-ca-operator-777779d784-dfqb5\" (UID: \"71dd00dd-f11c-43a8-b7a2-2416a1761d94\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-dfqb5" Jan 20 18:32:30 crc kubenswrapper[4773]: I0120 18:32:30.124561 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-7j8nw" Jan 20 18:32:30 crc kubenswrapper[4773]: I0120 18:32:30.126669 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t86dq\" (UniqueName: \"kubernetes.io/projected/c5d6700e-54f1-4f09-83d7-e85f66af8c85-kube-api-access-t86dq\") pod \"csi-hostpathplugin-w8vpz\" (UID: \"c5d6700e-54f1-4f09-83d7-e85f66af8c85\") " pod="hostpath-provisioner/csi-hostpathplugin-w8vpz" Jan 20 18:32:30 crc kubenswrapper[4773]: I0120 18:32:30.129795 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfqbt\" (UniqueName: \"kubernetes.io/projected/027ba59d-f4ba-430f-af60-a7f293dd2052-kube-api-access-wfqbt\") pod \"service-ca-9c57cc56f-ks6ps\" (UID: \"027ba59d-f4ba-430f-af60-a7f293dd2052\") " pod="openshift-service-ca/service-ca-9c57cc56f-ks6ps" Jan 20 18:32:30 crc kubenswrapper[4773]: I0120 18:32:30.140908 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-gvzx2\" (UniqueName: \"kubernetes.io/projected/14f80aee-1c1f-4beb-a280-3ac021e920c9-kube-api-access-gvzx2\") pod \"packageserver-d55dfcdfc-cfslv\" (UID: \"14f80aee-1c1f-4beb-a280-3ac021e920c9\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cfslv" Jan 20 18:32:30 crc kubenswrapper[4773]: I0120 18:32:30.145129 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-56md4\" (UniqueName: \"kubernetes.io/projected/a27e80d9-dea2-4e87-90c8-1c69288cfa55-kube-api-access-56md4\") pod \"package-server-manager-789f6589d5-xwzh9\" (UID: \"a27e80d9-dea2-4e87-90c8-1c69288cfa55\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xwzh9" Jan 20 18:32:30 crc kubenswrapper[4773]: I0120 18:32:30.145532 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/12a1e676-da4c-46d2-a8f6-11dedde983fc-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-425qm\" (UID: \"12a1e676-da4c-46d2-a8f6-11dedde983fc\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-425qm" Jan 20 18:32:30 crc kubenswrapper[4773]: I0120 18:32:30.152619 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fc8nd\" (UniqueName: \"kubernetes.io/projected/9cee99f1-8905-4089-be36-90af1426d834-kube-api-access-fc8nd\") pod \"machine-config-operator-74547568cd-snlnq\" (UID: \"9cee99f1-8905-4089-be36-90af1426d834\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-snlnq" Jan 20 18:32:30 crc kubenswrapper[4773]: I0120 18:32:30.192061 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92kdg\" (UniqueName: \"kubernetes.io/projected/2b703025-44fd-42d1-81fa-27ef31c9d2fb-kube-api-access-92kdg\") pod \"ingress-canary-5t8h7\" (UID: \"2b703025-44fd-42d1-81fa-27ef31c9d2fb\") " 
pod="openshift-ingress-canary/ingress-canary-5t8h7" Jan 20 18:32:30 crc kubenswrapper[4773]: I0120 18:32:30.197714 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:32:30 crc kubenswrapper[4773]: E0120 18:32:30.198056 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 18:32:30.698042332 +0000 UTC m=+143.619855356 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kr4zh" (UID: "f751520b-bf3d-4226-8850-4b3346c43a6f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:30 crc kubenswrapper[4773]: I0120 18:32:30.226792 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-x6fwb" Jan 20 18:32:30 crc kubenswrapper[4773]: I0120 18:32:30.227249 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-snlnq" Jan 20 18:32:30 crc kubenswrapper[4773]: I0120 18:32:30.290218 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xwzh9" Jan 20 18:32:30 crc kubenswrapper[4773]: I0120 18:32:30.299643 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 18:32:30 crc kubenswrapper[4773]: E0120 18:32:30.299970 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 18:32:30.799953894 +0000 UTC m=+143.721766918 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:30 crc kubenswrapper[4773]: I0120 18:32:30.307439 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-425qm" Jan 20 18:32:30 crc kubenswrapper[4773]: I0120 18:32:30.317453 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cfslv" Jan 20 18:32:30 crc kubenswrapper[4773]: I0120 18:32:30.323618 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6qw48" Jan 20 18:32:30 crc kubenswrapper[4773]: I0120 18:32:30.328486 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-fcrzv"] Jan 20 18:32:30 crc kubenswrapper[4773]: I0120 18:32:30.329706 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-dfqb5" Jan 20 18:32:30 crc kubenswrapper[4773]: I0120 18:32:30.361220 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-ks6ps" Jan 20 18:32:30 crc kubenswrapper[4773]: I0120 18:32:30.362030 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-bkbfc"] Jan 20 18:32:30 crc kubenswrapper[4773]: I0120 18:32:30.400704 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:32:30 crc kubenswrapper[4773]: E0120 18:32:30.401045 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 18:32:30.901030816 +0000 UTC m=+143.822843840 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kr4zh" (UID: "f751520b-bf3d-4226-8850-4b3346c43a6f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:30 crc kubenswrapper[4773]: I0120 18:32:30.404493 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-w8vpz" Jan 20 18:32:30 crc kubenswrapper[4773]: I0120 18:32:30.415406 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-5t8h7" Jan 20 18:32:30 crc kubenswrapper[4773]: I0120 18:32:30.451532 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-xpsls" event={"ID":"cf25ec9b-96c5-4129-958f-35acbc34a20d","Type":"ContainerStarted","Data":"4f519175ce1269f87277348e7dbf3cb7cac77cd634d740bc906a3ed7230ae289"} Jan 20 18:32:30 crc kubenswrapper[4773]: I0120 18:32:30.459610 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-bhrll" event={"ID":"a99225b3-64c7-4b39-807c-c97faa919977","Type":"ContainerStarted","Data":"33ccc5f46ace640fd63928e13135a80b2c741299077595b13ba47c85c98cc041"} Jan 20 18:32:30 crc kubenswrapper[4773]: I0120 18:32:30.479201 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-t27wz" event={"ID":"330a8450-c400-425d-9a46-e868a02fca27","Type":"ContainerStarted","Data":"56724ec1814776d34f41e4bc888bb2f625416edf2dd917631378c1d20dace01d"} Jan 20 18:32:30 crc kubenswrapper[4773]: I0120 18:32:30.501996 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 18:32:30 crc kubenswrapper[4773]: I0120 18:32:30.503199 4773 csr.go:261] certificate signing request csr-kd9nr is approved, waiting to be issued Jan 20 18:32:30 crc kubenswrapper[4773]: E0120 18:32:30.504697 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 18:32:31.00467265 +0000 UTC m=+143.926485674 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:30 crc kubenswrapper[4773]: I0120 18:32:30.514683 4773 csr.go:257] certificate signing request csr-kd9nr is issued Jan 20 18:32:30 crc kubenswrapper[4773]: I0120 18:32:30.521276 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-xwc5v" event={"ID":"6bae1b17-1679-4be9-9717-66c5a80ad425","Type":"ContainerStarted","Data":"c18851e3c7438f7238fd35df3adacffca533ec4c2ab48865a2762a6813ac4b07"} Jan 20 18:32:30 crc kubenswrapper[4773]: I0120 18:32:30.521316 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-xwc5v" 
event={"ID":"6bae1b17-1679-4be9-9717-66c5a80ad425","Type":"ContainerStarted","Data":"d7718bdf724d16f5015d86421784af9c2854acd759c9fc1ebd71bab194622459"} Jan 20 18:32:30 crc kubenswrapper[4773]: I0120 18:32:30.521782 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-xwc5v" Jan 20 18:32:30 crc kubenswrapper[4773]: I0120 18:32:30.540477 4773 generic.go:334] "Generic (PLEG): container finished" podID="49deabd4-ebbe-4c07-bb79-105982db000a" containerID="baa5478ba640cda543e6ae852f7964fed77445e700440fdfc2fde3fb0f5449ae" exitCode=0 Jan 20 18:32:30 crc kubenswrapper[4773]: I0120 18:32:30.540551 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-7zslm" event={"ID":"49deabd4-ebbe-4c07-bb79-105982db000a","Type":"ContainerDied","Data":"baa5478ba640cda543e6ae852f7964fed77445e700440fdfc2fde3fb0f5449ae"} Jan 20 18:32:30 crc kubenswrapper[4773]: I0120 18:32:30.547546 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-x95ml" event={"ID":"00a9d467-1154-4eae-b1e5-19dfbb214a80","Type":"ContainerStarted","Data":"c0193781aec3b0e7b7da2b2e0e33d420b0bb0a76805b05f74d9b55ef584d52cb"} Jan 20 18:32:30 crc kubenswrapper[4773]: I0120 18:32:30.553095 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-4jhgc" event={"ID":"411d251b-6daa-4c45-9aeb-aa38def60a90","Type":"ContainerStarted","Data":"6ca7a9b933f7fd2bb8b345e043d89d1ac113240c60a3aa639ec42e057c1be12b"} Jan 20 18:32:30 crc kubenswrapper[4773]: I0120 18:32:30.558248 4773 patch_prober.go:28] interesting pod/console-operator-58897d9998-xwc5v container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.17:8443/readyz\": dial tcp 10.217.0.17:8443: connect: connection refused" start-of-body= Jan 20 18:32:30 crc kubenswrapper[4773]: 
I0120 18:32:30.558307 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-xwc5v" podUID="6bae1b17-1679-4be9-9717-66c5a80ad425" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.17:8443/readyz\": dial tcp 10.217.0.17:8443: connect: connection refused" Jan 20 18:32:30 crc kubenswrapper[4773]: I0120 18:32:30.559946 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-l82hk" event={"ID":"c5d6a7d8-1840-4f2c-9fee-694a671f28cd","Type":"ContainerStarted","Data":"e44f11bab386417c3b1877eca6ca0ea6e968d94a4f4dc89a6c32f3e76ff34c9d"} Jan 20 18:32:30 crc kubenswrapper[4773]: I0120 18:32:30.559976 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-l82hk" event={"ID":"c5d6a7d8-1840-4f2c-9fee-694a671f28cd","Type":"ContainerStarted","Data":"7a46a714094ecc49f930d8d4bd7c16a60cb470db4a63635254dca73bbbd15dc9"} Jan 20 18:32:30 crc kubenswrapper[4773]: I0120 18:32:30.560915 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-cv7zc" event={"ID":"2e18f8ef-154a-4077-9f3c-a6979d9cbe0b","Type":"ContainerStarted","Data":"d02e93dbd4aa6093aa2aaed871798afaa3fde7066996fc90ff89276acd6fc5df"} Jan 20 18:32:30 crc kubenswrapper[4773]: I0120 18:32:30.573461 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-9nh6h" event={"ID":"ba3736bb-3d36-4a0c-91fa-85f410849312","Type":"ContainerStarted","Data":"e7d6c34d2f903961f01ee5fdb975fd359e3c9fc20a9e900b7c3df54dd10fd2d7"} Jan 20 18:32:30 crc kubenswrapper[4773]: I0120 18:32:30.590698 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-bjtnp" Jan 20 18:32:30 crc kubenswrapper[4773]: I0120 18:32:30.616821 4773 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:32:30 crc kubenswrapper[4773]: E0120 18:32:30.622220 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 18:32:31.122206768 +0000 UTC m=+144.044019782 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kr4zh" (UID: "f751520b-bf3d-4226-8850-4b3346c43a6f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:30 crc kubenswrapper[4773]: I0120 18:32:30.721686 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 18:32:30 crc kubenswrapper[4773]: E0120 18:32:30.731440 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 18:32:31.231407399 +0000 UTC m=+144.153220423 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:30 crc kubenswrapper[4773]: I0120 18:32:30.793654 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-xwc5v" podStartSLOduration=124.793631383 podStartE2EDuration="2m4.793631383s" podCreationTimestamp="2026-01-20 18:30:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:32:30.756075827 +0000 UTC m=+143.677888851" watchObservedRunningTime="2026-01-20 18:32:30.793631383 +0000 UTC m=+143.715444407" Jan 20 18:32:30 crc kubenswrapper[4773]: I0120 18:32:30.796512 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-bhrll" podStartSLOduration=123.796495403 podStartE2EDuration="2m3.796495403s" podCreationTimestamp="2026-01-20 18:30:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:32:30.793407907 +0000 UTC m=+143.715220931" watchObservedRunningTime="2026-01-20 18:32:30.796495403 +0000 UTC m=+143.718308427" Jan 20 18:32:30 crc kubenswrapper[4773]: I0120 18:32:30.824869 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: 
\"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:32:30 crc kubenswrapper[4773]: E0120 18:32:30.825595 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 18:32:31.325470008 +0000 UTC m=+144.247283032 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kr4zh" (UID: "f751520b-bf3d-4226-8850-4b3346c43a6f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:30 crc kubenswrapper[4773]: I0120 18:32:30.905002 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-4jhgc" podStartSLOduration=124.904969688 podStartE2EDuration="2m4.904969688s" podCreationTimestamp="2026-01-20 18:30:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:32:30.892516281 +0000 UTC m=+143.814329305" watchObservedRunningTime="2026-01-20 18:32:30.904969688 +0000 UTC m=+143.826782732" Jan 20 18:32:30 crc kubenswrapper[4773]: I0120 18:32:30.905690 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-t27wz" podStartSLOduration=124.905680655 podStartE2EDuration="2m4.905680655s" podCreationTimestamp="2026-01-20 18:30:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-01-20 18:32:30.841423681 +0000 UTC m=+143.763236725" watchObservedRunningTime="2026-01-20 18:32:30.905680655 +0000 UTC m=+143.827493679" Jan 20 18:32:30 crc kubenswrapper[4773]: I0120 18:32:30.926185 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 18:32:30 crc kubenswrapper[4773]: E0120 18:32:30.927233 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 18:32:31.427214756 +0000 UTC m=+144.349027780 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:31 crc kubenswrapper[4773]: I0120 18:32:31.029673 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:32:31 crc kubenswrapper[4773]: E0120 18:32:31.030033 4773 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 18:32:31.53002097 +0000 UTC m=+144.451833994 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kr4zh" (UID: "f751520b-bf3d-4226-8850-4b3346c43a6f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:31 crc kubenswrapper[4773]: I0120 18:32:31.131244 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 18:32:31 crc kubenswrapper[4773]: E0120 18:32:31.131836 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 18:32:31.631813509 +0000 UTC m=+144.553626533 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:31 crc kubenswrapper[4773]: I0120 18:32:31.233892 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:32:31 crc kubenswrapper[4773]: E0120 18:32:31.234569 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 18:32:31.734550651 +0000 UTC m=+144.656363685 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kr4zh" (UID: "f751520b-bf3d-4226-8850-4b3346c43a6f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:31 crc kubenswrapper[4773]: I0120 18:32:31.335886 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 18:32:31 crc kubenswrapper[4773]: E0120 18:32:31.336319 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 18:32:31.83629999 +0000 UTC m=+144.758113014 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:31 crc kubenswrapper[4773]: I0120 18:32:31.438639 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:32:31 crc kubenswrapper[4773]: E0120 18:32:31.441878 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 18:32:31.941861661 +0000 UTC m=+144.863674685 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kr4zh" (UID: "f751520b-bf3d-4226-8850-4b3346c43a6f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:31 crc kubenswrapper[4773]: I0120 18:32:31.520043 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-01-20 18:27:30 +0000 UTC, rotation deadline is 2026-10-16 23:47:11.334024711 +0000 UTC Jan 20 18:32:31 crc kubenswrapper[4773]: I0120 18:32:31.520586 4773 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6461h14m39.813441539s for next certificate rotation Jan 20 18:32:31 crc kubenswrapper[4773]: I0120 18:32:31.540657 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 18:32:31 crc kubenswrapper[4773]: E0120 18:32:31.541161 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 18:32:32.041121588 +0000 UTC m=+144.962934622 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:31 crc kubenswrapper[4773]: I0120 18:32:31.638202 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-cv7zc" event={"ID":"2e18f8ef-154a-4077-9f3c-a6979d9cbe0b","Type":"ContainerStarted","Data":"9e6a94dcd69ec5878174d7da81e29ba694ed10cd4152d26082c28ea6eac57b95"} Jan 20 18:32:31 crc kubenswrapper[4773]: I0120 18:32:31.638435 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-x95ml" event={"ID":"00a9d467-1154-4eae-b1e5-19dfbb214a80","Type":"ContainerStarted","Data":"b93bb8376800a790e2bc5f26d8a6290473ff1ef1cd0d8d69a65c430c24fec5a9"} Jan 20 18:32:31 crc kubenswrapper[4773]: I0120 18:32:31.638524 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-xpsls" event={"ID":"cf25ec9b-96c5-4129-958f-35acbc34a20d","Type":"ContainerStarted","Data":"87cd7110ca4c65f7ea756d111fc7910caeaf8153a86ed7a2e3928b2f034f84bc"} Jan 20 18:32:31 crc kubenswrapper[4773]: I0120 18:32:31.638612 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-xpsls" Jan 20 18:32:31 crc kubenswrapper[4773]: I0120 18:32:31.642538 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: 
\"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:32:31 crc kubenswrapper[4773]: E0120 18:32:31.643099 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 18:32:32.143051551 +0000 UTC m=+145.064864805 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kr4zh" (UID: "f751520b-bf3d-4226-8850-4b3346c43a6f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:31 crc kubenswrapper[4773]: I0120 18:32:31.655539 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-l82hk" event={"ID":"c5d6a7d8-1840-4f2c-9fee-694a671f28cd","Type":"ContainerStarted","Data":"eb06841b967a9ee52b36b5866eaa4628fa486a3492924ae7f602afbad102b81c"} Jan 20 18:32:31 crc kubenswrapper[4773]: I0120 18:32:31.670069 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-bkbfc" event={"ID":"fec9cba4-b7cb-46ca-90a4-af0d5114fee8","Type":"ContainerStarted","Data":"92ed1cdedc72cff75793979290982d2a68870b42701fb4301823c687477e5622"} Jan 20 18:32:31 crc kubenswrapper[4773]: I0120 18:32:31.670146 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-bkbfc" event={"ID":"fec9cba4-b7cb-46ca-90a4-af0d5114fee8","Type":"ContainerStarted","Data":"b37f5ddb904bef473f681c1f2ad91b1594e544b7305690cf7e2a0afd8ded483b"} Jan 20 18:32:31 crc kubenswrapper[4773]: I0120 18:32:31.670688 4773 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-bkbfc" Jan 20 18:32:31 crc kubenswrapper[4773]: I0120 18:32:31.673770 4773 patch_prober.go:28] interesting pod/downloads-7954f5f757-bkbfc container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body= Jan 20 18:32:31 crc kubenswrapper[4773]: I0120 18:32:31.673860 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-bkbfc" podUID="fec9cba4-b7cb-46ca-90a4-af0d5114fee8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" Jan 20 18:32:31 crc kubenswrapper[4773]: I0120 18:32:31.675010 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-7zslm" event={"ID":"49deabd4-ebbe-4c07-bb79-105982db000a","Type":"ContainerStarted","Data":"16aa457ee6765bc06137ae2471c969825987135af2a1af7ccb6ac13745a1cc94"} Jan 20 18:32:31 crc kubenswrapper[4773]: I0120 18:32:31.681956 4773 generic.go:334] "Generic (PLEG): container finished" podID="275484db-b3bc-4027-a1d7-a67ab3c71439" containerID="14f73794a9b75752413743e6602360dfe037565dcc05bceef502d47eecf8267e" exitCode=0 Jan 20 18:32:31 crc kubenswrapper[4773]: I0120 18:32:31.682080 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fcrzv" event={"ID":"275484db-b3bc-4027-a1d7-a67ab3c71439","Type":"ContainerDied","Data":"14f73794a9b75752413743e6602360dfe037565dcc05bceef502d47eecf8267e"} Jan 20 18:32:31 crc kubenswrapper[4773]: I0120 18:32:31.682131 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fcrzv" 
event={"ID":"275484db-b3bc-4027-a1d7-a67ab3c71439","Type":"ContainerStarted","Data":"3c86239b4f8f8ff2b2d882db7a139f7059fdb529dde2ab6cd2d6fe31191f35b6"} Jan 20 18:32:31 crc kubenswrapper[4773]: I0120 18:32:31.692575 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-cv7zc" podStartSLOduration=6.692546791 podStartE2EDuration="6.692546791s" podCreationTimestamp="2026-01-20 18:32:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:32:31.662314726 +0000 UTC m=+144.584127750" watchObservedRunningTime="2026-01-20 18:32:31.692546791 +0000 UTC m=+144.614359815" Jan 20 18:32:31 crc kubenswrapper[4773]: I0120 18:32:31.694326 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-x95ml" podStartSLOduration=125.694320135 podStartE2EDuration="2m5.694320135s" podCreationTimestamp="2026-01-20 18:30:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:32:31.689548137 +0000 UTC m=+144.611361161" watchObservedRunningTime="2026-01-20 18:32:31.694320135 +0000 UTC m=+144.616133159" Jan 20 18:32:31 crc kubenswrapper[4773]: I0120 18:32:31.701430 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-9nh6h" event={"ID":"ba3736bb-3d36-4a0c-91fa-85f410849312","Type":"ContainerStarted","Data":"9a1c0fbbefd03d4c820bbc1556885265bb7e47ee9bb1d58ad84f3c53b88d3367"} Jan 20 18:32:31 crc kubenswrapper[4773]: I0120 18:32:31.723754 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-xpsls" podStartSLOduration=125.72373283 podStartE2EDuration="2m5.72373283s" podCreationTimestamp="2026-01-20 18:30:26 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:32:31.723532194 +0000 UTC m=+144.645345208" watchObservedRunningTime="2026-01-20 18:32:31.72373283 +0000 UTC m=+144.645545854" Jan 20 18:32:31 crc kubenswrapper[4773]: I0120 18:32:31.728743 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-xwc5v" Jan 20 18:32:31 crc kubenswrapper[4773]: I0120 18:32:31.750202 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 18:32:31 crc kubenswrapper[4773]: E0120 18:32:31.751828 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 18:32:32.251810012 +0000 UTC m=+145.173623036 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:31 crc kubenswrapper[4773]: I0120 18:32:31.783640 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-l82hk" podStartSLOduration=125.783616725 podStartE2EDuration="2m5.783616725s" podCreationTimestamp="2026-01-20 18:30:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:32:31.752463068 +0000 UTC m=+144.674276092" watchObservedRunningTime="2026-01-20 18:32:31.783616725 +0000 UTC m=+144.705429749" Jan 20 18:32:31 crc kubenswrapper[4773]: I0120 18:32:31.815749 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-9nh6h" podStartSLOduration=125.815720767 podStartE2EDuration="2m5.815720767s" podCreationTimestamp="2026-01-20 18:30:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:32:31.812413726 +0000 UTC m=+144.734226760" watchObservedRunningTime="2026-01-20 18:32:31.815720767 +0000 UTC m=+144.737533791" Jan 20 18:32:31 crc kubenswrapper[4773]: I0120 18:32:31.852224 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: 
\"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:32:31 crc kubenswrapper[4773]: E0120 18:32:31.857803 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 18:32:32.357787114 +0000 UTC m=+145.279600128 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kr4zh" (UID: "f751520b-bf3d-4226-8850-4b3346c43a6f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:31 crc kubenswrapper[4773]: I0120 18:32:31.879041 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-bkbfc" podStartSLOduration=125.878999726 podStartE2EDuration="2m5.878999726s" podCreationTimestamp="2026-01-20 18:30:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:32:31.87874857 +0000 UTC m=+144.800561614" watchObservedRunningTime="2026-01-20 18:32:31.878999726 +0000 UTC m=+144.800812760" Jan 20 18:32:31 crc kubenswrapper[4773]: I0120 18:32:31.896283 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-x95ml" Jan 20 18:32:31 crc kubenswrapper[4773]: I0120 18:32:31.904387 4773 patch_prober.go:28] interesting pod/router-default-5444994796-x95ml container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason 
withheld Jan 20 18:32:31 crc kubenswrapper[4773]: [-]has-synced failed: reason withheld Jan 20 18:32:31 crc kubenswrapper[4773]: [+]process-running ok Jan 20 18:32:31 crc kubenswrapper[4773]: healthz check failed Jan 20 18:32:31 crc kubenswrapper[4773]: I0120 18:32:31.904452 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-x95ml" podUID="00a9d467-1154-4eae-b1e5-19dfbb214a80" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 20 18:32:31 crc kubenswrapper[4773]: I0120 18:32:31.959699 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 18:32:31 crc kubenswrapper[4773]: E0120 18:32:31.960012 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 18:32:32.459921481 +0000 UTC m=+145.381734515 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:31 crc kubenswrapper[4773]: I0120 18:32:31.960375 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:32:31 crc kubenswrapper[4773]: E0120 18:32:31.968412 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 18:32:32.46840289 +0000 UTC m=+145.390215914 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kr4zh" (UID: "f751520b-bf3d-4226-8850-4b3346c43a6f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.044312 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-pfxs9"] Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.074486 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 18:32:32 crc kubenswrapper[4773]: E0120 18:32:32.074918 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 18:32:32.574899285 +0000 UTC m=+145.496712299 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.090115 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-c989h"] Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.115414 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-xpsls" Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.179619 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:32:32 crc kubenswrapper[4773]: E0120 18:32:32.180121 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 18:32:32.680103718 +0000 UTC m=+145.601916742 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kr4zh" (UID: "f751520b-bf3d-4226-8850-4b3346c43a6f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.223486 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-2wmrt"] Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.246177 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-n5dfl"] Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.258054 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kw7s6"] Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.280610 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 18:32:32 crc kubenswrapper[4773]: E0120 18:32:32.281658 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 18:32:32.781201581 +0000 UTC m=+145.703014615 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:32 crc kubenswrapper[4773]: W0120 18:32:32.310187 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod65b41ab6_6253_4ee5_87f2_50ed05610e03.slice/crio-304f02b250bfe29aa79e928ee9066f11d85b27135e56ddfe7a216680d17b5ebf WatchSource:0}: Error finding container 304f02b250bfe29aa79e928ee9066f11d85b27135e56ddfe7a216680d17b5ebf: Status 404 returned error can't find the container with id 304f02b250bfe29aa79e928ee9066f11d85b27135e56ddfe7a216680d17b5ebf Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.385002 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:32:32 crc kubenswrapper[4773]: E0120 18:32:32.389203 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 18:32:32.889185883 +0000 UTC m=+145.810998907 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kr4zh" (UID: "f751520b-bf3d-4226-8850-4b3346c43a6f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.393552 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kpq6z"] Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.422044 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-k2x4p"] Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.486871 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 18:32:32 crc kubenswrapper[4773]: E0120 18:32:32.535045 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 18:32:33.034958146 +0000 UTC m=+145.956771180 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.590942 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:32:32 crc kubenswrapper[4773]: E0120 18:32:32.591572 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 18:32:33.09155228 +0000 UTC m=+146.013365304 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kr4zh" (UID: "f751520b-bf3d-4226-8850-4b3346c43a6f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.627628 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lqvwq"] Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.637073 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29482230-pqppn"] Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.692006 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 18:32:32 crc kubenswrapper[4773]: E0120 18:32:32.692269 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 18:32:33.192235242 +0000 UTC m=+146.114048266 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.692812 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:32:32 crc kubenswrapper[4773]: E0120 18:32:32.693664 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 18:32:33.193645787 +0000 UTC m=+146.115458811 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kr4zh" (UID: "f751520b-bf3d-4226-8850-4b3346c43a6f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.716069 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6bf74"] Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.722095 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-2wmrt" event={"ID":"bbce412e-616a-465b-bb42-da842edb8110","Type":"ContainerStarted","Data":"6f68abfc206ae3f46958c0e3edf9d6d16057f9c152b2c78a2d813bdd3b2789a1"} Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.723991 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-857hw"] Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.726509 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k2x4p" event={"ID":"b3570207-5cb9-4481-a15a-d0bb9312a84b","Type":"ContainerStarted","Data":"37b89ef034f836300d6b35b391a2842edbfa58a86d9c055d071138a9e217debd"} Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.726778 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-v6wpc"] Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.733434 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-llx65"] Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.737209 4773 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-snlnq"] Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.743166 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xwzh9"] Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.746703 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-ff9dd"] Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.747503 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mrfcm"] Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.751950 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-7j8nw"] Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.752940 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-87l5s"] Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.756643 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-7zslm" event={"ID":"49deabd4-ebbe-4c07-bb79-105982db000a","Type":"ContainerStarted","Data":"e5688fa5e836d6693ad378718ac2d923e2bd4a94946bfe45ad6a19797eb22650"} Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.776782 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fcrzv" event={"ID":"275484db-b3bc-4027-a1d7-a67ab3c71439","Type":"ContainerStarted","Data":"eb7554d40f82e32096a8755820f9be85d890934dacb68219ab8675407e48ee64"} Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.777611 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fcrzv" Jan 20 18:32:32 crc 
kubenswrapper[4773]: I0120 18:32:32.784268 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-x6fwb"] Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.787116 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29482230-pqppn" event={"ID":"007a1e5a-0e90-44d1-b19d-e92154fb6a3d","Type":"ContainerStarted","Data":"8177b1b86470c47017cdac6a443fe56f399d9dcba9662f954722c87a2522aa29"} Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.788086 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-7zslm" podStartSLOduration=126.788057215 podStartE2EDuration="2m6.788057215s" podCreationTimestamp="2026-01-20 18:30:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:32:32.778068729 +0000 UTC m=+145.699881773" watchObservedRunningTime="2026-01-20 18:32:32.788057215 +0000 UTC m=+145.709870239" Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.788434 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kw7s6" event={"ID":"65b41ab6-6253-4ee5-87f2-50ed05610e03","Type":"ContainerStarted","Data":"5e59468bd2c4f19972215ce899890309328c39e745f90c227579a59631ddbcbd"} Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.788464 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kw7s6" event={"ID":"65b41ab6-6253-4ee5-87f2-50ed05610e03","Type":"ContainerStarted","Data":"304f02b250bfe29aa79e928ee9066f11d85b27135e56ddfe7a216680d17b5ebf"} Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.796230 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 18:32:32 crc kubenswrapper[4773]: E0120 18:32:32.796744 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 18:32:33.296721778 +0000 UTC m=+146.218534802 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.801418 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-pfxs9" event={"ID":"47548d0b-9447-4862-b717-9427ae40c49a","Type":"ContainerStarted","Data":"be1d4d724e34c401ca901337c0c53e9b8b1dee0ad78ae08cd1264117b04a97b2"} Jan 20 18:32:32 crc kubenswrapper[4773]: W0120 18:32:32.810219 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod63fd2de1_85c4_4f01_8524_7b93c777592d.slice/crio-9c8e71a967065ef6daa0d03cbf400d36bc7d5fab70aad0abeace3be9c9b2098f WatchSource:0}: Error finding container 9c8e71a967065ef6daa0d03cbf400d36bc7d5fab70aad0abeace3be9c9b2098f: Status 404 returned error can't find the container with id 9c8e71a967065ef6daa0d03cbf400d36bc7d5fab70aad0abeace3be9c9b2098f Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.810773 4773 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-n5dfl" event={"ID":"79162e32-ee8c-4fcc-8911-0f95d41cd110","Type":"ContainerStarted","Data":"6549422212c40d6874303175493a48597f52b1121344c7b0fe26e9f5f7c50976"} Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.810824 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-n5dfl" event={"ID":"79162e32-ee8c-4fcc-8911-0f95d41cd110","Type":"ContainerStarted","Data":"bef50359537c89305c7d074b823ddef915458135e59aeb4cfeff10c7cba90d87"} Jan 20 18:32:32 crc kubenswrapper[4773]: W0120 18:32:32.813653 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f7552ac_b3a0_4bfa_ab3e_34e46ed83cff.slice/crio-80c6f6362197a34454eb5f369b6dd69533008fb3315bcd58067019c8230615f8 WatchSource:0}: Error finding container 80c6f6362197a34454eb5f369b6dd69533008fb3315bcd58067019c8230615f8: Status 404 returned error can't find the container with id 80c6f6362197a34454eb5f369b6dd69533008fb3315bcd58067019c8230615f8 Jan 20 18:32:32 crc kubenswrapper[4773]: W0120 18:32:32.817655 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e5ac136_d46c_45e3_9a5f_548ac22fac5c.slice/crio-d0999a4bc74934e44f3c18e8a0d09b22d1db660adda3b76a8403087e384feb4f WatchSource:0}: Error finding container d0999a4bc74934e44f3c18e8a0d09b22d1db660adda3b76a8403087e384feb4f: Status 404 returned error can't find the container with id d0999a4bc74934e44f3c18e8a0d09b22d1db660adda3b76a8403087e384feb4f Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.817907 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kpq6z" 
event={"ID":"9d63bfb8-8ecc-43f3-8931-cc09c815c580","Type":"ContainerStarted","Data":"6e8ea7b775620d0f97028008ad9efc7b0d9cd7ffbc15b9ce92c518b5ca3c147a"} Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.818420 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fcrzv" podStartSLOduration=126.818400952 podStartE2EDuration="2m6.818400952s" podCreationTimestamp="2026-01-20 18:30:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:32:32.801901426 +0000 UTC m=+145.723714450" watchObservedRunningTime="2026-01-20 18:32:32.818400952 +0000 UTC m=+145.740213976" Jan 20 18:32:32 crc kubenswrapper[4773]: W0120 18:32:32.820871 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda27e80d9_dea2_4e87_90c8_1c69288cfa55.slice/crio-7203f4080ec4ae5054ef7c048a2acd45380056533569e36167c3d7a00eaafc0a WatchSource:0}: Error finding container 7203f4080ec4ae5054ef7c048a2acd45380056533569e36167c3d7a00eaafc0a: Status 404 returned error can't find the container with id 7203f4080ec4ae5054ef7c048a2acd45380056533569e36167c3d7a00eaafc0a Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.827462 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lqvwq" event={"ID":"90ec3f02-fbee-4465-b262-28b2b475e2b9","Type":"ContainerStarted","Data":"886a0cae110cb0fdcc1f14e44c311e7eca71d1c6f9627d542047ac3dbd222b51"} Jan 20 18:32:32 crc kubenswrapper[4773]: W0120 18:32:32.835455 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9cee99f1_8905_4089_be36_90af1426d834.slice/crio-da0e9e39eeeca799350f677d3f9dba02ca57690f88a2fe8880092d09cd983ce3 WatchSource:0}: Error finding container 
da0e9e39eeeca799350f677d3f9dba02ca57690f88a2fe8880092d09cd983ce3: Status 404 returned error can't find the container with id da0e9e39eeeca799350f677d3f9dba02ca57690f88a2fe8880092d09cd983ce3 Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.846987 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-kw7s6" podStartSLOduration=126.846824523 podStartE2EDuration="2m6.846824523s" podCreationTimestamp="2026-01-20 18:30:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:32:32.818917006 +0000 UTC m=+145.740730030" watchObservedRunningTime="2026-01-20 18:32:32.846824523 +0000 UTC m=+145.768637568" Jan 20 18:32:32 crc kubenswrapper[4773]: W0120 18:32:32.847374 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22f987ee_958e_41a1_8cf4_ef0da8212364.slice/crio-0a61633f2a6e70e0dc5f1bd66eb2baabd7435a330b62d6ea93cc6f35d6d2dd5a WatchSource:0}: Error finding container 0a61633f2a6e70e0dc5f1bd66eb2baabd7435a330b62d6ea93cc6f35d6d2dd5a: Status 404 returned error can't find the container with id 0a61633f2a6e70e0dc5f1bd66eb2baabd7435a330b62d6ea93cc6f35d6d2dd5a Jan 20 18:32:32 crc kubenswrapper[4773]: W0120 18:32:32.847902 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86f0fde2_da58_4350_ad67_cb29a2684875.slice/crio-7a30ddafef758f61a39d679f0a9640190d7b502c5948ea382fb9d6b1f5dc8ab0 WatchSource:0}: Error finding container 7a30ddafef758f61a39d679f0a9640190d7b502c5948ea382fb9d6b1f5dc8ab0: Status 404 returned error can't find the container with id 7a30ddafef758f61a39d679f0a9640190d7b502c5948ea382fb9d6b1f5dc8ab0 Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.848469 4773 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-n5dfl" podStartSLOduration=126.848461614 podStartE2EDuration="2m6.848461614s" podCreationTimestamp="2026-01-20 18:30:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:32:32.845670355 +0000 UTC m=+145.767483379" watchObservedRunningTime="2026-01-20 18:32:32.848461614 +0000 UTC m=+145.770274648" Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.856083 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-c989h" event={"ID":"e98bf97b-784a-4a99-8eff-20e6fc687876","Type":"ContainerStarted","Data":"13f6dc0568832cade7ae3139c8359ec8edc3489647d922a3ad2b60d581bea75f"} Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.856187 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-c989h" event={"ID":"e98bf97b-784a-4a99-8eff-20e6fc687876","Type":"ContainerStarted","Data":"aa10bed1ca3658e404efc58b50b8aff8c7ffe8e7a9f8da7cf9295153cbc28cfc"} Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.856223 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-c989h" event={"ID":"e98bf97b-784a-4a99-8eff-20e6fc687876","Type":"ContainerStarted","Data":"c9fc379b8739fb28bddca6604cd4b3e92eb8915c5ef4df88ed28b9d99e73f96f"} Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.857372 4773 patch_prober.go:28] interesting pod/downloads-7954f5f757-bkbfc container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body= Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.857416 4773 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-console/downloads-7954f5f757-bkbfc" podUID="fec9cba4-b7cb-46ca-90a4-af0d5114fee8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.887820 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kpq6z" podStartSLOduration=126.887798483 podStartE2EDuration="2m6.887798483s" podCreationTimestamp="2026-01-20 18:30:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:32:32.866055497 +0000 UTC m=+145.787868521" watchObservedRunningTime="2026-01-20 18:32:32.887798483 +0000 UTC m=+145.809611507" Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.894056 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-ks6ps"] Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.898540 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.900308 4773 patch_prober.go:28] interesting pod/router-default-5444994796-x95ml container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 20 18:32:32 crc kubenswrapper[4773]: [-]has-synced failed: reason withheld Jan 20 18:32:32 crc kubenswrapper[4773]: [+]process-running ok Jan 20 18:32:32 crc kubenswrapper[4773]: healthz check failed 
Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.900361 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-x95ml" podUID="00a9d467-1154-4eae-b1e5-19dfbb214a80" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.901637 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-w8vpz"] Jan 20 18:32:32 crc kubenswrapper[4773]: E0120 18:32:32.902760 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 18:32:33.402742182 +0000 UTC m=+146.324555416 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kr4zh" (UID: "f751520b-bf3d-4226-8850-4b3346c43a6f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.903389 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-c989h" podStartSLOduration=125.903371187 podStartE2EDuration="2m5.903371187s" podCreationTimestamp="2026-01-20 18:30:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:32:32.901919521 +0000 UTC m=+145.823732545" watchObservedRunningTime="2026-01-20 18:32:32.903371187 +0000 UTC m=+145.825184201" Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.923918 4773 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-dfqb5"] Jan 20 18:32:32 crc kubenswrapper[4773]: W0120 18:32:32.926348 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc5d6700e_54f1_4f09_83d7_e85f66af8c85.slice/crio-6ea0f8ae46712f4ca5b3eb60190c623898879b925b7d978e7f129afc7cd8d92c WatchSource:0}: Error finding container 6ea0f8ae46712f4ca5b3eb60190c623898879b925b7d978e7f129afc7cd8d92c: Status 404 returned error can't find the container with id 6ea0f8ae46712f4ca5b3eb60190c623898879b925b7d978e7f129afc7cd8d92c Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.962827 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cfslv"] Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.966948 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-425qm"] Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.974880 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-5t8h7"] Jan 20 18:32:32 crc kubenswrapper[4773]: I0120 18:32:32.977782 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-6qw48"] Jan 20 18:32:33 crc kubenswrapper[4773]: I0120 18:32:33.003200 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 18:32:33 crc kubenswrapper[4773]: E0120 18:32:33.004455 4773 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 18:32:33.504428318 +0000 UTC m=+146.426241342 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:33 crc kubenswrapper[4773]: W0120 18:32:33.049490 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71dd00dd_f11c_43a8_b7a2_2416a1761d94.slice/crio-5a4b941418d4e64cbf1c104f12844ffbb16ec64d1fd0307c47d5bdcea3888ae8 WatchSource:0}: Error finding container 5a4b941418d4e64cbf1c104f12844ffbb16ec64d1fd0307c47d5bdcea3888ae8: Status 404 returned error can't find the container with id 5a4b941418d4e64cbf1c104f12844ffbb16ec64d1fd0307c47d5bdcea3888ae8 Jan 20 18:32:33 crc kubenswrapper[4773]: W0120 18:32:33.067416 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b703025_44fd_42d1_81fa_27ef31c9d2fb.slice/crio-0c8fcbeae6aee8c8c710131894f95c20a76a1fc9c89b7048c6ad7ad38f080abd WatchSource:0}: Error finding container 0c8fcbeae6aee8c8c710131894f95c20a76a1fc9c89b7048c6ad7ad38f080abd: Status 404 returned error can't find the container with id 0c8fcbeae6aee8c8c710131894f95c20a76a1fc9c89b7048c6ad7ad38f080abd Jan 20 18:32:33 crc kubenswrapper[4773]: I0120 18:32:33.105616 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:32:33 crc kubenswrapper[4773]: E0120 18:32:33.106217 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 18:32:33.606195907 +0000 UTC m=+146.528008931 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kr4zh" (UID: "f751520b-bf3d-4226-8850-4b3346c43a6f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:33 crc kubenswrapper[4773]: I0120 18:32:33.207251 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 18:32:33 crc kubenswrapper[4773]: E0120 18:32:33.207875 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 18:32:33.70779144 +0000 UTC m=+146.629604464 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:33 crc kubenswrapper[4773]: I0120 18:32:33.209520 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:32:33 crc kubenswrapper[4773]: E0120 18:32:33.210034 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 18:32:33.710016306 +0000 UTC m=+146.631829330 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kr4zh" (UID: "f751520b-bf3d-4226-8850-4b3346c43a6f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:33 crc kubenswrapper[4773]: I0120 18:32:33.310188 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 18:32:33 crc kubenswrapper[4773]: E0120 18:32:33.310591 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 18:32:33.810571405 +0000 UTC m=+146.732384429 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:33 crc kubenswrapper[4773]: I0120 18:32:33.413222 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:32:33 crc kubenswrapper[4773]: E0120 18:32:33.413561 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 18:32:33.913546782 +0000 UTC m=+146.835359806 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kr4zh" (UID: "f751520b-bf3d-4226-8850-4b3346c43a6f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:33 crc kubenswrapper[4773]: I0120 18:32:33.516601 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 18:32:33 crc kubenswrapper[4773]: E0120 18:32:33.516811 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 18:32:34.016782377 +0000 UTC m=+146.938595391 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:33 crc kubenswrapper[4773]: I0120 18:32:33.517390 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:32:33 crc kubenswrapper[4773]: E0120 18:32:33.518997 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 18:32:34.018977192 +0000 UTC m=+146.940790216 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kr4zh" (UID: "f751520b-bf3d-4226-8850-4b3346c43a6f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:33 crc kubenswrapper[4773]: I0120 18:32:33.620904 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 18:32:33 crc kubenswrapper[4773]: E0120 18:32:33.621329 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 18:32:34.121310704 +0000 UTC m=+147.043123728 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:33 crc kubenswrapper[4773]: I0120 18:32:33.731327 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:32:33 crc kubenswrapper[4773]: E0120 18:32:33.731715 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 18:32:34.231696765 +0000 UTC m=+147.153509789 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kr4zh" (UID: "f751520b-bf3d-4226-8850-4b3346c43a6f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:33 crc kubenswrapper[4773]: I0120 18:32:33.832947 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 18:32:33 crc kubenswrapper[4773]: E0120 18:32:33.833349 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 18:32:34.33332875 +0000 UTC m=+147.255141774 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:33 crc kubenswrapper[4773]: I0120 18:32:33.883762 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29482230-pqppn" event={"ID":"007a1e5a-0e90-44d1-b19d-e92154fb6a3d","Type":"ContainerStarted","Data":"b136dfb1f23b27ec2f873c0b6f216725c1fb39f8486e5a0ea6aa300a7bc89cf5"} Jan 20 18:32:33 crc kubenswrapper[4773]: I0120 18:32:33.904072 4773 patch_prober.go:28] interesting pod/router-default-5444994796-x95ml container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 20 18:32:33 crc kubenswrapper[4773]: [-]has-synced failed: reason withheld Jan 20 18:32:33 crc kubenswrapper[4773]: [+]process-running ok Jan 20 18:32:33 crc kubenswrapper[4773]: healthz check failed Jan 20 18:32:33 crc kubenswrapper[4773]: I0120 18:32:33.904142 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-x95ml" podUID="00a9d467-1154-4eae-b1e5-19dfbb214a80" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 20 18:32:33 crc kubenswrapper[4773]: I0120 18:32:33.907990 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-87l5s" 
event={"ID":"19da63c0-7e43-4bb0-a8fb-590722ea7cf2","Type":"ContainerStarted","Data":"7417bf00db4e4f95d66c7d334c5b590c0e25dfeb28f418d4ac9d15eea500769d"} Jan 20 18:32:33 crc kubenswrapper[4773]: I0120 18:32:33.908773 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-87l5s" event={"ID":"19da63c0-7e43-4bb0-a8fb-590722ea7cf2","Type":"ContainerStarted","Data":"13f2bc32e5beb9b77bdf1af9461f914ea5370008a6fb39fff9aacd066ab3cb85"} Jan 20 18:32:33 crc kubenswrapper[4773]: I0120 18:32:33.934881 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:32:33 crc kubenswrapper[4773]: E0120 18:32:33.935288 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 18:32:34.435272893 +0000 UTC m=+147.357085917 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kr4zh" (UID: "f751520b-bf3d-4226-8850-4b3346c43a6f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:33 crc kubenswrapper[4773]: I0120 18:32:33.951310 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-llx65" event={"ID":"aab7784b-df99-4fd2-b2ea-3d2f6cdb098c","Type":"ContainerStarted","Data":"ecc3b1f32a305d38b9fa2b15e94a8361bade9cbfbe54907a33fc42929d5a22b5"} Jan 20 18:32:33 crc kubenswrapper[4773]: I0120 18:32:33.955572 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mrfcm" event={"ID":"22f987ee-958e-41a1-8cf4-ef0da8212364","Type":"ContainerStarted","Data":"0a61633f2a6e70e0dc5f1bd66eb2baabd7435a330b62d6ea93cc6f35d6d2dd5a"} Jan 20 18:32:33 crc kubenswrapper[4773]: I0120 18:32:33.958704 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29482230-pqppn" podStartSLOduration=127.95867562 podStartE2EDuration="2m7.95867562s" podCreationTimestamp="2026-01-20 18:30:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:32:33.911519947 +0000 UTC m=+146.833332981" watchObservedRunningTime="2026-01-20 18:32:33.95867562 +0000 UTC m=+146.880488644" Jan 20 18:32:33 crc kubenswrapper[4773]: I0120 18:32:33.961470 4773 generic.go:334] "Generic (PLEG): container finished" podID="b3570207-5cb9-4481-a15a-d0bb9312a84b" containerID="acc62163aa86828a85c83f6cde38f155ecb7db429189e69c347af4d13ac3334c" exitCode=0 Jan 
20 18:32:33 crc kubenswrapper[4773]: I0120 18:32:33.961841 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k2x4p" event={"ID":"b3570207-5cb9-4481-a15a-d0bb9312a84b","Type":"ContainerDied","Data":"acc62163aa86828a85c83f6cde38f155ecb7db429189e69c347af4d13ac3334c"} Jan 20 18:32:33 crc kubenswrapper[4773]: I0120 18:32:33.974223 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-5t8h7" event={"ID":"2b703025-44fd-42d1-81fa-27ef31c9d2fb","Type":"ContainerStarted","Data":"0c8fcbeae6aee8c8c710131894f95c20a76a1fc9c89b7048c6ad7ad38f080abd"} Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:33.998252 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xwzh9" event={"ID":"a27e80d9-dea2-4e87-90c8-1c69288cfa55","Type":"ContainerStarted","Data":"4560d5b5bd62f44e40075e15bc1aca9675032e7a34a4262a64615248a77a4c0e"} Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:33.998315 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xwzh9" event={"ID":"a27e80d9-dea2-4e87-90c8-1c69288cfa55","Type":"ContainerStarted","Data":"7203f4080ec4ae5054ef7c048a2acd45380056533569e36167c3d7a00eaafc0a"} Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.026201 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-87l5s" podStartSLOduration=127.026181294 podStartE2EDuration="2m7.026181294s" podCreationTimestamp="2026-01-20 18:30:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:32:33.961196533 +0000 UTC m=+146.883009557" watchObservedRunningTime="2026-01-20 18:32:34.026181294 +0000 UTC m=+146.947994318" Jan 20 
18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.031791 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-pfxs9" event={"ID":"47548d0b-9447-4862-b717-9427ae40c49a","Type":"ContainerStarted","Data":"d3711cb68365d51583974da9debc837a3f2d76cc1d5e41a6590b6dd5f3ca9fcd"} Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.031841 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-pfxs9" event={"ID":"47548d0b-9447-4862-b717-9427ae40c49a","Type":"ContainerStarted","Data":"9d375a9b6f2d15244e273f1939e83a40f6446af9f37f4c05cede485027c4a22a"} Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.035735 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 18:32:34 crc kubenswrapper[4773]: E0120 18:32:34.038415 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 18:32:34.538391975 +0000 UTC m=+147.460204999 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.060650 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-w8vpz" event={"ID":"c5d6700e-54f1-4f09-83d7-e85f66af8c85","Type":"ContainerStarted","Data":"6ea0f8ae46712f4ca5b3eb60190c623898879b925b7d978e7f129afc7cd8d92c"} Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.081310 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-pfxs9" podStartSLOduration=128.081291242 podStartE2EDuration="2m8.081291242s" podCreationTimestamp="2026-01-20 18:30:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:32:34.08118353 +0000 UTC m=+147.002996554" watchObservedRunningTime="2026-01-20 18:32:34.081291242 +0000 UTC m=+147.003104266" Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.111839 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-x6fwb" event={"ID":"deccf4fe-9230-4e96-b16c-a2ed0d2235a7","Type":"ContainerStarted","Data":"fbbc916eabc58535725d8bb371bf1acc9123d4e5db6dbab70a5320c438b7a02e"} Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.140403 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:32:34 crc kubenswrapper[4773]: E0120 18:32:34.140756 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 18:32:34.640740658 +0000 UTC m=+147.562553692 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kr4zh" (UID: "f751520b-bf3d-4226-8850-4b3346c43a6f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.161303 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6qw48" event={"ID":"70a421fb-8ee0-4365-b4d3-c8ddd6f6c01d","Type":"ContainerStarted","Data":"3d689f35c459df7616e7c28634c131829324f5334f5fd88df6ba44ccf51bd89b"} Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.161367 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6qw48" event={"ID":"70a421fb-8ee0-4365-b4d3-c8ddd6f6c01d","Type":"ContainerStarted","Data":"f2e93ed028afe2ff83347e9ba0da520844ee9c4df8c7f334a9a10ce40fff9779"} Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.205819 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cfslv" 
event={"ID":"14f80aee-1c1f-4beb-a280-3ac021e920c9","Type":"ContainerStarted","Data":"6a02e35c63a3c3855177ab1c0f0237b22cd314d673e1879ed7f5bdc7a6515d44"} Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.229331 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-857hw" event={"ID":"1e5ac136-d46c-45e3-9a5f-548ac22fac5c","Type":"ContainerStarted","Data":"c901060e5d78a348e16c6ce46f203d3c42ba5fe26d5799a1170a9b4307b0a0a5"} Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.229390 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-857hw" event={"ID":"1e5ac136-d46c-45e3-9a5f-548ac22fac5c","Type":"ContainerStarted","Data":"d0999a4bc74934e44f3c18e8a0d09b22d1db660adda3b76a8403087e384feb4f"} Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.241324 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.241577 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:32:34 crc kubenswrapper[4773]: E0120 18:32:34.242179 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-01-20 18:32:34.742139717 +0000 UTC m=+147.663952741 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.252846 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-857hw" podStartSLOduration=127.252826751 podStartE2EDuration="2m7.252826751s" podCreationTimestamp="2026-01-20 18:30:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:32:34.252402221 +0000 UTC m=+147.174215245" watchObservedRunningTime="2026-01-20 18:32:34.252826751 +0000 UTC m=+147.174639775" Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.262987 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.265734 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-ff9dd" event={"ID":"1181b3a9-8cf9-46ad-9b41-62d32ffe7a85","Type":"ContainerStarted","Data":"f4d91eb42c30324decc0123b0752b77625e1bfc343e356223cf0e111b47451d8"} Jan 20 18:32:34 crc kubenswrapper[4773]: 
I0120 18:32:34.339254 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-v6wpc" event={"ID":"9f7552ac-b3a0-4bfa-ab3e-34e46ed83cff","Type":"ContainerStarted","Data":"80c6f6362197a34454eb5f369b6dd69533008fb3315bcd58067019c8230615f8"} Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.347160 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.347235 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.347952 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.347989 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " 
pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:32:34 crc kubenswrapper[4773]: E0120 18:32:34.350590 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 18:32:34.8505688 +0000 UTC m=+147.772381824 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kr4zh" (UID: "f751520b-bf3d-4226-8850-4b3346c43a6f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.351383 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.363165 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.380724 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-v6wpc" podStartSLOduration=128.380705213 
podStartE2EDuration="2m8.380705213s" podCreationTimestamp="2026-01-20 18:30:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:32:34.380478757 +0000 UTC m=+147.302291791" watchObservedRunningTime="2026-01-20 18:32:34.380705213 +0000 UTC m=+147.302518237" Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.386269 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.399346 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-snlnq" event={"ID":"9cee99f1-8905-4089-be36-90af1426d834","Type":"ContainerStarted","Data":"013f38fc18d5f7cfc5471d629627ae4858db35adfd2628717ed1fbec88115930"} Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.399397 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-snlnq" event={"ID":"9cee99f1-8905-4089-be36-90af1426d834","Type":"ContainerStarted","Data":"da0e9e39eeeca799350f677d3f9dba02ca57690f88a2fe8880092d09cd983ce3"} Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.425989 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-snlnq" podStartSLOduration=127.425967248 podStartE2EDuration="2m7.425967248s" podCreationTimestamp="2026-01-20 18:30:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:32:34.424544563 +0000 UTC m=+147.346357587" watchObservedRunningTime="2026-01-20 18:32:34.425967248 +0000 UTC m=+147.347780272" Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.435252 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lqvwq" event={"ID":"90ec3f02-fbee-4465-b262-28b2b475e2b9","Type":"ContainerStarted","Data":"ec69934220ed5dec49af1bb5081abe0d905cbf101311d24364907de23320b3ec"} Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.437625 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lqvwq" Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.440950 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-kpq6z" event={"ID":"9d63bfb8-8ecc-43f3-8931-cc09c815c580","Type":"ContainerStarted","Data":"5c46838bf621378464901a678b09adcd44bc559135c828da641d2d0f7915bd1f"} Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.450538 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lqvwq" Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.455442 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:32:34 crc kubenswrapper[4773]: E0120 18:32:34.455551 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 18:32:34.955529378 +0000 UTC m=+147.877342402 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.455460 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.455980 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:32:34 crc kubenswrapper[4773]: E0120 18:32:34.457663 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 18:32:34.957648509 +0000 UTC m=+147.879461533 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kr4zh" (UID: "f751520b-bf3d-4226-8850-4b3346c43a6f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.458772 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.473052 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lqvwq" podStartSLOduration=127.473027518 podStartE2EDuration="2m7.473027518s" podCreationTimestamp="2026-01-20 18:30:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:32:34.468731353 +0000 UTC m=+147.390544397" watchObservedRunningTime="2026-01-20 18:32:34.473027518 +0000 UTC m=+147.394840532" Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.474873 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-2wmrt" event={"ID":"bbce412e-616a-465b-bb42-da842edb8110","Type":"ContainerStarted","Data":"9e9fb02fd7d190cfc9b3fc46ab9c3917f121fef946756245ad1e1fd86d322de3"} Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.514646 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-dfqb5" event={"ID":"71dd00dd-f11c-43a8-b7a2-2416a1761d94","Type":"ContainerStarted","Data":"5a4b941418d4e64cbf1c104f12844ffbb16ec64d1fd0307c47d5bdcea3888ae8"} Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 
18:32:34.554428 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-7j8nw" event={"ID":"86f0fde2-da58-4350-ad67-cb29a2684875","Type":"ContainerStarted","Data":"d5665263fb63f8073fee27a66d95411966848e896fb3af208c72940be2d28d3d"} Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.554855 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-7j8nw" event={"ID":"86f0fde2-da58-4350-ad67-cb29a2684875","Type":"ContainerStarted","Data":"7a30ddafef758f61a39d679f0a9640190d7b502c5948ea382fb9d6b1f5dc8ab0"} Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.558679 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 18:32:34 crc kubenswrapper[4773]: E0120 18:32:34.559082 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 18:32:35.059043569 +0000 UTC m=+147.980856583 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.559395 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:32:34 crc kubenswrapper[4773]: E0120 18:32:34.563828 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 18:32:35.063819067 +0000 UTC m=+147.985632091 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kr4zh" (UID: "f751520b-bf3d-4226-8850-4b3346c43a6f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.590466 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6bf74" event={"ID":"63fd2de1-85c4-4f01-8524-7b93c777592d","Type":"ContainerStarted","Data":"9c8e71a967065ef6daa0d03cbf400d36bc7d5fab70aad0abeace3be9c9b2098f"} Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.596228 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-ks6ps" event={"ID":"027ba59d-f4ba-430f-af60-a7f293dd2052","Type":"ContainerStarted","Data":"ffbfbf1e74220cef3503469c917219539f335187a85eab929928a5894d7d61b9"} Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.612781 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-2wmrt" podStartSLOduration=128.612760563 podStartE2EDuration="2m8.612760563s" podCreationTimestamp="2026-01-20 18:30:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:32:34.55826452 +0000 UTC m=+147.480077544" watchObservedRunningTime="2026-01-20 18:32:34.612760563 +0000 UTC m=+147.534573587" Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.639633 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-425qm" 
event={"ID":"12a1e676-da4c-46d2-a8f6-11dedde983fc","Type":"ContainerStarted","Data":"6fb49a54bbe3a1f2afde080ae3bccd7e3d22704d369a1b735706e91e7a522ef8"} Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.651968 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fcrzv" Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.657877 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-dfqb5" podStartSLOduration=127.657841304 podStartE2EDuration="2m7.657841304s" podCreationTimestamp="2026-01-20 18:30:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:32:34.61709099 +0000 UTC m=+147.538904014" watchObservedRunningTime="2026-01-20 18:32:34.657841304 +0000 UTC m=+147.579654328" Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.659349 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-ks6ps" podStartSLOduration=127.659343561 podStartE2EDuration="2m7.659343561s" podCreationTimestamp="2026-01-20 18:30:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:32:34.658764087 +0000 UTC m=+147.580577111" watchObservedRunningTime="2026-01-20 18:32:34.659343561 +0000 UTC m=+147.581156585" Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.660722 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 18:32:34 crc kubenswrapper[4773]: E0120 
18:32:34.662101 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 18:32:35.162086859 +0000 UTC m=+148.083899883 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.686215 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.722694 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6bf74" podStartSLOduration=128.722652831 podStartE2EDuration="2m8.722652831s" podCreationTimestamp="2026-01-20 18:30:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:32:34.719730579 +0000 UTC m=+147.641543613" watchObservedRunningTime="2026-01-20 18:32:34.722652831 +0000 UTC m=+147.644465865" Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.766977 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: 
\"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:32:34 crc kubenswrapper[4773]: E0120 18:32:34.769052 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 18:32:35.269038826 +0000 UTC m=+148.190851850 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kr4zh" (UID: "f751520b-bf3d-4226-8850-4b3346c43a6f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.786988 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-425qm" podStartSLOduration=128.786969487 podStartE2EDuration="2m8.786969487s" podCreationTimestamp="2026-01-20 18:30:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:32:34.782609059 +0000 UTC m=+147.704422083" watchObservedRunningTime="2026-01-20 18:32:34.786969487 +0000 UTC m=+147.708782511" Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.877550 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 18:32:34 crc kubenswrapper[4773]: E0120 
18:32:34.878196 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 18:32:35.378163285 +0000 UTC m=+148.299976309 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.878290 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:32:34 crc kubenswrapper[4773]: E0120 18:32:34.878990 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 18:32:35.378970645 +0000 UTC m=+148.300783669 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kr4zh" (UID: "f751520b-bf3d-4226-8850-4b3346c43a6f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.905654 4773 patch_prober.go:28] interesting pod/router-default-5444994796-x95ml container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 20 18:32:34 crc kubenswrapper[4773]: [-]has-synced failed: reason withheld Jan 20 18:32:34 crc kubenswrapper[4773]: [+]process-running ok Jan 20 18:32:34 crc kubenswrapper[4773]: healthz check failed Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.905720 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-x95ml" podUID="00a9d467-1154-4eae-b1e5-19dfbb214a80" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.980680 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 18:32:34 crc kubenswrapper[4773]: E0120 18:32:34.980986 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-01-20 18:32:35.480949759 +0000 UTC m=+148.402762783 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:34 crc kubenswrapper[4773]: I0120 18:32:34.981475 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:32:34 crc kubenswrapper[4773]: E0120 18:32:34.981921 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 18:32:35.481905923 +0000 UTC m=+148.403718947 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kr4zh" (UID: "f751520b-bf3d-4226-8850-4b3346c43a6f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:35 crc kubenswrapper[4773]: I0120 18:32:35.088462 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 18:32:35 crc kubenswrapper[4773]: E0120 18:32:35.088974 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 18:32:35.588954921 +0000 UTC m=+148.510767945 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:35 crc kubenswrapper[4773]: W0120 18:32:35.174318 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-6728e5fd4c4fec85a7832eccc65a31fd28432cfc0411ad3e2574d53e5e93047a WatchSource:0}: Error finding container 6728e5fd4c4fec85a7832eccc65a31fd28432cfc0411ad3e2574d53e5e93047a: Status 404 returned error can't find the container with id 6728e5fd4c4fec85a7832eccc65a31fd28432cfc0411ad3e2574d53e5e93047a Jan 20 18:32:35 crc kubenswrapper[4773]: I0120 18:32:35.191731 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:32:35 crc kubenswrapper[4773]: E0120 18:32:35.192101 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 18:32:35.692085003 +0000 UTC m=+148.613898027 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kr4zh" (UID: "f751520b-bf3d-4226-8850-4b3346c43a6f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:35 crc kubenswrapper[4773]: I0120 18:32:35.292523 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 18:32:35 crc kubenswrapper[4773]: E0120 18:32:35.292813 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 18:32:35.792795816 +0000 UTC m=+148.714608840 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:35 crc kubenswrapper[4773]: I0120 18:32:35.397638 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:32:35 crc kubenswrapper[4773]: E0120 18:32:35.398143 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 18:32:35.898123412 +0000 UTC m=+148.819936436 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kr4zh" (UID: "f751520b-bf3d-4226-8850-4b3346c43a6f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:35 crc kubenswrapper[4773]: I0120 18:32:35.501291 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 18:32:35 crc kubenswrapper[4773]: E0120 18:32:35.501979 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 18:32:36.001960081 +0000 UTC m=+148.923773105 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:35 crc kubenswrapper[4773]: I0120 18:32:35.603174 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:32:35 crc kubenswrapper[4773]: E0120 18:32:35.603572 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 18:32:36.103558216 +0000 UTC m=+149.025371240 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kr4zh" (UID: "f751520b-bf3d-4226-8850-4b3346c43a6f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:35 crc kubenswrapper[4773]: I0120 18:32:35.656886 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-w8vpz" event={"ID":"c5d6700e-54f1-4f09-83d7-e85f66af8c85","Type":"ContainerStarted","Data":"1bb487111a8dd96d9bb7d6357c49ebcd2c07dc95ed7a149d32cf268ec426f1f7"} Jan 20 18:32:35 crc kubenswrapper[4773]: I0120 18:32:35.659880 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-ff9dd" event={"ID":"1181b3a9-8cf9-46ad-9b41-62d32ffe7a85","Type":"ContainerStarted","Data":"cb0111d26068d7530fc1ab48948aef101d8e352e9c86e33d1e7d039ec97d0ab3"} Jan 20 18:32:35 crc kubenswrapper[4773]: I0120 18:32:35.660294 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-ff9dd" Jan 20 18:32:35 crc kubenswrapper[4773]: I0120 18:32:35.662237 4773 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-ff9dd container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.32:8080/healthz\": dial tcp 10.217.0.32:8080: connect: connection refused" start-of-body= Jan 20 18:32:35 crc kubenswrapper[4773]: I0120 18:32:35.662347 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-ff9dd" podUID="1181b3a9-8cf9-46ad-9b41-62d32ffe7a85" containerName="marketplace-operator" probeResult="failure" output="Get 
\"http://10.217.0.32:8080/healthz\": dial tcp 10.217.0.32:8080: connect: connection refused" Jan 20 18:32:35 crc kubenswrapper[4773]: I0120 18:32:35.664689 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mrfcm" event={"ID":"22f987ee-958e-41a1-8cf4-ef0da8212364","Type":"ContainerStarted","Data":"3a4e2e7547bbac8361b30789e0bf6f41f905e9306b78f4aeb9f697d02916a7d3"} Jan 20 18:32:35 crc kubenswrapper[4773]: I0120 18:32:35.666363 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mrfcm" Jan 20 18:32:35 crc kubenswrapper[4773]: I0120 18:32:35.681499 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-6bf74" event={"ID":"63fd2de1-85c4-4f01-8524-7b93c777592d","Type":"ContainerStarted","Data":"a58107d7046025e68c36fd3ac794e389787ab76dd3b2ba2a71da22a733fd9014"} Jan 20 18:32:35 crc kubenswrapper[4773]: I0120 18:32:35.705618 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-ff9dd" podStartSLOduration=128.705594 podStartE2EDuration="2m8.705594s" podCreationTimestamp="2026-01-20 18:30:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:32:35.705051937 +0000 UTC m=+148.626864951" watchObservedRunningTime="2026-01-20 18:32:35.705594 +0000 UTC m=+148.627407024" Jan 20 18:32:35 crc kubenswrapper[4773]: I0120 18:32:35.705683 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 
20 18:32:35 crc kubenswrapper[4773]: E0120 18:32:35.706225 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 18:32:36.206186946 +0000 UTC m=+149.127999970 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:35 crc kubenswrapper[4773]: I0120 18:32:35.706356 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:32:35 crc kubenswrapper[4773]: E0120 18:32:35.707123 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 18:32:36.207114789 +0000 UTC m=+149.128927813 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kr4zh" (UID: "f751520b-bf3d-4226-8850-4b3346c43a6f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:35 crc kubenswrapper[4773]: I0120 18:32:35.707680 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-ks6ps" event={"ID":"027ba59d-f4ba-430f-af60-a7f293dd2052","Type":"ContainerStarted","Data":"6be2580705d35127eefa7d98a68f1ae97b2c5c98239f0b1de2031188d3ede215"} Jan 20 18:32:35 crc kubenswrapper[4773]: I0120 18:32:35.710213 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mrfcm" Jan 20 18:32:35 crc kubenswrapper[4773]: I0120 18:32:35.716060 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-v6wpc" event={"ID":"9f7552ac-b3a0-4bfa-ab3e-34e46ed83cff","Type":"ContainerStarted","Data":"235167a6baf46817b565ac809735811236307a81b400f55729f65b41a34149aa"} Jan 20 18:32:35 crc kubenswrapper[4773]: I0120 18:32:35.726782 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"8527837d56e2e0feade0b967500c559dd5efbf8f453f98252ae068da6dc713b6"} Jan 20 18:32:35 crc kubenswrapper[4773]: I0120 18:32:35.727081 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" 
event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"6728e5fd4c4fec85a7832eccc65a31fd28432cfc0411ad3e2574d53e5e93047a"} Jan 20 18:32:35 crc kubenswrapper[4773]: I0120 18:32:35.731893 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-snlnq" event={"ID":"9cee99f1-8905-4089-be36-90af1426d834","Type":"ContainerStarted","Data":"e9f8a0c11637d7c7a68f1584b29bc6f9f43bbba40be0cb8895d09defdf1debc3"} Jan 20 18:32:35 crc kubenswrapper[4773]: I0120 18:32:35.732601 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-mrfcm" podStartSLOduration=128.732586187 podStartE2EDuration="2m8.732586187s" podCreationTimestamp="2026-01-20 18:30:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:32:35.729992712 +0000 UTC m=+148.651805746" watchObservedRunningTime="2026-01-20 18:32:35.732586187 +0000 UTC m=+148.654399211" Jan 20 18:32:35 crc kubenswrapper[4773]: I0120 18:32:35.748678 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6qw48" event={"ID":"70a421fb-8ee0-4365-b4d3-c8ddd6f6c01d","Type":"ContainerStarted","Data":"3e690d88cdf9ecc5c7b10d1726ebdfbdf7a54c4b85cd9d25d0a921954ec5b160"} Jan 20 18:32:35 crc kubenswrapper[4773]: I0120 18:32:35.757337 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-dfqb5" event={"ID":"71dd00dd-f11c-43a8-b7a2-2416a1761d94","Type":"ContainerStarted","Data":"8a7e5f6ba35040f67099c393796e19b5f158800e1430665147c67fc1841f31d7"} Jan 20 18:32:35 crc kubenswrapper[4773]: I0120 18:32:35.760990 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k2x4p" 
event={"ID":"b3570207-5cb9-4481-a15a-d0bb9312a84b","Type":"ContainerStarted","Data":"284a40c07863c1fc0a978e29e9a355f06ba6da0e33de4a144c16fdda2dfcd5e6"} Jan 20 18:32:35 crc kubenswrapper[4773]: I0120 18:32:35.764666 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-5t8h7" event={"ID":"2b703025-44fd-42d1-81fa-27ef31c9d2fb","Type":"ContainerStarted","Data":"544ce484c16c23ac90d0d3e84093e1bddda3103b377fea1064e14541d2070f47"} Jan 20 18:32:35 crc kubenswrapper[4773]: I0120 18:32:35.774393 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cfslv" event={"ID":"14f80aee-1c1f-4beb-a280-3ac021e920c9","Type":"ContainerStarted","Data":"88c0c11d7a3e0537e2ce0b4862fb7565338b9de4fca992b73522f859e5ea0566"} Jan 20 18:32:35 crc kubenswrapper[4773]: I0120 18:32:35.787108 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cfslv" Jan 20 18:32:35 crc kubenswrapper[4773]: I0120 18:32:35.798199 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cfslv" Jan 20 18:32:35 crc kubenswrapper[4773]: I0120 18:32:35.798590 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-llx65" event={"ID":"aab7784b-df99-4fd2-b2ea-3d2f6cdb098c","Type":"ContainerStarted","Data":"0933c284d7c611f700bc01fc4a21c3a53a009d967aeaa6152bff05e4ff861b09"} Jan 20 18:32:35 crc kubenswrapper[4773]: I0120 18:32:35.798629 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-llx65" event={"ID":"aab7784b-df99-4fd2-b2ea-3d2f6cdb098c","Type":"ContainerStarted","Data":"4b88e3a17718a3484046c7281f9f6b73a5c0f1414fc06a97c38605c5dc305f8f"} Jan 20 18:32:35 crc kubenswrapper[4773]: I0120 18:32:35.807318 4773 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 18:32:35 crc kubenswrapper[4773]: E0120 18:32:35.808366 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 18:32:36.308328113 +0000 UTC m=+149.230141137 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:35 crc kubenswrapper[4773]: I0120 18:32:35.825557 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xwzh9" event={"ID":"a27e80d9-dea2-4e87-90c8-1c69288cfa55","Type":"ContainerStarted","Data":"2e0601da8fcd6be67b257f2e252e16ebc95579345b6166da1bcbacd948648b69"} Jan 20 18:32:35 crc kubenswrapper[4773]: I0120 18:32:35.826585 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xwzh9" Jan 20 18:32:35 crc kubenswrapper[4773]: I0120 18:32:35.839066 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-5t8h7" podStartSLOduration=10.839048091 podStartE2EDuration="10.839048091s" 
podCreationTimestamp="2026-01-20 18:32:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:32:35.805788181 +0000 UTC m=+148.727601205" watchObservedRunningTime="2026-01-20 18:32:35.839048091 +0000 UTC m=+148.760861115" Jan 20 18:32:35 crc kubenswrapper[4773]: I0120 18:32:35.841063 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k2x4p" podStartSLOduration=128.84105722 podStartE2EDuration="2m8.84105722s" podCreationTimestamp="2026-01-20 18:30:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:32:35.838271251 +0000 UTC m=+148.760084285" watchObservedRunningTime="2026-01-20 18:32:35.84105722 +0000 UTC m=+148.762870244" Jan 20 18:32:35 crc kubenswrapper[4773]: I0120 18:32:35.848311 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"f8cfc0915963aab530743e7ca948d4ce12478ca7b058865f0c1c7d7cfde75c13"} Jan 20 18:32:35 crc kubenswrapper[4773]: I0120 18:32:35.848378 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"f51e9c50643f110f0fb0190747fe3a71b30ac2564baedd3a85a4912b1ba90dcc"} Jan 20 18:32:35 crc kubenswrapper[4773]: I0120 18:32:35.871466 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-7j8nw" event={"ID":"86f0fde2-da58-4350-ad67-cb29a2684875","Type":"ContainerStarted","Data":"73acc141bfeded81d28a19fa832921c535eaa5b4ccb63932fa12854fe299f8fe"} Jan 20 18:32:35 crc kubenswrapper[4773]: I0120 18:32:35.872217 4773 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openshift-dns/dns-default-7j8nw" Jan 20 18:32:35 crc kubenswrapper[4773]: I0120 18:32:35.880288 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-cfslv" podStartSLOduration=128.880271396 podStartE2EDuration="2m8.880271396s" podCreationTimestamp="2026-01-20 18:30:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:32:35.879503958 +0000 UTC m=+148.801317002" watchObservedRunningTime="2026-01-20 18:32:35.880271396 +0000 UTC m=+148.802084420" Jan 20 18:32:35 crc kubenswrapper[4773]: I0120 18:32:35.887697 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-425qm" event={"ID":"12a1e676-da4c-46d2-a8f6-11dedde983fc","Type":"ContainerStarted","Data":"ecaf9f598aa7773a51c68cc4550f1673d164814d391dfbca7c764660ba07de4d"} Jan 20 18:32:35 crc kubenswrapper[4773]: I0120 18:32:35.910583 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:32:35 crc kubenswrapper[4773]: I0120 18:32:35.911909 4773 patch_prober.go:28] interesting pod/router-default-5444994796-x95ml container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 20 18:32:35 crc kubenswrapper[4773]: [-]has-synced failed: reason withheld Jan 20 18:32:35 crc kubenswrapper[4773]: [+]process-running ok Jan 20 18:32:35 crc kubenswrapper[4773]: healthz check 
failed Jan 20 18:32:35 crc kubenswrapper[4773]: I0120 18:32:35.911985 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-x95ml" podUID="00a9d467-1154-4eae-b1e5-19dfbb214a80" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 20 18:32:35 crc kubenswrapper[4773]: I0120 18:32:35.912074 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"3799c786fbbb3d35b3d20344ab7186c953095791ebe703fac2ed60f73ba2d0d2"} Jan 20 18:32:35 crc kubenswrapper[4773]: I0120 18:32:35.912987 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:32:35 crc kubenswrapper[4773]: E0120 18:32:35.913705 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 18:32:36.41369058 +0000 UTC m=+149.335503604 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kr4zh" (UID: "f751520b-bf3d-4226-8850-4b3346c43a6f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:35 crc kubenswrapper[4773]: I0120 18:32:35.915993 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-6qw48" podStartSLOduration=128.915975407 podStartE2EDuration="2m8.915975407s" podCreationTimestamp="2026-01-20 18:30:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:32:35.913046485 +0000 UTC m=+148.834859509" watchObservedRunningTime="2026-01-20 18:32:35.915975407 +0000 UTC m=+148.837788431" Jan 20 18:32:35 crc kubenswrapper[4773]: I0120 18:32:35.939697 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-x6fwb" event={"ID":"deccf4fe-9230-4e96-b16c-a2ed0d2235a7","Type":"ContainerStarted","Data":"ea57fe328031a1378494cf1db170040b0ff2e64e906493eda1cfe756eb2d3ac3"} Jan 20 18:32:35 crc kubenswrapper[4773]: I0120 18:32:35.939741 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-x6fwb" event={"ID":"deccf4fe-9230-4e96-b16c-a2ed0d2235a7","Type":"ContainerStarted","Data":"0e9ac154fb60cce9dec38ed497643a99c7f37c7e3a84d4ae23f36c220acc19cf"} Jan 20 18:32:35 crc kubenswrapper[4773]: I0120 18:32:35.982677 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xwzh9" podStartSLOduration=128.98265652 
podStartE2EDuration="2m8.98265652s" podCreationTimestamp="2026-01-20 18:30:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:32:35.96035075 +0000 UTC m=+148.882163774" watchObservedRunningTime="2026-01-20 18:32:35.98265652 +0000 UTC m=+148.904469544" Jan 20 18:32:36 crc kubenswrapper[4773]: I0120 18:32:36.014661 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 18:32:36 crc kubenswrapper[4773]: E0120 18:32:36.017498 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 18:32:36.516964096 +0000 UTC m=+149.438777310 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:36 crc kubenswrapper[4773]: I0120 18:32:36.127711 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:32:36 crc kubenswrapper[4773]: E0120 18:32:36.128072 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 18:32:36.628058204 +0000 UTC m=+149.549871228 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kr4zh" (UID: "f751520b-bf3d-4226-8850-4b3346c43a6f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:36 crc kubenswrapper[4773]: I0120 18:32:36.131549 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-7j8nw" podStartSLOduration=11.13151253 podStartE2EDuration="11.13151253s" podCreationTimestamp="2026-01-20 18:32:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:32:36.112803798 +0000 UTC m=+149.034616822" watchObservedRunningTime="2026-01-20 18:32:36.13151253 +0000 UTC m=+149.053325554" Jan 20 18:32:36 crc kubenswrapper[4773]: I0120 18:32:36.133677 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-llx65" podStartSLOduration=130.133661612 podStartE2EDuration="2m10.133661612s" podCreationTimestamp="2026-01-20 18:30:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:32:36.087068364 +0000 UTC m=+149.008881388" watchObservedRunningTime="2026-01-20 18:32:36.133661612 +0000 UTC m=+149.055474646" Jan 20 18:32:36 crc kubenswrapper[4773]: I0120 18:32:36.143628 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-x6fwb" podStartSLOduration=129.143600727 podStartE2EDuration="2m9.143600727s" podCreationTimestamp="2026-01-20 18:30:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:32:36.141020264 +0000 UTC m=+149.062833288" watchObservedRunningTime="2026-01-20 18:32:36.143600727 +0000 UTC m=+149.065413751" Jan 20 18:32:36 crc kubenswrapper[4773]: I0120 18:32:36.228584 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 18:32:36 crc kubenswrapper[4773]: E0120 18:32:36.228874 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 18:32:36.728838758 +0000 UTC m=+149.650651782 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:36 crc kubenswrapper[4773]: I0120 18:32:36.228985 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:32:36 crc kubenswrapper[4773]: E0120 18:32:36.229493 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 18:32:36.729482065 +0000 UTC m=+149.651295089 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kr4zh" (UID: "f751520b-bf3d-4226-8850-4b3346c43a6f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:36 crc kubenswrapper[4773]: I0120 18:32:36.235398 4773 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Jan 20 18:32:36 crc kubenswrapper[4773]: I0120 18:32:36.330688 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 18:32:36 crc kubenswrapper[4773]: E0120 18:32:36.330888 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 18:32:36.830850043 +0000 UTC m=+149.752663067 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:36 crc kubenswrapper[4773]: I0120 18:32:36.331073 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:32:36 crc kubenswrapper[4773]: E0120 18:32:36.331419 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 18:32:36.831411397 +0000 UTC m=+149.753224421 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kr4zh" (UID: "f751520b-bf3d-4226-8850-4b3346c43a6f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:36 crc kubenswrapper[4773]: I0120 18:32:36.432076 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 18:32:36 crc kubenswrapper[4773]: E0120 18:32:36.432213 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 18:32:36.932191351 +0000 UTC m=+149.854004375 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:36 crc kubenswrapper[4773]: I0120 18:32:36.433236 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:32:36 crc kubenswrapper[4773]: E0120 18:32:36.433719 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 18:32:36.933697938 +0000 UTC m=+149.855510962 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kr4zh" (UID: "f751520b-bf3d-4226-8850-4b3346c43a6f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:36 crc kubenswrapper[4773]: I0120 18:32:36.535173 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 18:32:36 crc kubenswrapper[4773]: E0120 18:32:36.535387 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 18:32:37.035352134 +0000 UTC m=+149.957165158 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:36 crc kubenswrapper[4773]: I0120 18:32:36.535448 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:32:36 crc kubenswrapper[4773]: E0120 18:32:36.535750 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-20 18:32:37.035742253 +0000 UTC m=+149.957555277 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-kr4zh" (UID: "f751520b-bf3d-4226-8850-4b3346c43a6f") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:36 crc kubenswrapper[4773]: I0120 18:32:36.636834 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 18:32:36 crc kubenswrapper[4773]: E0120 18:32:36.637339 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-20 18:32:37.137315247 +0000 UTC m=+150.059128271 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 20 18:32:36 crc kubenswrapper[4773]: I0120 18:32:36.721378 4773 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-01-20T18:32:36.235420271Z","Handler":null,"Name":""} Jan 20 18:32:36 crc kubenswrapper[4773]: I0120 18:32:36.724710 4773 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Jan 20 18:32:36 crc kubenswrapper[4773]: I0120 18:32:36.724780 4773 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Jan 20 18:32:36 crc kubenswrapper[4773]: I0120 18:32:36.738908 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:32:36 crc kubenswrapper[4773]: I0120 18:32:36.743750 4773 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 20 18:32:36 crc kubenswrapper[4773]: I0120 18:32:36.743834 4773 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:32:36 crc kubenswrapper[4773]: I0120 18:32:36.761834 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-2qdpl"] Jan 20 18:32:36 crc kubenswrapper[4773]: I0120 18:32:36.766669 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2qdpl" Jan 20 18:32:36 crc kubenswrapper[4773]: I0120 18:32:36.776209 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Jan 20 18:32:36 crc kubenswrapper[4773]: I0120 18:32:36.790810 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-kr4zh\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:32:36 crc kubenswrapper[4773]: I0120 18:32:36.805307 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2qdpl"] Jan 20 18:32:36 crc kubenswrapper[4773]: I0120 18:32:36.841062 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 20 18:32:36 crc kubenswrapper[4773]: I0120 18:32:36.841422 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hngfz\" (UniqueName: \"kubernetes.io/projected/8923f3c0-0b58-4097-aa87-9df34cf90e41-kube-api-access-hngfz\") pod \"community-operators-2qdpl\" (UID: \"8923f3c0-0b58-4097-aa87-9df34cf90e41\") " pod="openshift-marketplace/community-operators-2qdpl" Jan 20 18:32:36 crc kubenswrapper[4773]: I0120 18:32:36.841523 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8923f3c0-0b58-4097-aa87-9df34cf90e41-utilities\") pod \"community-operators-2qdpl\" (UID: \"8923f3c0-0b58-4097-aa87-9df34cf90e41\") " pod="openshift-marketplace/community-operators-2qdpl" Jan 20 18:32:36 crc kubenswrapper[4773]: I0120 18:32:36.841557 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8923f3c0-0b58-4097-aa87-9df34cf90e41-catalog-content\") pod \"community-operators-2qdpl\" (UID: \"8923f3c0-0b58-4097-aa87-9df34cf90e41\") " pod="openshift-marketplace/community-operators-2qdpl" Jan 20 18:32:36 crc kubenswrapper[4773]: I0120 18:32:36.899732 4773 patch_prober.go:28] interesting pod/router-default-5444994796-x95ml container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 20 18:32:36 crc kubenswrapper[4773]: [-]has-synced failed: reason withheld Jan 20 18:32:36 crc kubenswrapper[4773]: [+]process-running ok Jan 20 18:32:36 crc kubenswrapper[4773]: healthz check failed Jan 20 18:32:36 crc kubenswrapper[4773]: 
I0120 18:32:36.899800 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-x95ml" podUID="00a9d467-1154-4eae-b1e5-19dfbb214a80" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 20 18:32:36 crc kubenswrapper[4773]: I0120 18:32:36.936769 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gnvxz" Jan 20 18:32:36 crc kubenswrapper[4773]: I0120 18:32:36.942641 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8923f3c0-0b58-4097-aa87-9df34cf90e41-catalog-content\") pod \"community-operators-2qdpl\" (UID: \"8923f3c0-0b58-4097-aa87-9df34cf90e41\") " pod="openshift-marketplace/community-operators-2qdpl" Jan 20 18:32:36 crc kubenswrapper[4773]: I0120 18:32:36.942719 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hngfz\" (UniqueName: \"kubernetes.io/projected/8923f3c0-0b58-4097-aa87-9df34cf90e41-kube-api-access-hngfz\") pod \"community-operators-2qdpl\" (UID: \"8923f3c0-0b58-4097-aa87-9df34cf90e41\") " pod="openshift-marketplace/community-operators-2qdpl" Jan 20 18:32:36 crc kubenswrapper[4773]: I0120 18:32:36.942808 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8923f3c0-0b58-4097-aa87-9df34cf90e41-utilities\") pod \"community-operators-2qdpl\" (UID: \"8923f3c0-0b58-4097-aa87-9df34cf90e41\") " pod="openshift-marketplace/community-operators-2qdpl" Jan 20 18:32:36 crc kubenswrapper[4773]: I0120 18:32:36.943392 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8923f3c0-0b58-4097-aa87-9df34cf90e41-catalog-content\") pod \"community-operators-2qdpl\" (UID: 
\"8923f3c0-0b58-4097-aa87-9df34cf90e41\") " pod="openshift-marketplace/community-operators-2qdpl" Jan 20 18:32:36 crc kubenswrapper[4773]: I0120 18:32:36.952638 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-v75d6"] Jan 20 18:32:36 crc kubenswrapper[4773]: I0120 18:32:36.953906 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-v75d6" Jan 20 18:32:36 crc kubenswrapper[4773]: I0120 18:32:36.955333 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8923f3c0-0b58-4097-aa87-9df34cf90e41-utilities\") pod \"community-operators-2qdpl\" (UID: \"8923f3c0-0b58-4097-aa87-9df34cf90e41\") " pod="openshift-marketplace/community-operators-2qdpl" Jan 20 18:32:36 crc kubenswrapper[4773]: I0120 18:32:36.971800 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Jan 20 18:32:36 crc kubenswrapper[4773]: I0120 18:32:36.980184 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-v75d6"] Jan 20 18:32:36 crc kubenswrapper[4773]: I0120 18:32:36.988067 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hngfz\" (UniqueName: \"kubernetes.io/projected/8923f3c0-0b58-4097-aa87-9df34cf90e41-kube-api-access-hngfz\") pod \"community-operators-2qdpl\" (UID: \"8923f3c0-0b58-4097-aa87-9df34cf90e41\") " pod="openshift-marketplace/community-operators-2qdpl" Jan 20 18:32:37 crc kubenswrapper[4773]: I0120 18:32:37.006198 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-w8vpz" event={"ID":"c5d6700e-54f1-4f09-83d7-e85f66af8c85","Type":"ContainerStarted","Data":"77e12e08989d9253b8a6267df61c0f6a5a727483d295c1cf467ac4bbadb22f02"} Jan 20 18:32:37 crc kubenswrapper[4773]: I0120 18:32:37.006278 4773 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-w8vpz" event={"ID":"c5d6700e-54f1-4f09-83d7-e85f66af8c85","Type":"ContainerStarted","Data":"080c05ea441ebe9478b12dd279af6b3d5baaaf1a12a7630d123a1f146fd52f26"} Jan 20 18:32:37 crc kubenswrapper[4773]: I0120 18:32:37.007787 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"d3264e96f8657eb59542a703556e4f7626c7a9b9d060f39da3fe958c6025cc4d"} Jan 20 18:32:37 crc kubenswrapper[4773]: I0120 18:32:37.013073 4773 generic.go:334] "Generic (PLEG): container finished" podID="007a1e5a-0e90-44d1-b19d-e92154fb6a3d" containerID="b136dfb1f23b27ec2f873c0b6f216725c1fb39f8486e5a0ea6aa300a7bc89cf5" exitCode=0 Jan 20 18:32:37 crc kubenswrapper[4773]: I0120 18:32:37.013424 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29482230-pqppn" event={"ID":"007a1e5a-0e90-44d1-b19d-e92154fb6a3d","Type":"ContainerDied","Data":"b136dfb1f23b27ec2f873c0b6f216725c1fb39f8486e5a0ea6aa300a7bc89cf5"} Jan 20 18:32:37 crc kubenswrapper[4773]: I0120 18:32:37.014420 4773 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-ff9dd container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.32:8080/healthz\": dial tcp 10.217.0.32:8080: connect: connection refused" start-of-body= Jan 20 18:32:37 crc kubenswrapper[4773]: I0120 18:32:37.014474 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-ff9dd" podUID="1181b3a9-8cf9-46ad-9b41-62d32ffe7a85" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.32:8080/healthz\": dial tcp 10.217.0.32:8080: connect: connection refused" Jan 20 18:32:37 crc kubenswrapper[4773]: I0120 18:32:37.017817 
4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:32:37 crc kubenswrapper[4773]: I0120 18:32:37.044670 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbj8l\" (UniqueName: \"kubernetes.io/projected/074f367d-7a48-4046-a679-9a2d38111b8a-kube-api-access-fbj8l\") pod \"certified-operators-v75d6\" (UID: \"074f367d-7a48-4046-a679-9a2d38111b8a\") " pod="openshift-marketplace/certified-operators-v75d6" Jan 20 18:32:37 crc kubenswrapper[4773]: I0120 18:32:37.044753 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/074f367d-7a48-4046-a679-9a2d38111b8a-utilities\") pod \"certified-operators-v75d6\" (UID: \"074f367d-7a48-4046-a679-9a2d38111b8a\") " pod="openshift-marketplace/certified-operators-v75d6" Jan 20 18:32:37 crc kubenswrapper[4773]: I0120 18:32:37.044869 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/074f367d-7a48-4046-a679-9a2d38111b8a-catalog-content\") pod \"certified-operators-v75d6\" (UID: \"074f367d-7a48-4046-a679-9a2d38111b8a\") " pod="openshift-marketplace/certified-operators-v75d6" Jan 20 18:32:37 crc kubenswrapper[4773]: I0120 18:32:37.063782 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 20 18:32:37 crc kubenswrapper[4773]: I0120 18:32:37.081627 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-w8vpz" podStartSLOduration=12.081603298 podStartE2EDuration="12.081603298s" podCreationTimestamp="2026-01-20 18:32:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:32:37.05729286 +0000 UTC m=+149.979105904" watchObservedRunningTime="2026-01-20 18:32:37.081603298 +0000 UTC m=+150.003416322" Jan 20 18:32:37 crc kubenswrapper[4773]: I0120 18:32:37.128534 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2qdpl" Jan 20 18:32:37 crc kubenswrapper[4773]: I0120 18:32:37.141559 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-7zslm" Jan 20 18:32:37 crc kubenswrapper[4773]: I0120 18:32:37.141599 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-7zslm" Jan 20 18:32:37 crc kubenswrapper[4773]: I0120 18:32:37.146437 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbj8l\" (UniqueName: \"kubernetes.io/projected/074f367d-7a48-4046-a679-9a2d38111b8a-kube-api-access-fbj8l\") pod \"certified-operators-v75d6\" (UID: \"074f367d-7a48-4046-a679-9a2d38111b8a\") " pod="openshift-marketplace/certified-operators-v75d6" Jan 20 18:32:37 crc kubenswrapper[4773]: I0120 18:32:37.146580 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/074f367d-7a48-4046-a679-9a2d38111b8a-utilities\") pod \"certified-operators-v75d6\" (UID: \"074f367d-7a48-4046-a679-9a2d38111b8a\") " pod="openshift-marketplace/certified-operators-v75d6" Jan 20 
18:32:37 crc kubenswrapper[4773]: I0120 18:32:37.146719 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/074f367d-7a48-4046-a679-9a2d38111b8a-catalog-content\") pod \"certified-operators-v75d6\" (UID: \"074f367d-7a48-4046-a679-9a2d38111b8a\") " pod="openshift-marketplace/certified-operators-v75d6" Jan 20 18:32:37 crc kubenswrapper[4773]: I0120 18:32:37.149079 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/074f367d-7a48-4046-a679-9a2d38111b8a-utilities\") pod \"certified-operators-v75d6\" (UID: \"074f367d-7a48-4046-a679-9a2d38111b8a\") " pod="openshift-marketplace/certified-operators-v75d6" Jan 20 18:32:37 crc kubenswrapper[4773]: I0120 18:32:37.152181 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/074f367d-7a48-4046-a679-9a2d38111b8a-catalog-content\") pod \"certified-operators-v75d6\" (UID: \"074f367d-7a48-4046-a679-9a2d38111b8a\") " pod="openshift-marketplace/certified-operators-v75d6" Jan 20 18:32:37 crc kubenswrapper[4773]: I0120 18:32:37.153061 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-6lqws"] Jan 20 18:32:37 crc kubenswrapper[4773]: I0120 18:32:37.155035 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-7zslm" Jan 20 18:32:37 crc kubenswrapper[4773]: I0120 18:32:37.155144 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6lqws" Jan 20 18:32:37 crc kubenswrapper[4773]: I0120 18:32:37.167313 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6lqws"] Jan 20 18:32:37 crc kubenswrapper[4773]: I0120 18:32:37.181060 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbj8l\" (UniqueName: \"kubernetes.io/projected/074f367d-7a48-4046-a679-9a2d38111b8a-kube-api-access-fbj8l\") pod \"certified-operators-v75d6\" (UID: \"074f367d-7a48-4046-a679-9a2d38111b8a\") " pod="openshift-marketplace/certified-operators-v75d6" Jan 20 18:32:37 crc kubenswrapper[4773]: I0120 18:32:37.248609 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6759422-151d-4228-b7c7-848c3008fb52-utilities\") pod \"community-operators-6lqws\" (UID: \"a6759422-151d-4228-b7c7-848c3008fb52\") " pod="openshift-marketplace/community-operators-6lqws" Jan 20 18:32:37 crc kubenswrapper[4773]: I0120 18:32:37.248754 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkldm\" (UniqueName: \"kubernetes.io/projected/a6759422-151d-4228-b7c7-848c3008fb52-kube-api-access-pkldm\") pod \"community-operators-6lqws\" (UID: \"a6759422-151d-4228-b7c7-848c3008fb52\") " pod="openshift-marketplace/community-operators-6lqws" Jan 20 18:32:37 crc kubenswrapper[4773]: I0120 18:32:37.248846 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6759422-151d-4228-b7c7-848c3008fb52-catalog-content\") pod \"community-operators-6lqws\" (UID: \"a6759422-151d-4228-b7c7-848c3008fb52\") " pod="openshift-marketplace/community-operators-6lqws" Jan 20 18:32:37 crc kubenswrapper[4773]: I0120 18:32:37.312369 4773 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-v75d6" Jan 20 18:32:37 crc kubenswrapper[4773]: I0120 18:32:37.348603 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-c8jjn"] Jan 20 18:32:37 crc kubenswrapper[4773]: I0120 18:32:37.350594 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pkldm\" (UniqueName: \"kubernetes.io/projected/a6759422-151d-4228-b7c7-848c3008fb52-kube-api-access-pkldm\") pod \"community-operators-6lqws\" (UID: \"a6759422-151d-4228-b7c7-848c3008fb52\") " pod="openshift-marketplace/community-operators-6lqws" Jan 20 18:32:37 crc kubenswrapper[4773]: I0120 18:32:37.350664 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6759422-151d-4228-b7c7-848c3008fb52-catalog-content\") pod \"community-operators-6lqws\" (UID: \"a6759422-151d-4228-b7c7-848c3008fb52\") " pod="openshift-marketplace/community-operators-6lqws" Jan 20 18:32:37 crc kubenswrapper[4773]: I0120 18:32:37.350738 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6759422-151d-4228-b7c7-848c3008fb52-utilities\") pod \"community-operators-6lqws\" (UID: \"a6759422-151d-4228-b7c7-848c3008fb52\") " pod="openshift-marketplace/community-operators-6lqws" Jan 20 18:32:37 crc kubenswrapper[4773]: I0120 18:32:37.351374 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6759422-151d-4228-b7c7-848c3008fb52-catalog-content\") pod \"community-operators-6lqws\" (UID: \"a6759422-151d-4228-b7c7-848c3008fb52\") " pod="openshift-marketplace/community-operators-6lqws" Jan 20 18:32:37 crc kubenswrapper[4773]: I0120 18:32:37.351492 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/a6759422-151d-4228-b7c7-848c3008fb52-utilities\") pod \"community-operators-6lqws\" (UID: \"a6759422-151d-4228-b7c7-848c3008fb52\") " pod="openshift-marketplace/community-operators-6lqws" Jan 20 18:32:37 crc kubenswrapper[4773]: I0120 18:32:37.354453 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-c8jjn" Jan 20 18:32:37 crc kubenswrapper[4773]: I0120 18:32:37.369193 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-c8jjn"] Jan 20 18:32:37 crc kubenswrapper[4773]: I0120 18:32:37.373412 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-kr4zh"] Jan 20 18:32:37 crc kubenswrapper[4773]: I0120 18:32:37.381323 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkldm\" (UniqueName: \"kubernetes.io/projected/a6759422-151d-4228-b7c7-848c3008fb52-kube-api-access-pkldm\") pod \"community-operators-6lqws\" (UID: \"a6759422-151d-4228-b7c7-848c3008fb52\") " pod="openshift-marketplace/community-operators-6lqws" Jan 20 18:32:37 crc kubenswrapper[4773]: I0120 18:32:37.457325 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gt4kw\" (UniqueName: \"kubernetes.io/projected/48e32f25-29eb-4ef0-892b-0da316c47e3d-kube-api-access-gt4kw\") pod \"certified-operators-c8jjn\" (UID: \"48e32f25-29eb-4ef0-892b-0da316c47e3d\") " pod="openshift-marketplace/certified-operators-c8jjn" Jan 20 18:32:37 crc kubenswrapper[4773]: I0120 18:32:37.457428 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48e32f25-29eb-4ef0-892b-0da316c47e3d-utilities\") pod \"certified-operators-c8jjn\" (UID: \"48e32f25-29eb-4ef0-892b-0da316c47e3d\") " 
pod="openshift-marketplace/certified-operators-c8jjn" Jan 20 18:32:37 crc kubenswrapper[4773]: I0120 18:32:37.457859 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48e32f25-29eb-4ef0-892b-0da316c47e3d-catalog-content\") pod \"certified-operators-c8jjn\" (UID: \"48e32f25-29eb-4ef0-892b-0da316c47e3d\") " pod="openshift-marketplace/certified-operators-c8jjn" Jan 20 18:32:37 crc kubenswrapper[4773]: I0120 18:32:37.483642 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Jan 20 18:32:37 crc kubenswrapper[4773]: I0120 18:32:37.532654 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2qdpl"] Jan 20 18:32:37 crc kubenswrapper[4773]: I0120 18:32:37.558839 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48e32f25-29eb-4ef0-892b-0da316c47e3d-catalog-content\") pod \"certified-operators-c8jjn\" (UID: \"48e32f25-29eb-4ef0-892b-0da316c47e3d\") " pod="openshift-marketplace/certified-operators-c8jjn" Jan 20 18:32:37 crc kubenswrapper[4773]: I0120 18:32:37.558890 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gt4kw\" (UniqueName: \"kubernetes.io/projected/48e32f25-29eb-4ef0-892b-0da316c47e3d-kube-api-access-gt4kw\") pod \"certified-operators-c8jjn\" (UID: \"48e32f25-29eb-4ef0-892b-0da316c47e3d\") " pod="openshift-marketplace/certified-operators-c8jjn" Jan 20 18:32:37 crc kubenswrapper[4773]: I0120 18:32:37.558939 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48e32f25-29eb-4ef0-892b-0da316c47e3d-utilities\") pod \"certified-operators-c8jjn\" (UID: 
\"48e32f25-29eb-4ef0-892b-0da316c47e3d\") " pod="openshift-marketplace/certified-operators-c8jjn" Jan 20 18:32:37 crc kubenswrapper[4773]: I0120 18:32:37.559406 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48e32f25-29eb-4ef0-892b-0da316c47e3d-utilities\") pod \"certified-operators-c8jjn\" (UID: \"48e32f25-29eb-4ef0-892b-0da316c47e3d\") " pod="openshift-marketplace/certified-operators-c8jjn" Jan 20 18:32:37 crc kubenswrapper[4773]: I0120 18:32:37.559617 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48e32f25-29eb-4ef0-892b-0da316c47e3d-catalog-content\") pod \"certified-operators-c8jjn\" (UID: \"48e32f25-29eb-4ef0-892b-0da316c47e3d\") " pod="openshift-marketplace/certified-operators-c8jjn" Jan 20 18:32:37 crc kubenswrapper[4773]: I0120 18:32:37.571156 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6lqws" Jan 20 18:32:37 crc kubenswrapper[4773]: I0120 18:32:37.577022 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gt4kw\" (UniqueName: \"kubernetes.io/projected/48e32f25-29eb-4ef0-892b-0da316c47e3d-kube-api-access-gt4kw\") pod \"certified-operators-c8jjn\" (UID: \"48e32f25-29eb-4ef0-892b-0da316c47e3d\") " pod="openshift-marketplace/certified-operators-c8jjn" Jan 20 18:32:37 crc kubenswrapper[4773]: I0120 18:32:37.748342 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-c8jjn" Jan 20 18:32:37 crc kubenswrapper[4773]: I0120 18:32:37.807273 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6lqws"] Jan 20 18:32:37 crc kubenswrapper[4773]: W0120 18:32:37.820754 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6759422_151d_4228_b7c7_848c3008fb52.slice/crio-c471e82ddefc652e5f04653639951233ffc24c82258453c65af9fd4352ec0b51 WatchSource:0}: Error finding container c471e82ddefc652e5f04653639951233ffc24c82258453c65af9fd4352ec0b51: Status 404 returned error can't find the container with id c471e82ddefc652e5f04653639951233ffc24c82258453c65af9fd4352ec0b51 Jan 20 18:32:37 crc kubenswrapper[4773]: I0120 18:32:37.838096 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-v75d6"] Jan 20 18:32:37 crc kubenswrapper[4773]: I0120 18:32:37.902639 4773 patch_prober.go:28] interesting pod/router-default-5444994796-x95ml container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 20 18:32:37 crc kubenswrapper[4773]: [-]has-synced failed: reason withheld Jan 20 18:32:37 crc kubenswrapper[4773]: [+]process-running ok Jan 20 18:32:37 crc kubenswrapper[4773]: healthz check failed Jan 20 18:32:37 crc kubenswrapper[4773]: I0120 18:32:37.902712 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-x95ml" podUID="00a9d467-1154-4eae-b1e5-19dfbb214a80" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 20 18:32:38 crc kubenswrapper[4773]: I0120 18:32:38.002429 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-c8jjn"] Jan 20 18:32:38 crc 
kubenswrapper[4773]: I0120 18:32:38.024862 4773 generic.go:334] "Generic (PLEG): container finished" podID="074f367d-7a48-4046-a679-9a2d38111b8a" containerID="1f220958033235a6f9fe2c2b2ebf17e5764f53dc8958a6ac265dd5f47e11eb7e" exitCode=0 Jan 20 18:32:38 crc kubenswrapper[4773]: I0120 18:32:38.024968 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v75d6" event={"ID":"074f367d-7a48-4046-a679-9a2d38111b8a","Type":"ContainerDied","Data":"1f220958033235a6f9fe2c2b2ebf17e5764f53dc8958a6ac265dd5f47e11eb7e"} Jan 20 18:32:38 crc kubenswrapper[4773]: I0120 18:32:38.025002 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v75d6" event={"ID":"074f367d-7a48-4046-a679-9a2d38111b8a","Type":"ContainerStarted","Data":"f57839f90df36cf23471ecda170b0c2440316e257ee6cca520a35c728d5b16de"} Jan 20 18:32:38 crc kubenswrapper[4773]: I0120 18:32:38.026685 4773 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 20 18:32:38 crc kubenswrapper[4773]: I0120 18:32:38.028135 4773 generic.go:334] "Generic (PLEG): container finished" podID="a6759422-151d-4228-b7c7-848c3008fb52" containerID="ad0fe037ab9b571978fa707e8f0256764da9f94c8eebd275377cc7ce8fab608d" exitCode=0 Jan 20 18:32:38 crc kubenswrapper[4773]: I0120 18:32:38.028180 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6lqws" event={"ID":"a6759422-151d-4228-b7c7-848c3008fb52","Type":"ContainerDied","Data":"ad0fe037ab9b571978fa707e8f0256764da9f94c8eebd275377cc7ce8fab608d"} Jan 20 18:32:38 crc kubenswrapper[4773]: I0120 18:32:38.028197 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6lqws" event={"ID":"a6759422-151d-4228-b7c7-848c3008fb52","Type":"ContainerStarted","Data":"c471e82ddefc652e5f04653639951233ffc24c82258453c65af9fd4352ec0b51"} Jan 20 18:32:38 crc kubenswrapper[4773]: I0120 
18:32:38.033800 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-w8vpz" event={"ID":"c5d6700e-54f1-4f09-83d7-e85f66af8c85","Type":"ContainerStarted","Data":"a9898240f848cc033229fced45abd6a80768873178710d5a8fc7ce6b46376824"} Jan 20 18:32:38 crc kubenswrapper[4773]: I0120 18:32:38.038171 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" event={"ID":"f751520b-bf3d-4226-8850-4b3346c43a6f","Type":"ContainerStarted","Data":"cc2e80ac4554c3a93472ea7caaec60741425f26f2058c6b98e87e53a2d2665aa"} Jan 20 18:32:38 crc kubenswrapper[4773]: I0120 18:32:38.038239 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" event={"ID":"f751520b-bf3d-4226-8850-4b3346c43a6f","Type":"ContainerStarted","Data":"4717aabd05ca8421c098accb226b89152753529be1fa867b484287b5c5a81ae7"} Jan 20 18:32:38 crc kubenswrapper[4773]: I0120 18:32:38.038469 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:32:38 crc kubenswrapper[4773]: I0120 18:32:38.043341 4773 generic.go:334] "Generic (PLEG): container finished" podID="8923f3c0-0b58-4097-aa87-9df34cf90e41" containerID="2fb65d95b1dd9e1def202549dcf0c536be64e92ad04a8874773fb7a70a7be1b9" exitCode=0 Jan 20 18:32:38 crc kubenswrapper[4773]: I0120 18:32:38.044885 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2qdpl" event={"ID":"8923f3c0-0b58-4097-aa87-9df34cf90e41","Type":"ContainerDied","Data":"2fb65d95b1dd9e1def202549dcf0c536be64e92ad04a8874773fb7a70a7be1b9"} Jan 20 18:32:38 crc kubenswrapper[4773]: I0120 18:32:38.044910 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2qdpl" 
event={"ID":"8923f3c0-0b58-4097-aa87-9df34cf90e41","Type":"ContainerStarted","Data":"47261c669f243247e3360eb031003a9925a21eac9889414fbe72f5ed85389a71"} Jan 20 18:32:38 crc kubenswrapper[4773]: I0120 18:32:38.056067 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-7zslm" Jan 20 18:32:38 crc kubenswrapper[4773]: I0120 18:32:38.082513 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" podStartSLOduration=132.08249441 podStartE2EDuration="2m12.08249441s" podCreationTimestamp="2026-01-20 18:30:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:32:38.07920919 +0000 UTC m=+151.001022214" watchObservedRunningTime="2026-01-20 18:32:38.08249441 +0000 UTC m=+151.004307434" Jan 20 18:32:38 crc kubenswrapper[4773]: W0120 18:32:38.105426 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48e32f25_29eb_4ef0_892b_0da316c47e3d.slice/crio-6b054655660c22424d85e72874df79577e584d37d89d639cfdd49a06b904b839 WatchSource:0}: Error finding container 6b054655660c22424d85e72874df79577e584d37d89d639cfdd49a06b904b839: Status 404 returned error can't find the container with id 6b054655660c22424d85e72874df79577e584d37d89d639cfdd49a06b904b839 Jan 20 18:32:38 crc kubenswrapper[4773]: I0120 18:32:38.266787 4773 patch_prober.go:28] interesting pod/downloads-7954f5f757-bkbfc container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body= Jan 20 18:32:38 crc kubenswrapper[4773]: I0120 18:32:38.267167 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-bkbfc" 
podUID="fec9cba4-b7cb-46ca-90a4-af0d5114fee8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" Jan 20 18:32:38 crc kubenswrapper[4773]: I0120 18:32:38.270682 4773 patch_prober.go:28] interesting pod/downloads-7954f5f757-bkbfc container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" start-of-body= Jan 20 18:32:38 crc kubenswrapper[4773]: I0120 18:32:38.270712 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-bkbfc" podUID="fec9cba4-b7cb-46ca-90a4-af0d5114fee8" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.21:8080/\": dial tcp 10.217.0.21:8080: connect: connection refused" Jan 20 18:32:38 crc kubenswrapper[4773]: I0120 18:32:38.442345 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29482230-pqppn" Jan 20 18:32:38 crc kubenswrapper[4773]: I0120 18:32:38.577426 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/007a1e5a-0e90-44d1-b19d-e92154fb6a3d-config-volume\") pod \"007a1e5a-0e90-44d1-b19d-e92154fb6a3d\" (UID: \"007a1e5a-0e90-44d1-b19d-e92154fb6a3d\") " Jan 20 18:32:38 crc kubenswrapper[4773]: I0120 18:32:38.577514 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/007a1e5a-0e90-44d1-b19d-e92154fb6a3d-secret-volume\") pod \"007a1e5a-0e90-44d1-b19d-e92154fb6a3d\" (UID: \"007a1e5a-0e90-44d1-b19d-e92154fb6a3d\") " Jan 20 18:32:38 crc kubenswrapper[4773]: I0120 18:32:38.577540 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zxslf\" (UniqueName: \"kubernetes.io/projected/007a1e5a-0e90-44d1-b19d-e92154fb6a3d-kube-api-access-zxslf\") pod \"007a1e5a-0e90-44d1-b19d-e92154fb6a3d\" (UID: \"007a1e5a-0e90-44d1-b19d-e92154fb6a3d\") " Jan 20 18:32:38 crc kubenswrapper[4773]: I0120 18:32:38.579941 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/007a1e5a-0e90-44d1-b19d-e92154fb6a3d-config-volume" (OuterVolumeSpecName: "config-volume") pod "007a1e5a-0e90-44d1-b19d-e92154fb6a3d" (UID: "007a1e5a-0e90-44d1-b19d-e92154fb6a3d"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:32:38 crc kubenswrapper[4773]: I0120 18:32:38.583808 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/007a1e5a-0e90-44d1-b19d-e92154fb6a3d-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "007a1e5a-0e90-44d1-b19d-e92154fb6a3d" (UID: "007a1e5a-0e90-44d1-b19d-e92154fb6a3d"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:32:38 crc kubenswrapper[4773]: I0120 18:32:38.583924 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/007a1e5a-0e90-44d1-b19d-e92154fb6a3d-kube-api-access-zxslf" (OuterVolumeSpecName: "kube-api-access-zxslf") pod "007a1e5a-0e90-44d1-b19d-e92154fb6a3d" (UID: "007a1e5a-0e90-44d1-b19d-e92154fb6a3d"). InnerVolumeSpecName "kube-api-access-zxslf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:32:38 crc kubenswrapper[4773]: I0120 18:32:38.679610 4773 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/007a1e5a-0e90-44d1-b19d-e92154fb6a3d-config-volume\") on node \"crc\" DevicePath \"\"" Jan 20 18:32:38 crc kubenswrapper[4773]: I0120 18:32:38.679655 4773 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/007a1e5a-0e90-44d1-b19d-e92154fb6a3d-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 20 18:32:38 crc kubenswrapper[4773]: I0120 18:32:38.679674 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zxslf\" (UniqueName: \"kubernetes.io/projected/007a1e5a-0e90-44d1-b19d-e92154fb6a3d-kube-api-access-zxslf\") on node \"crc\" DevicePath \"\"" Jan 20 18:32:38 crc kubenswrapper[4773]: I0120 18:32:38.901045 4773 patch_prober.go:28] interesting pod/router-default-5444994796-x95ml container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 20 18:32:38 crc kubenswrapper[4773]: [-]has-synced failed: reason withheld Jan 20 18:32:38 crc kubenswrapper[4773]: [+]process-running ok Jan 20 18:32:38 crc kubenswrapper[4773]: healthz check failed Jan 20 18:32:38 crc kubenswrapper[4773]: I0120 18:32:38.901107 4773 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-5444994796-x95ml" podUID="00a9d467-1154-4eae-b1e5-19dfbb214a80" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 20 18:32:38 crc kubenswrapper[4773]: I0120 18:32:38.941580 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-w4bcd"] Jan 20 18:32:38 crc kubenswrapper[4773]: E0120 18:32:38.942026 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="007a1e5a-0e90-44d1-b19d-e92154fb6a3d" containerName="collect-profiles" Jan 20 18:32:38 crc kubenswrapper[4773]: I0120 18:32:38.942040 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="007a1e5a-0e90-44d1-b19d-e92154fb6a3d" containerName="collect-profiles" Jan 20 18:32:38 crc kubenswrapper[4773]: I0120 18:32:38.942197 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="007a1e5a-0e90-44d1-b19d-e92154fb6a3d" containerName="collect-profiles" Jan 20 18:32:38 crc kubenswrapper[4773]: I0120 18:32:38.943138 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w4bcd" Jan 20 18:32:38 crc kubenswrapper[4773]: I0120 18:32:38.946103 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Jan 20 18:32:38 crc kubenswrapper[4773]: I0120 18:32:38.957567 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-w4bcd"] Jan 20 18:32:39 crc kubenswrapper[4773]: I0120 18:32:39.050767 4773 generic.go:334] "Generic (PLEG): container finished" podID="48e32f25-29eb-4ef0-892b-0da316c47e3d" containerID="40513f62f7dee095227f397c99bc022e2d2616f999a31014e1b18abdc02ed257" exitCode=0 Jan 20 18:32:39 crc kubenswrapper[4773]: I0120 18:32:39.050841 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c8jjn" event={"ID":"48e32f25-29eb-4ef0-892b-0da316c47e3d","Type":"ContainerDied","Data":"40513f62f7dee095227f397c99bc022e2d2616f999a31014e1b18abdc02ed257"} Jan 20 18:32:39 crc kubenswrapper[4773]: I0120 18:32:39.050925 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c8jjn" event={"ID":"48e32f25-29eb-4ef0-892b-0da316c47e3d","Type":"ContainerStarted","Data":"6b054655660c22424d85e72874df79577e584d37d89d639cfdd49a06b904b839"} Jan 20 18:32:39 crc kubenswrapper[4773]: I0120 18:32:39.053459 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29482230-pqppn" event={"ID":"007a1e5a-0e90-44d1-b19d-e92154fb6a3d","Type":"ContainerDied","Data":"8177b1b86470c47017cdac6a443fe56f399d9dcba9662f954722c87a2522aa29"} Jan 20 18:32:39 crc kubenswrapper[4773]: I0120 18:32:39.053507 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8177b1b86470c47017cdac6a443fe56f399d9dcba9662f954722c87a2522aa29" Jan 20 18:32:39 crc kubenswrapper[4773]: I0120 18:32:39.053514 4773 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29482230-pqppn" Jan 20 18:32:39 crc kubenswrapper[4773]: I0120 18:32:39.085519 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dppcc\" (UniqueName: \"kubernetes.io/projected/c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2-kube-api-access-dppcc\") pod \"redhat-marketplace-w4bcd\" (UID: \"c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2\") " pod="openshift-marketplace/redhat-marketplace-w4bcd" Jan 20 18:32:39 crc kubenswrapper[4773]: I0120 18:32:39.085566 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2-catalog-content\") pod \"redhat-marketplace-w4bcd\" (UID: \"c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2\") " pod="openshift-marketplace/redhat-marketplace-w4bcd" Jan 20 18:32:39 crc kubenswrapper[4773]: I0120 18:32:39.085591 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2-utilities\") pod \"redhat-marketplace-w4bcd\" (UID: \"c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2\") " pod="openshift-marketplace/redhat-marketplace-w4bcd" Jan 20 18:32:39 crc kubenswrapper[4773]: I0120 18:32:39.186969 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dppcc\" (UniqueName: \"kubernetes.io/projected/c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2-kube-api-access-dppcc\") pod \"redhat-marketplace-w4bcd\" (UID: \"c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2\") " pod="openshift-marketplace/redhat-marketplace-w4bcd" Jan 20 18:32:39 crc kubenswrapper[4773]: I0120 18:32:39.187391 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2-catalog-content\") pod \"redhat-marketplace-w4bcd\" (UID: \"c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2\") " pod="openshift-marketplace/redhat-marketplace-w4bcd" Jan 20 18:32:39 crc kubenswrapper[4773]: I0120 18:32:39.187673 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2-utilities\") pod \"redhat-marketplace-w4bcd\" (UID: \"c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2\") " pod="openshift-marketplace/redhat-marketplace-w4bcd" Jan 20 18:32:39 crc kubenswrapper[4773]: I0120 18:32:39.188032 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2-utilities\") pod \"redhat-marketplace-w4bcd\" (UID: \"c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2\") " pod="openshift-marketplace/redhat-marketplace-w4bcd" Jan 20 18:32:39 crc kubenswrapper[4773]: I0120 18:32:39.188027 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2-catalog-content\") pod \"redhat-marketplace-w4bcd\" (UID: \"c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2\") " pod="openshift-marketplace/redhat-marketplace-w4bcd" Jan 20 18:32:39 crc kubenswrapper[4773]: I0120 18:32:39.213761 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dppcc\" (UniqueName: \"kubernetes.io/projected/c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2-kube-api-access-dppcc\") pod \"redhat-marketplace-w4bcd\" (UID: \"c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2\") " pod="openshift-marketplace/redhat-marketplace-w4bcd" Jan 20 18:32:39 crc kubenswrapper[4773]: I0120 18:32:39.259954 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w4bcd" Jan 20 18:32:39 crc kubenswrapper[4773]: I0120 18:32:39.348835 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-wmbvt"] Jan 20 18:32:39 crc kubenswrapper[4773]: I0120 18:32:39.349960 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wmbvt" Jan 20 18:32:39 crc kubenswrapper[4773]: I0120 18:32:39.369318 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wmbvt"] Jan 20 18:32:39 crc kubenswrapper[4773]: I0120 18:32:39.433196 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-9nh6h" Jan 20 18:32:39 crc kubenswrapper[4773]: I0120 18:32:39.434122 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-9nh6h" Jan 20 18:32:39 crc kubenswrapper[4773]: I0120 18:32:39.437150 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k2x4p" Jan 20 18:32:39 crc kubenswrapper[4773]: I0120 18:32:39.437833 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k2x4p" Jan 20 18:32:39 crc kubenswrapper[4773]: I0120 18:32:39.438143 4773 patch_prober.go:28] interesting pod/console-f9d7485db-9nh6h container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.19:8443/health\": dial tcp 10.217.0.19:8443: connect: connection refused" start-of-body= Jan 20 18:32:39 crc kubenswrapper[4773]: I0120 18:32:39.438219 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-9nh6h" podUID="ba3736bb-3d36-4a0c-91fa-85f410849312" containerName="console" probeResult="failure" output="Get \"https://10.217.0.19:8443/health\": 
dial tcp 10.217.0.19:8443: connect: connection refused" Jan 20 18:32:39 crc kubenswrapper[4773]: I0120 18:32:39.458581 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k2x4p" Jan 20 18:32:39 crc kubenswrapper[4773]: I0120 18:32:39.493109 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9wnc\" (UniqueName: \"kubernetes.io/projected/86872811-c0ef-45cc-949a-f88b07fca9b3-kube-api-access-s9wnc\") pod \"redhat-marketplace-wmbvt\" (UID: \"86872811-c0ef-45cc-949a-f88b07fca9b3\") " pod="openshift-marketplace/redhat-marketplace-wmbvt" Jan 20 18:32:39 crc kubenswrapper[4773]: I0120 18:32:39.493333 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86872811-c0ef-45cc-949a-f88b07fca9b3-utilities\") pod \"redhat-marketplace-wmbvt\" (UID: \"86872811-c0ef-45cc-949a-f88b07fca9b3\") " pod="openshift-marketplace/redhat-marketplace-wmbvt" Jan 20 18:32:39 crc kubenswrapper[4773]: I0120 18:32:39.493389 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86872811-c0ef-45cc-949a-f88b07fca9b3-catalog-content\") pod \"redhat-marketplace-wmbvt\" (UID: \"86872811-c0ef-45cc-949a-f88b07fca9b3\") " pod="openshift-marketplace/redhat-marketplace-wmbvt" Jan 20 18:32:39 crc kubenswrapper[4773]: I0120 18:32:39.595578 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9wnc\" (UniqueName: \"kubernetes.io/projected/86872811-c0ef-45cc-949a-f88b07fca9b3-kube-api-access-s9wnc\") pod \"redhat-marketplace-wmbvt\" (UID: \"86872811-c0ef-45cc-949a-f88b07fca9b3\") " pod="openshift-marketplace/redhat-marketplace-wmbvt" Jan 20 18:32:39 crc kubenswrapper[4773]: I0120 18:32:39.595681 4773 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86872811-c0ef-45cc-949a-f88b07fca9b3-utilities\") pod \"redhat-marketplace-wmbvt\" (UID: \"86872811-c0ef-45cc-949a-f88b07fca9b3\") " pod="openshift-marketplace/redhat-marketplace-wmbvt" Jan 20 18:32:39 crc kubenswrapper[4773]: I0120 18:32:39.595716 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86872811-c0ef-45cc-949a-f88b07fca9b3-catalog-content\") pod \"redhat-marketplace-wmbvt\" (UID: \"86872811-c0ef-45cc-949a-f88b07fca9b3\") " pod="openshift-marketplace/redhat-marketplace-wmbvt" Jan 20 18:32:39 crc kubenswrapper[4773]: I0120 18:32:39.596285 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86872811-c0ef-45cc-949a-f88b07fca9b3-utilities\") pod \"redhat-marketplace-wmbvt\" (UID: \"86872811-c0ef-45cc-949a-f88b07fca9b3\") " pod="openshift-marketplace/redhat-marketplace-wmbvt" Jan 20 18:32:39 crc kubenswrapper[4773]: I0120 18:32:39.596313 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86872811-c0ef-45cc-949a-f88b07fca9b3-catalog-content\") pod \"redhat-marketplace-wmbvt\" (UID: \"86872811-c0ef-45cc-949a-f88b07fca9b3\") " pod="openshift-marketplace/redhat-marketplace-wmbvt" Jan 20 18:32:39 crc kubenswrapper[4773]: I0120 18:32:39.649983 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9wnc\" (UniqueName: \"kubernetes.io/projected/86872811-c0ef-45cc-949a-f88b07fca9b3-kube-api-access-s9wnc\") pod \"redhat-marketplace-wmbvt\" (UID: \"86872811-c0ef-45cc-949a-f88b07fca9b3\") " pod="openshift-marketplace/redhat-marketplace-wmbvt" Jan 20 18:32:39 crc kubenswrapper[4773]: I0120 18:32:39.713202 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wmbvt" Jan 20 18:32:39 crc kubenswrapper[4773]: I0120 18:32:39.747597 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-w4bcd"] Jan 20 18:32:39 crc kubenswrapper[4773]: W0120 18:32:39.796581 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9e2e310_70dc_4eb9_94f3_2e4466e2b7d2.slice/crio-259bcd091e35535624126a0c051a63c6b1167732a3459de7fa71f75dfa10b627 WatchSource:0}: Error finding container 259bcd091e35535624126a0c051a63c6b1167732a3459de7fa71f75dfa10b627: Status 404 returned error can't find the container with id 259bcd091e35535624126a0c051a63c6b1167732a3459de7fa71f75dfa10b627 Jan 20 18:32:39 crc kubenswrapper[4773]: I0120 18:32:39.895501 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-x95ml" Jan 20 18:32:39 crc kubenswrapper[4773]: I0120 18:32:39.901420 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-x95ml" Jan 20 18:32:39 crc kubenswrapper[4773]: I0120 18:32:39.950609 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fm4ln"] Jan 20 18:32:39 crc kubenswrapper[4773]: I0120 18:32:39.951606 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fm4ln" Jan 20 18:32:39 crc kubenswrapper[4773]: I0120 18:32:39.954863 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Jan 20 18:32:39 crc kubenswrapper[4773]: I0120 18:32:39.960922 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fm4ln"] Jan 20 18:32:39 crc kubenswrapper[4773]: I0120 18:32:39.963788 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-ff9dd" Jan 20 18:32:40 crc kubenswrapper[4773]: I0120 18:32:40.017508 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzmpk\" (UniqueName: \"kubernetes.io/projected/e5fd624a-2fa6-4887-83e0-779057846c71-kube-api-access-vzmpk\") pod \"redhat-operators-fm4ln\" (UID: \"e5fd624a-2fa6-4887-83e0-779057846c71\") " pod="openshift-marketplace/redhat-operators-fm4ln" Jan 20 18:32:40 crc kubenswrapper[4773]: I0120 18:32:40.017588 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5fd624a-2fa6-4887-83e0-779057846c71-catalog-content\") pod \"redhat-operators-fm4ln\" (UID: \"e5fd624a-2fa6-4887-83e0-779057846c71\") " pod="openshift-marketplace/redhat-operators-fm4ln" Jan 20 18:32:40 crc kubenswrapper[4773]: I0120 18:32:40.017656 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5fd624a-2fa6-4887-83e0-779057846c71-utilities\") pod \"redhat-operators-fm4ln\" (UID: \"e5fd624a-2fa6-4887-83e0-779057846c71\") " pod="openshift-marketplace/redhat-operators-fm4ln" Jan 20 18:32:40 crc kubenswrapper[4773]: I0120 18:32:40.076148 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-w4bcd" event={"ID":"c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2","Type":"ContainerStarted","Data":"259bcd091e35535624126a0c051a63c6b1167732a3459de7fa71f75dfa10b627"} Jan 20 18:32:40 crc kubenswrapper[4773]: I0120 18:32:40.082882 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-k2x4p" Jan 20 18:32:40 crc kubenswrapper[4773]: I0120 18:32:40.087455 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-x95ml" Jan 20 18:32:40 crc kubenswrapper[4773]: I0120 18:32:40.105148 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wmbvt"] Jan 20 18:32:40 crc kubenswrapper[4773]: I0120 18:32:40.119389 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5fd624a-2fa6-4887-83e0-779057846c71-catalog-content\") pod \"redhat-operators-fm4ln\" (UID: \"e5fd624a-2fa6-4887-83e0-779057846c71\") " pod="openshift-marketplace/redhat-operators-fm4ln" Jan 20 18:32:40 crc kubenswrapper[4773]: I0120 18:32:40.119504 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5fd624a-2fa6-4887-83e0-779057846c71-utilities\") pod \"redhat-operators-fm4ln\" (UID: \"e5fd624a-2fa6-4887-83e0-779057846c71\") " pod="openshift-marketplace/redhat-operators-fm4ln" Jan 20 18:32:40 crc kubenswrapper[4773]: I0120 18:32:40.119600 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vzmpk\" (UniqueName: \"kubernetes.io/projected/e5fd624a-2fa6-4887-83e0-779057846c71-kube-api-access-vzmpk\") pod \"redhat-operators-fm4ln\" (UID: \"e5fd624a-2fa6-4887-83e0-779057846c71\") " pod="openshift-marketplace/redhat-operators-fm4ln" Jan 20 18:32:40 crc kubenswrapper[4773]: I0120 
18:32:40.120607 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5fd624a-2fa6-4887-83e0-779057846c71-utilities\") pod \"redhat-operators-fm4ln\" (UID: \"e5fd624a-2fa6-4887-83e0-779057846c71\") " pod="openshift-marketplace/redhat-operators-fm4ln" Jan 20 18:32:40 crc kubenswrapper[4773]: I0120 18:32:40.120614 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5fd624a-2fa6-4887-83e0-779057846c71-catalog-content\") pod \"redhat-operators-fm4ln\" (UID: \"e5fd624a-2fa6-4887-83e0-779057846c71\") " pod="openshift-marketplace/redhat-operators-fm4ln" Jan 20 18:32:40 crc kubenswrapper[4773]: I0120 18:32:40.152275 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzmpk\" (UniqueName: \"kubernetes.io/projected/e5fd624a-2fa6-4887-83e0-779057846c71-kube-api-access-vzmpk\") pod \"redhat-operators-fm4ln\" (UID: \"e5fd624a-2fa6-4887-83e0-779057846c71\") " pod="openshift-marketplace/redhat-operators-fm4ln" Jan 20 18:32:40 crc kubenswrapper[4773]: W0120 18:32:40.198081 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86872811_c0ef_45cc_949a_f88b07fca9b3.slice/crio-a50cda98715d09fbf90414a684934d7ed74aa3bda7711266d5d7491e5d03b79c WatchSource:0}: Error finding container a50cda98715d09fbf90414a684934d7ed74aa3bda7711266d5d7491e5d03b79c: Status 404 returned error can't find the container with id a50cda98715d09fbf90414a684934d7ed74aa3bda7711266d5d7491e5d03b79c Jan 20 18:32:40 crc kubenswrapper[4773]: I0120 18:32:40.296290 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fm4ln" Jan 20 18:32:40 crc kubenswrapper[4773]: I0120 18:32:40.363522 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-kxsfk"] Jan 20 18:32:40 crc kubenswrapper[4773]: I0120 18:32:40.364619 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kxsfk" Jan 20 18:32:40 crc kubenswrapper[4773]: I0120 18:32:40.412890 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kxsfk"] Jan 20 18:32:40 crc kubenswrapper[4773]: I0120 18:32:40.445434 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txldk\" (UniqueName: \"kubernetes.io/projected/c19dbd84-8fec-4998-b2ae-65c68dee6b17-kube-api-access-txldk\") pod \"redhat-operators-kxsfk\" (UID: \"c19dbd84-8fec-4998-b2ae-65c68dee6b17\") " pod="openshift-marketplace/redhat-operators-kxsfk" Jan 20 18:32:40 crc kubenswrapper[4773]: I0120 18:32:40.446067 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c19dbd84-8fec-4998-b2ae-65c68dee6b17-catalog-content\") pod \"redhat-operators-kxsfk\" (UID: \"c19dbd84-8fec-4998-b2ae-65c68dee6b17\") " pod="openshift-marketplace/redhat-operators-kxsfk" Jan 20 18:32:40 crc kubenswrapper[4773]: I0120 18:32:40.446091 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c19dbd84-8fec-4998-b2ae-65c68dee6b17-utilities\") pod \"redhat-operators-kxsfk\" (UID: \"c19dbd84-8fec-4998-b2ae-65c68dee6b17\") " pod="openshift-marketplace/redhat-operators-kxsfk" Jan 20 18:32:40 crc kubenswrapper[4773]: I0120 18:32:40.471702 4773 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Jan 20 18:32:40 crc kubenswrapper[4773]: I0120 18:32:40.472583 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 20 18:32:40 crc kubenswrapper[4773]: I0120 18:32:40.475133 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Jan 20 18:32:40 crc kubenswrapper[4773]: I0120 18:32:40.481570 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Jan 20 18:32:40 crc kubenswrapper[4773]: I0120 18:32:40.518254 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Jan 20 18:32:40 crc kubenswrapper[4773]: I0120 18:32:40.554033 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txldk\" (UniqueName: \"kubernetes.io/projected/c19dbd84-8fec-4998-b2ae-65c68dee6b17-kube-api-access-txldk\") pod \"redhat-operators-kxsfk\" (UID: \"c19dbd84-8fec-4998-b2ae-65c68dee6b17\") " pod="openshift-marketplace/redhat-operators-kxsfk" Jan 20 18:32:40 crc kubenswrapper[4773]: I0120 18:32:40.554134 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c19dbd84-8fec-4998-b2ae-65c68dee6b17-catalog-content\") pod \"redhat-operators-kxsfk\" (UID: \"c19dbd84-8fec-4998-b2ae-65c68dee6b17\") " pod="openshift-marketplace/redhat-operators-kxsfk" Jan 20 18:32:40 crc kubenswrapper[4773]: I0120 18:32:40.554161 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f2c577af-d1a7-40f0-99be-340543255117-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"f2c577af-d1a7-40f0-99be-340543255117\") " 
pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 20 18:32:40 crc kubenswrapper[4773]: I0120 18:32:40.554177 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c19dbd84-8fec-4998-b2ae-65c68dee6b17-utilities\") pod \"redhat-operators-kxsfk\" (UID: \"c19dbd84-8fec-4998-b2ae-65c68dee6b17\") " pod="openshift-marketplace/redhat-operators-kxsfk" Jan 20 18:32:40 crc kubenswrapper[4773]: I0120 18:32:40.554197 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f2c577af-d1a7-40f0-99be-340543255117-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"f2c577af-d1a7-40f0-99be-340543255117\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 20 18:32:40 crc kubenswrapper[4773]: I0120 18:32:40.556359 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c19dbd84-8fec-4998-b2ae-65c68dee6b17-catalog-content\") pod \"redhat-operators-kxsfk\" (UID: \"c19dbd84-8fec-4998-b2ae-65c68dee6b17\") " pod="openshift-marketplace/redhat-operators-kxsfk" Jan 20 18:32:40 crc kubenswrapper[4773]: I0120 18:32:40.556373 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c19dbd84-8fec-4998-b2ae-65c68dee6b17-utilities\") pod \"redhat-operators-kxsfk\" (UID: \"c19dbd84-8fec-4998-b2ae-65c68dee6b17\") " pod="openshift-marketplace/redhat-operators-kxsfk" Jan 20 18:32:40 crc kubenswrapper[4773]: I0120 18:32:40.577298 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txldk\" (UniqueName: \"kubernetes.io/projected/c19dbd84-8fec-4998-b2ae-65c68dee6b17-kube-api-access-txldk\") pod \"redhat-operators-kxsfk\" (UID: \"c19dbd84-8fec-4998-b2ae-65c68dee6b17\") " 
pod="openshift-marketplace/redhat-operators-kxsfk" Jan 20 18:32:40 crc kubenswrapper[4773]: I0120 18:32:40.658113 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f2c577af-d1a7-40f0-99be-340543255117-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"f2c577af-d1a7-40f0-99be-340543255117\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 20 18:32:40 crc kubenswrapper[4773]: I0120 18:32:40.658206 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f2c577af-d1a7-40f0-99be-340543255117-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"f2c577af-d1a7-40f0-99be-340543255117\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 20 18:32:40 crc kubenswrapper[4773]: I0120 18:32:40.659484 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f2c577af-d1a7-40f0-99be-340543255117-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"f2c577af-d1a7-40f0-99be-340543255117\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 20 18:32:40 crc kubenswrapper[4773]: I0120 18:32:40.677612 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f2c577af-d1a7-40f0-99be-340543255117-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"f2c577af-d1a7-40f0-99be-340543255117\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 20 18:32:40 crc kubenswrapper[4773]: I0120 18:32:40.698914 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kxsfk" Jan 20 18:32:40 crc kubenswrapper[4773]: I0120 18:32:40.834816 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 20 18:32:40 crc kubenswrapper[4773]: I0120 18:32:40.888304 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fm4ln"] Jan 20 18:32:40 crc kubenswrapper[4773]: W0120 18:32:40.898605 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5fd624a_2fa6_4887_83e0_779057846c71.slice/crio-d8d891af7a0ae24346edc8a46d303844186b0bedf95160cce185884dab78b333 WatchSource:0}: Error finding container d8d891af7a0ae24346edc8a46d303844186b0bedf95160cce185884dab78b333: Status 404 returned error can't find the container with id d8d891af7a0ae24346edc8a46d303844186b0bedf95160cce185884dab78b333 Jan 20 18:32:41 crc kubenswrapper[4773]: I0120 18:32:41.094942 4773 generic.go:334] "Generic (PLEG): container finished" podID="86872811-c0ef-45cc-949a-f88b07fca9b3" containerID="0ed7d4e03be8e4e7e30eec67a848d197be3f010293512ae29f5c3f5788e52545" exitCode=0 Jan 20 18:32:41 crc kubenswrapper[4773]: I0120 18:32:41.095223 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wmbvt" event={"ID":"86872811-c0ef-45cc-949a-f88b07fca9b3","Type":"ContainerDied","Data":"0ed7d4e03be8e4e7e30eec67a848d197be3f010293512ae29f5c3f5788e52545"} Jan 20 18:32:41 crc kubenswrapper[4773]: I0120 18:32:41.095541 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wmbvt" event={"ID":"86872811-c0ef-45cc-949a-f88b07fca9b3","Type":"ContainerStarted","Data":"a50cda98715d09fbf90414a684934d7ed74aa3bda7711266d5d7491e5d03b79c"} Jan 20 18:32:41 crc kubenswrapper[4773]: I0120 18:32:41.101769 4773 generic.go:334] "Generic (PLEG): container finished" podID="c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2" containerID="ca09fdf16fb11b2b53675f7f94bd271507c5f87bdae39648e8c76e9ebf18f6ca" exitCode=0 Jan 20 18:32:41 crc kubenswrapper[4773]: I0120 
18:32:41.101840 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w4bcd" event={"ID":"c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2","Type":"ContainerDied","Data":"ca09fdf16fb11b2b53675f7f94bd271507c5f87bdae39648e8c76e9ebf18f6ca"} Jan 20 18:32:41 crc kubenswrapper[4773]: I0120 18:32:41.105795 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fm4ln" event={"ID":"e5fd624a-2fa6-4887-83e0-779057846c71","Type":"ContainerStarted","Data":"d8d891af7a0ae24346edc8a46d303844186b0bedf95160cce185884dab78b333"} Jan 20 18:32:41 crc kubenswrapper[4773]: I0120 18:32:41.176680 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kxsfk"] Jan 20 18:32:41 crc kubenswrapper[4773]: I0120 18:32:41.444292 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Jan 20 18:32:42 crc kubenswrapper[4773]: I0120 18:32:42.122258 4773 generic.go:334] "Generic (PLEG): container finished" podID="c19dbd84-8fec-4998-b2ae-65c68dee6b17" containerID="7b03e6ef28eada3e366c7e455756b7d0d82d0920656be25f02d2e3fd00ca9cd3" exitCode=0 Jan 20 18:32:42 crc kubenswrapper[4773]: I0120 18:32:42.122376 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kxsfk" event={"ID":"c19dbd84-8fec-4998-b2ae-65c68dee6b17","Type":"ContainerDied","Data":"7b03e6ef28eada3e366c7e455756b7d0d82d0920656be25f02d2e3fd00ca9cd3"} Jan 20 18:32:42 crc kubenswrapper[4773]: I0120 18:32:42.122953 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kxsfk" event={"ID":"c19dbd84-8fec-4998-b2ae-65c68dee6b17","Type":"ContainerStarted","Data":"c69c9f7c333b88336518be53c3fff5b901d87e988f1ab9a73e541d27f61cbb78"} Jan 20 18:32:42 crc kubenswrapper[4773]: I0120 18:32:42.163504 4773 generic.go:334] "Generic (PLEG): container finished" 
podID="e5fd624a-2fa6-4887-83e0-779057846c71" containerID="bfd0647c31aa207aa6af51a11c6586e941d07ba3e5306f8224749c043019abee" exitCode=0 Jan 20 18:32:42 crc kubenswrapper[4773]: I0120 18:32:42.163647 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fm4ln" event={"ID":"e5fd624a-2fa6-4887-83e0-779057846c71","Type":"ContainerDied","Data":"bfd0647c31aa207aa6af51a11c6586e941d07ba3e5306f8224749c043019abee"} Jan 20 18:32:42 crc kubenswrapper[4773]: I0120 18:32:42.179690 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"f2c577af-d1a7-40f0-99be-340543255117","Type":"ContainerStarted","Data":"5e300ccf36d41fad9538e9de1df6b6580ae8c4a4328fbc7a160605c014450e23"} Jan 20 18:32:43 crc kubenswrapper[4773]: I0120 18:32:43.193260 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"f2c577af-d1a7-40f0-99be-340543255117","Type":"ContainerStarted","Data":"1813474b98755b6d4285fbe703e9f6c817d84d1be9dfe42206aaf32444a2447a"} Jan 20 18:32:43 crc kubenswrapper[4773]: I0120 18:32:43.207771 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=3.207735475 podStartE2EDuration="3.207735475s" podCreationTimestamp="2026-01-20 18:32:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:32:43.204465154 +0000 UTC m=+156.126278178" watchObservedRunningTime="2026-01-20 18:32:43.207735475 +0000 UTC m=+156.129548499" Jan 20 18:32:43 crc kubenswrapper[4773]: I0120 18:32:43.434293 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 20 18:32:43 crc kubenswrapper[4773]: I0120 18:32:43.435338 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 20 18:32:43 crc kubenswrapper[4773]: I0120 18:32:43.438129 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Jan 20 18:32:43 crc kubenswrapper[4773]: I0120 18:32:43.438374 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Jan 20 18:32:43 crc kubenswrapper[4773]: I0120 18:32:43.439009 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 20 18:32:43 crc kubenswrapper[4773]: I0120 18:32:43.558955 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e2bd2b40-586f-4a4a-94b9-8cbb9be4ec3d-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"e2bd2b40-586f-4a4a-94b9-8cbb9be4ec3d\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 20 18:32:43 crc kubenswrapper[4773]: I0120 18:32:43.559015 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e2bd2b40-586f-4a4a-94b9-8cbb9be4ec3d-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"e2bd2b40-586f-4a4a-94b9-8cbb9be4ec3d\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 20 18:32:43 crc kubenswrapper[4773]: I0120 18:32:43.660234 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e2bd2b40-586f-4a4a-94b9-8cbb9be4ec3d-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"e2bd2b40-586f-4a4a-94b9-8cbb9be4ec3d\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 20 18:32:43 crc kubenswrapper[4773]: I0120 18:32:43.660367 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/e2bd2b40-586f-4a4a-94b9-8cbb9be4ec3d-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"e2bd2b40-586f-4a4a-94b9-8cbb9be4ec3d\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 20 18:32:43 crc kubenswrapper[4773]: I0120 18:32:43.660382 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e2bd2b40-586f-4a4a-94b9-8cbb9be4ec3d-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"e2bd2b40-586f-4a4a-94b9-8cbb9be4ec3d\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 20 18:32:43 crc kubenswrapper[4773]: I0120 18:32:43.680837 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e2bd2b40-586f-4a4a-94b9-8cbb9be4ec3d-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"e2bd2b40-586f-4a4a-94b9-8cbb9be4ec3d\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 20 18:32:43 crc kubenswrapper[4773]: I0120 18:32:43.779464 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 20 18:32:44 crc kubenswrapper[4773]: I0120 18:32:44.126771 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 20 18:32:44 crc kubenswrapper[4773]: I0120 18:32:44.212870 4773 generic.go:334] "Generic (PLEG): container finished" podID="f2c577af-d1a7-40f0-99be-340543255117" containerID="1813474b98755b6d4285fbe703e9f6c817d84d1be9dfe42206aaf32444a2447a" exitCode=0 Jan 20 18:32:44 crc kubenswrapper[4773]: I0120 18:32:44.212945 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"f2c577af-d1a7-40f0-99be-340543255117","Type":"ContainerDied","Data":"1813474b98755b6d4285fbe703e9f6c817d84d1be9dfe42206aaf32444a2447a"} Jan 20 18:32:45 crc kubenswrapper[4773]: I0120 18:32:45.135580 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-7j8nw" Jan 20 18:32:45 crc kubenswrapper[4773]: I0120 18:32:45.250455 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"e2bd2b40-586f-4a4a-94b9-8cbb9be4ec3d","Type":"ContainerStarted","Data":"e5bf3229f1758a173d843e0afb161feaad40c4593219de0d8f49226d70963644"} Jan 20 18:32:45 crc kubenswrapper[4773]: I0120 18:32:45.250506 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"e2bd2b40-586f-4a4a-94b9-8cbb9be4ec3d","Type":"ContainerStarted","Data":"67bfb3a0380f856f094f0e1a7bc5ba8821aa008bcf260d7de7488aff580b4c46"} Jan 20 18:32:45 crc kubenswrapper[4773]: I0120 18:32:45.554552 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 20 18:32:45 crc kubenswrapper[4773]: I0120 18:32:45.724059 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f2c577af-d1a7-40f0-99be-340543255117-kubelet-dir\") pod \"f2c577af-d1a7-40f0-99be-340543255117\" (UID: \"f2c577af-d1a7-40f0-99be-340543255117\") " Jan 20 18:32:45 crc kubenswrapper[4773]: I0120 18:32:45.725155 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f2c577af-d1a7-40f0-99be-340543255117-kube-api-access\") pod \"f2c577af-d1a7-40f0-99be-340543255117\" (UID: \"f2c577af-d1a7-40f0-99be-340543255117\") " Jan 20 18:32:45 crc kubenswrapper[4773]: I0120 18:32:45.725266 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f2c577af-d1a7-40f0-99be-340543255117-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "f2c577af-d1a7-40f0-99be-340543255117" (UID: "f2c577af-d1a7-40f0-99be-340543255117"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 18:32:45 crc kubenswrapper[4773]: I0120 18:32:45.725786 4773 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f2c577af-d1a7-40f0-99be-340543255117-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 20 18:32:45 crc kubenswrapper[4773]: I0120 18:32:45.738735 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2c577af-d1a7-40f0-99be-340543255117-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "f2c577af-d1a7-40f0-99be-340543255117" (UID: "f2c577af-d1a7-40f0-99be-340543255117"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:32:45 crc kubenswrapper[4773]: I0120 18:32:45.832223 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f2c577af-d1a7-40f0-99be-340543255117-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 20 18:32:46 crc kubenswrapper[4773]: I0120 18:32:46.281047 4773 generic.go:334] "Generic (PLEG): container finished" podID="e2bd2b40-586f-4a4a-94b9-8cbb9be4ec3d" containerID="e5bf3229f1758a173d843e0afb161feaad40c4593219de0d8f49226d70963644" exitCode=0 Jan 20 18:32:46 crc kubenswrapper[4773]: I0120 18:32:46.281132 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"e2bd2b40-586f-4a4a-94b9-8cbb9be4ec3d","Type":"ContainerDied","Data":"e5bf3229f1758a173d843e0afb161feaad40c4593219de0d8f49226d70963644"} Jan 20 18:32:46 crc kubenswrapper[4773]: I0120 18:32:46.289053 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 20 18:32:46 crc kubenswrapper[4773]: I0120 18:32:46.289036 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"f2c577af-d1a7-40f0-99be-340543255117","Type":"ContainerDied","Data":"5e300ccf36d41fad9538e9de1df6b6580ae8c4a4328fbc7a160605c014450e23"} Jan 20 18:32:46 crc kubenswrapper[4773]: I0120 18:32:46.289198 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5e300ccf36d41fad9538e9de1df6b6580ae8c4a4328fbc7a160605c014450e23" Jan 20 18:32:48 crc kubenswrapper[4773]: I0120 18:32:48.280162 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-bkbfc" Jan 20 18:32:48 crc kubenswrapper[4773]: I0120 18:32:48.915704 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3791c4b7-dcef-470d-a67e-a2c0bb004436-metrics-certs\") pod \"network-metrics-daemon-4jpbd\" (UID: \"3791c4b7-dcef-470d-a67e-a2c0bb004436\") " pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:32:48 crc kubenswrapper[4773]: I0120 18:32:48.937973 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3791c4b7-dcef-470d-a67e-a2c0bb004436-metrics-certs\") pod \"network-metrics-daemon-4jpbd\" (UID: \"3791c4b7-dcef-470d-a67e-a2c0bb004436\") " pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:32:49 crc kubenswrapper[4773]: I0120 18:32:49.064914 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-4jpbd" Jan 20 18:32:49 crc kubenswrapper[4773]: I0120 18:32:49.437340 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-9nh6h" Jan 20 18:32:49 crc kubenswrapper[4773]: I0120 18:32:49.441758 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-9nh6h" Jan 20 18:32:54 crc kubenswrapper[4773]: I0120 18:32:54.317979 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-bjtnp"] Jan 20 18:32:54 crc kubenswrapper[4773]: I0120 18:32:54.318725 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-bjtnp" podUID="419120bb-3f1b-4f21-adf5-ac057bd5dce6" containerName="controller-manager" containerID="cri-o://0a049498de4a817a942a2b87af4cddb85094398aca02919519cc51c9033fb122" gracePeriod=30 Jan 20 18:32:54 crc kubenswrapper[4773]: I0120 18:32:54.345320 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-gnvxz"] Jan 20 18:32:54 crc kubenswrapper[4773]: I0120 18:32:54.345628 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gnvxz" podUID="1d49ef4e-91fb-4b98-89d9-65358c718967" containerName="route-controller-manager" containerID="cri-o://bdc540ba6443004408ae797654a9bce71e44b1154765597934dc3ae831e5cade" gracePeriod=30 Jan 20 18:32:54 crc kubenswrapper[4773]: I0120 18:32:54.368147 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"e2bd2b40-586f-4a4a-94b9-8cbb9be4ec3d","Type":"ContainerDied","Data":"67bfb3a0380f856f094f0e1a7bc5ba8821aa008bcf260d7de7488aff580b4c46"} Jan 20 18:32:54 crc kubenswrapper[4773]: 
I0120 18:32:54.368203 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="67bfb3a0380f856f094f0e1a7bc5ba8821aa008bcf260d7de7488aff580b4c46" Jan 20 18:32:54 crc kubenswrapper[4773]: I0120 18:32:54.396518 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 20 18:32:54 crc kubenswrapper[4773]: I0120 18:32:54.497214 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e2bd2b40-586f-4a4a-94b9-8cbb9be4ec3d-kubelet-dir\") pod \"e2bd2b40-586f-4a4a-94b9-8cbb9be4ec3d\" (UID: \"e2bd2b40-586f-4a4a-94b9-8cbb9be4ec3d\") " Jan 20 18:32:54 crc kubenswrapper[4773]: I0120 18:32:54.497263 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e2bd2b40-586f-4a4a-94b9-8cbb9be4ec3d-kube-api-access\") pod \"e2bd2b40-586f-4a4a-94b9-8cbb9be4ec3d\" (UID: \"e2bd2b40-586f-4a4a-94b9-8cbb9be4ec3d\") " Jan 20 18:32:54 crc kubenswrapper[4773]: I0120 18:32:54.497354 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e2bd2b40-586f-4a4a-94b9-8cbb9be4ec3d-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "e2bd2b40-586f-4a4a-94b9-8cbb9be4ec3d" (UID: "e2bd2b40-586f-4a4a-94b9-8cbb9be4ec3d"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 18:32:54 crc kubenswrapper[4773]: I0120 18:32:54.497582 4773 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e2bd2b40-586f-4a4a-94b9-8cbb9be4ec3d-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 20 18:32:55 crc kubenswrapper[4773]: I0120 18:32:55.375191 4773 generic.go:334] "Generic (PLEG): container finished" podID="1d49ef4e-91fb-4b98-89d9-65358c718967" containerID="bdc540ba6443004408ae797654a9bce71e44b1154765597934dc3ae831e5cade" exitCode=0 Jan 20 18:32:55 crc kubenswrapper[4773]: I0120 18:32:55.375280 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gnvxz" event={"ID":"1d49ef4e-91fb-4b98-89d9-65358c718967","Type":"ContainerDied","Data":"bdc540ba6443004408ae797654a9bce71e44b1154765597934dc3ae831e5cade"} Jan 20 18:32:55 crc kubenswrapper[4773]: I0120 18:32:55.376876 4773 generic.go:334] "Generic (PLEG): container finished" podID="419120bb-3f1b-4f21-adf5-ac057bd5dce6" containerID="0a049498de4a817a942a2b87af4cddb85094398aca02919519cc51c9033fb122" exitCode=0 Jan 20 18:32:55 crc kubenswrapper[4773]: I0120 18:32:55.376961 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-bjtnp" event={"ID":"419120bb-3f1b-4f21-adf5-ac057bd5dce6","Type":"ContainerDied","Data":"0a049498de4a817a942a2b87af4cddb85094398aca02919519cc51c9033fb122"} Jan 20 18:32:55 crc kubenswrapper[4773]: I0120 18:32:55.376965 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 20 18:32:56 crc kubenswrapper[4773]: I0120 18:32:56.861511 4773 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-bjtnp container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Jan 20 18:32:56 crc kubenswrapper[4773]: I0120 18:32:56.861913 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-bjtnp" podUID="419120bb-3f1b-4f21-adf5-ac057bd5dce6" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" Jan 20 18:32:56 crc kubenswrapper[4773]: I0120 18:32:56.919904 4773 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-gnvxz container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Jan 20 18:32:56 crc kubenswrapper[4773]: I0120 18:32:56.920000 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gnvxz" podUID="1d49ef4e-91fb-4b98-89d9-65358c718967" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" Jan 20 18:32:57 crc kubenswrapper[4773]: I0120 18:32:57.025875 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:32:58 crc kubenswrapper[4773]: I0120 18:32:58.171081 4773 patch_prober.go:28] interesting pod/machine-config-daemon-sq4x7 container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 18:32:58 crc kubenswrapper[4773]: I0120 18:32:58.171360 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 18:32:58 crc kubenswrapper[4773]: I0120 18:32:58.899699 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2bd2b40-586f-4a4a-94b9-8cbb9be4ec3d-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e2bd2b40-586f-4a4a-94b9-8cbb9be4ec3d" (UID: "e2bd2b40-586f-4a4a-94b9-8cbb9be4ec3d"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:32:58 crc kubenswrapper[4773]: I0120 18:32:58.966815 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e2bd2b40-586f-4a4a-94b9-8cbb9be4ec3d-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 20 18:33:03 crc kubenswrapper[4773]: E0120 18:33:03.322750 4773 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: writing blob: storing blob to file \"/var/tmp/container_images_storage3794576891/2\": happened during read: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Jan 20 18:33:03 crc kubenswrapper[4773]: E0120 18:33:03.323343 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs 
--catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-dppcc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-w4bcd_openshift-marketplace(c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: writing blob: storing blob to file \"/var/tmp/container_images_storage3794576891/2\": happened during read: context canceled" logger="UnhandledError" Jan 20 18:33:03 crc kubenswrapper[4773]: E0120 18:33:03.325088 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: writing 
blob: storing blob to file \\\"/var/tmp/container_images_storage3794576891/2\\\": happened during read: context canceled\"" pod="openshift-marketplace/redhat-marketplace-w4bcd" podUID="c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2" Jan 20 18:33:04 crc kubenswrapper[4773]: E0120 18:33:04.833228 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-w4bcd" podUID="c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2" Jan 20 18:33:06 crc kubenswrapper[4773]: E0120 18:33:06.447879 4773 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Jan 20 18:33:06 crc kubenswrapper[4773]: E0120 18:33:06.448569 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fbj8l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-v75d6_openshift-marketplace(074f367d-7a48-4046-a679-9a2d38111b8a): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 20 18:33:06 crc kubenswrapper[4773]: E0120 18:33:06.449802 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-v75d6" podUID="074f367d-7a48-4046-a679-9a2d38111b8a" Jan 20 18:33:07 crc 
kubenswrapper[4773]: I0120 18:33:07.862118 4773 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-bjtnp container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 20 18:33:07 crc kubenswrapper[4773]: I0120 18:33:07.862727 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-bjtnp" podUID="419120bb-3f1b-4f21-adf5-ac057bd5dce6" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 20 18:33:07 crc kubenswrapper[4773]: I0120 18:33:07.920583 4773 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-gnvxz container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.8:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 20 18:33:07 crc kubenswrapper[4773]: I0120 18:33:07.920739 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gnvxz" podUID="1d49ef4e-91fb-4b98-89d9-65358c718967" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.8:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 20 18:33:10 crc kubenswrapper[4773]: E0120 18:33:10.030737 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image 
\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-v75d6" podUID="074f367d-7a48-4046-a679-9a2d38111b8a" Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.111564 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gnvxz" Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.139502 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-588997d685-k4wmn"] Jan 20 18:33:10 crc kubenswrapper[4773]: E0120 18:33:10.139970 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2bd2b40-586f-4a4a-94b9-8cbb9be4ec3d" containerName="pruner" Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.139989 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2bd2b40-586f-4a4a-94b9-8cbb9be4ec3d" containerName="pruner" Jan 20 18:33:10 crc kubenswrapper[4773]: E0120 18:33:10.140010 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d49ef4e-91fb-4b98-89d9-65358c718967" containerName="route-controller-manager" Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.140018 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d49ef4e-91fb-4b98-89d9-65358c718967" containerName="route-controller-manager" Jan 20 18:33:10 crc kubenswrapper[4773]: E0120 18:33:10.140025 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2c577af-d1a7-40f0-99be-340543255117" containerName="pruner" Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.140032 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2c577af-d1a7-40f0-99be-340543255117" containerName="pruner" Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.140256 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2bd2b40-586f-4a4a-94b9-8cbb9be4ec3d" containerName="pruner" Jan 20 18:33:10 crc 
kubenswrapper[4773]: I0120 18:33:10.140276 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d49ef4e-91fb-4b98-89d9-65358c718967" containerName="route-controller-manager" Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.140286 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2c577af-d1a7-40f0-99be-340543255117" containerName="pruner" Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.142833 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-588997d685-k4wmn" Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.154486 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-588997d685-k4wmn"] Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.191259 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-bjtnp" Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.226354 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jz48f\" (UniqueName: \"kubernetes.io/projected/1d49ef4e-91fb-4b98-89d9-65358c718967-kube-api-access-jz48f\") pod \"1d49ef4e-91fb-4b98-89d9-65358c718967\" (UID: \"1d49ef4e-91fb-4b98-89d9-65358c718967\") " Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.226449 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1d49ef4e-91fb-4b98-89d9-65358c718967-client-ca\") pod \"1d49ef4e-91fb-4b98-89d9-65358c718967\" (UID: \"1d49ef4e-91fb-4b98-89d9-65358c718967\") " Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.226534 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d49ef4e-91fb-4b98-89d9-65358c718967-serving-cert\") pod 
\"1d49ef4e-91fb-4b98-89d9-65358c718967\" (UID: \"1d49ef4e-91fb-4b98-89d9-65358c718967\") " Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.226581 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d49ef4e-91fb-4b98-89d9-65358c718967-config\") pod \"1d49ef4e-91fb-4b98-89d9-65358c718967\" (UID: \"1d49ef4e-91fb-4b98-89d9-65358c718967\") " Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.227971 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d49ef4e-91fb-4b98-89d9-65358c718967-client-ca" (OuterVolumeSpecName: "client-ca") pod "1d49ef4e-91fb-4b98-89d9-65358c718967" (UID: "1d49ef4e-91fb-4b98-89d9-65358c718967"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.228250 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d49ef4e-91fb-4b98-89d9-65358c718967-config" (OuterVolumeSpecName: "config") pod "1d49ef4e-91fb-4b98-89d9-65358c718967" (UID: "1d49ef4e-91fb-4b98-89d9-65358c718967"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.235259 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d49ef4e-91fb-4b98-89d9-65358c718967-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1d49ef4e-91fb-4b98-89d9-65358c718967" (UID: "1d49ef4e-91fb-4b98-89d9-65358c718967"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.240071 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d49ef4e-91fb-4b98-89d9-65358c718967-kube-api-access-jz48f" (OuterVolumeSpecName: "kube-api-access-jz48f") pod "1d49ef4e-91fb-4b98-89d9-65358c718967" (UID: "1d49ef4e-91fb-4b98-89d9-65358c718967"). InnerVolumeSpecName "kube-api-access-jz48f". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.306957 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-xwzh9" Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.331709 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/419120bb-3f1b-4f21-adf5-ac057bd5dce6-serving-cert\") pod \"419120bb-3f1b-4f21-adf5-ac057bd5dce6\" (UID: \"419120bb-3f1b-4f21-adf5-ac057bd5dce6\") " Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.331759 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/419120bb-3f1b-4f21-adf5-ac057bd5dce6-client-ca\") pod \"419120bb-3f1b-4f21-adf5-ac057bd5dce6\" (UID: \"419120bb-3f1b-4f21-adf5-ac057bd5dce6\") " Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.331778 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/419120bb-3f1b-4f21-adf5-ac057bd5dce6-proxy-ca-bundles\") pod \"419120bb-3f1b-4f21-adf5-ac057bd5dce6\" (UID: \"419120bb-3f1b-4f21-adf5-ac057bd5dce6\") " Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.331808 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/419120bb-3f1b-4f21-adf5-ac057bd5dce6-config\") pod \"419120bb-3f1b-4f21-adf5-ac057bd5dce6\" (UID: \"419120bb-3f1b-4f21-adf5-ac057bd5dce6\") " Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.331833 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l2mg8\" (UniqueName: \"kubernetes.io/projected/419120bb-3f1b-4f21-adf5-ac057bd5dce6-kube-api-access-l2mg8\") pod \"419120bb-3f1b-4f21-adf5-ac057bd5dce6\" (UID: \"419120bb-3f1b-4f21-adf5-ac057bd5dce6\") " Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.332066 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7efe70f9-78a3-4abf-b920-03868c3f9041-config\") pod \"route-controller-manager-588997d685-k4wmn\" (UID: \"7efe70f9-78a3-4abf-b920-03868c3f9041\") " pod="openshift-route-controller-manager/route-controller-manager-588997d685-k4wmn" Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.332105 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7efe70f9-78a3-4abf-b920-03868c3f9041-client-ca\") pod \"route-controller-manager-588997d685-k4wmn\" (UID: \"7efe70f9-78a3-4abf-b920-03868c3f9041\") " pod="openshift-route-controller-manager/route-controller-manager-588997d685-k4wmn" Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.332174 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7efe70f9-78a3-4abf-b920-03868c3f9041-serving-cert\") pod \"route-controller-manager-588997d685-k4wmn\" (UID: \"7efe70f9-78a3-4abf-b920-03868c3f9041\") " pod="openshift-route-controller-manager/route-controller-manager-588997d685-k4wmn" Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.332193 4773 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bfqp\" (UniqueName: \"kubernetes.io/projected/7efe70f9-78a3-4abf-b920-03868c3f9041-kube-api-access-8bfqp\") pod \"route-controller-manager-588997d685-k4wmn\" (UID: \"7efe70f9-78a3-4abf-b920-03868c3f9041\") " pod="openshift-route-controller-manager/route-controller-manager-588997d685-k4wmn" Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.332238 4773 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1d49ef4e-91fb-4b98-89d9-65358c718967-client-ca\") on node \"crc\" DevicePath \"\"" Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.332249 4773 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1d49ef4e-91fb-4b98-89d9-65358c718967-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.332259 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1d49ef4e-91fb-4b98-89d9-65358c718967-config\") on node \"crc\" DevicePath \"\"" Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.332268 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jz48f\" (UniqueName: \"kubernetes.io/projected/1d49ef4e-91fb-4b98-89d9-65358c718967-kube-api-access-jz48f\") on node \"crc\" DevicePath \"\"" Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.333334 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/419120bb-3f1b-4f21-adf5-ac057bd5dce6-client-ca" (OuterVolumeSpecName: "client-ca") pod "419120bb-3f1b-4f21-adf5-ac057bd5dce6" (UID: "419120bb-3f1b-4f21-adf5-ac057bd5dce6"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.334768 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/419120bb-3f1b-4f21-adf5-ac057bd5dce6-config" (OuterVolumeSpecName: "config") pod "419120bb-3f1b-4f21-adf5-ac057bd5dce6" (UID: "419120bb-3f1b-4f21-adf5-ac057bd5dce6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.335047 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/419120bb-3f1b-4f21-adf5-ac057bd5dce6-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "419120bb-3f1b-4f21-adf5-ac057bd5dce6" (UID: "419120bb-3f1b-4f21-adf5-ac057bd5dce6"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.341256 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/419120bb-3f1b-4f21-adf5-ac057bd5dce6-kube-api-access-l2mg8" (OuterVolumeSpecName: "kube-api-access-l2mg8") pod "419120bb-3f1b-4f21-adf5-ac057bd5dce6" (UID: "419120bb-3f1b-4f21-adf5-ac057bd5dce6"). InnerVolumeSpecName "kube-api-access-l2mg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.341448 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/419120bb-3f1b-4f21-adf5-ac057bd5dce6-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "419120bb-3f1b-4f21-adf5-ac057bd5dce6" (UID: "419120bb-3f1b-4f21-adf5-ac057bd5dce6"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.389339 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-4jpbd"] Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.434892 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7efe70f9-78a3-4abf-b920-03868c3f9041-serving-cert\") pod \"route-controller-manager-588997d685-k4wmn\" (UID: \"7efe70f9-78a3-4abf-b920-03868c3f9041\") " pod="openshift-route-controller-manager/route-controller-manager-588997d685-k4wmn" Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.434984 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8bfqp\" (UniqueName: \"kubernetes.io/projected/7efe70f9-78a3-4abf-b920-03868c3f9041-kube-api-access-8bfqp\") pod \"route-controller-manager-588997d685-k4wmn\" (UID: \"7efe70f9-78a3-4abf-b920-03868c3f9041\") " pod="openshift-route-controller-manager/route-controller-manager-588997d685-k4wmn" Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.435017 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7efe70f9-78a3-4abf-b920-03868c3f9041-config\") pod \"route-controller-manager-588997d685-k4wmn\" (UID: \"7efe70f9-78a3-4abf-b920-03868c3f9041\") " pod="openshift-route-controller-manager/route-controller-manager-588997d685-k4wmn" Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.435053 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7efe70f9-78a3-4abf-b920-03868c3f9041-client-ca\") pod \"route-controller-manager-588997d685-k4wmn\" (UID: \"7efe70f9-78a3-4abf-b920-03868c3f9041\") " pod="openshift-route-controller-manager/route-controller-manager-588997d685-k4wmn" Jan 20 18:33:10 crc 
kubenswrapper[4773]: I0120 18:33:10.435093 4773 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/419120bb-3f1b-4f21-adf5-ac057bd5dce6-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.435105 4773 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/419120bb-3f1b-4f21-adf5-ac057bd5dce6-client-ca\") on node \"crc\" DevicePath \"\"" Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.435116 4773 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/419120bb-3f1b-4f21-adf5-ac057bd5dce6-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.435129 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/419120bb-3f1b-4f21-adf5-ac057bd5dce6-config\") on node \"crc\" DevicePath \"\"" Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.435143 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l2mg8\" (UniqueName: \"kubernetes.io/projected/419120bb-3f1b-4f21-adf5-ac057bd5dce6-kube-api-access-l2mg8\") on node \"crc\" DevicePath \"\"" Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.436451 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7efe70f9-78a3-4abf-b920-03868c3f9041-client-ca\") pod \"route-controller-manager-588997d685-k4wmn\" (UID: \"7efe70f9-78a3-4abf-b920-03868c3f9041\") " pod="openshift-route-controller-manager/route-controller-manager-588997d685-k4wmn" Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.436583 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7efe70f9-78a3-4abf-b920-03868c3f9041-config\") pod 
\"route-controller-manager-588997d685-k4wmn\" (UID: \"7efe70f9-78a3-4abf-b920-03868c3f9041\") " pod="openshift-route-controller-manager/route-controller-manager-588997d685-k4wmn" Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.439256 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7efe70f9-78a3-4abf-b920-03868c3f9041-serving-cert\") pod \"route-controller-manager-588997d685-k4wmn\" (UID: \"7efe70f9-78a3-4abf-b920-03868c3f9041\") " pod="openshift-route-controller-manager/route-controller-manager-588997d685-k4wmn" Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.457966 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bfqp\" (UniqueName: \"kubernetes.io/projected/7efe70f9-78a3-4abf-b920-03868c3f9041-kube-api-access-8bfqp\") pod \"route-controller-manager-588997d685-k4wmn\" (UID: \"7efe70f9-78a3-4abf-b920-03868c3f9041\") " pod="openshift-route-controller-manager/route-controller-manager-588997d685-k4wmn" Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.471810 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-bjtnp" Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.471823 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-bjtnp" event={"ID":"419120bb-3f1b-4f21-adf5-ac057bd5dce6","Type":"ContainerDied","Data":"0be8a6611d182b19e3c280dba8bf32b32d7fa146a5cc6f6279d1419ba167e1bf"} Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.471917 4773 scope.go:117] "RemoveContainer" containerID="0a049498de4a817a942a2b87af4cddb85094398aca02919519cc51c9033fb122" Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.473838 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gnvxz" event={"ID":"1d49ef4e-91fb-4b98-89d9-65358c718967","Type":"ContainerDied","Data":"3d79ab25a80551aca70e1318ac4565dde71b20dd660b9be7dfd85ec349787632"} Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.473968 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-gnvxz" Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.502107 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-bjtnp"] Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.505614 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-588997d685-k4wmn" Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.505811 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-bjtnp"] Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.515316 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-gnvxz"] Jan 20 18:33:10 crc kubenswrapper[4773]: I0120 18:33:10.517754 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-gnvxz"] Jan 20 18:33:11 crc kubenswrapper[4773]: I0120 18:33:11.461036 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d49ef4e-91fb-4b98-89d9-65358c718967" path="/var/lib/kubelet/pods/1d49ef4e-91fb-4b98-89d9-65358c718967/volumes" Jan 20 18:33:11 crc kubenswrapper[4773]: I0120 18:33:11.461830 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="419120bb-3f1b-4f21-adf5-ac057bd5dce6" path="/var/lib/kubelet/pods/419120bb-3f1b-4f21-adf5-ac057bd5dce6/volumes" Jan 20 18:33:13 crc kubenswrapper[4773]: I0120 18:33:13.492392 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4jpbd" event={"ID":"3791c4b7-dcef-470d-a67e-a2c0bb004436","Type":"ContainerStarted","Data":"bb794a0476596059002b2dbcd77c1dfb954fb595fb1c1ebf0af0fd989e74c7f0"} Jan 20 18:33:13 crc kubenswrapper[4773]: I0120 18:33:13.806741 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7b574fff6d-mnxtv"] Jan 20 18:33:13 crc kubenswrapper[4773]: E0120 18:33:13.807270 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="419120bb-3f1b-4f21-adf5-ac057bd5dce6" containerName="controller-manager" Jan 20 18:33:13 crc kubenswrapper[4773]: I0120 18:33:13.807284 4773 
state_mem.go:107] "Deleted CPUSet assignment" podUID="419120bb-3f1b-4f21-adf5-ac057bd5dce6" containerName="controller-manager" Jan 20 18:33:13 crc kubenswrapper[4773]: I0120 18:33:13.807372 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="419120bb-3f1b-4f21-adf5-ac057bd5dce6" containerName="controller-manager" Jan 20 18:33:13 crc kubenswrapper[4773]: I0120 18:33:13.807759 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7b574fff6d-mnxtv" Jan 20 18:33:13 crc kubenswrapper[4773]: I0120 18:33:13.810786 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 20 18:33:13 crc kubenswrapper[4773]: I0120 18:33:13.811164 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 20 18:33:13 crc kubenswrapper[4773]: I0120 18:33:13.811337 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 20 18:33:13 crc kubenswrapper[4773]: I0120 18:33:13.811493 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 20 18:33:13 crc kubenswrapper[4773]: I0120 18:33:13.811631 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 20 18:33:13 crc kubenswrapper[4773]: I0120 18:33:13.811800 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 20 18:33:13 crc kubenswrapper[4773]: I0120 18:33:13.830787 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7b574fff6d-mnxtv"] Jan 20 18:33:13 crc kubenswrapper[4773]: I0120 18:33:13.836980 4773 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager"/"openshift-global-ca" Jan 20 18:33:13 crc kubenswrapper[4773]: I0120 18:33:13.887476 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-588997d685-k4wmn"] Jan 20 18:33:13 crc kubenswrapper[4773]: I0120 18:33:13.927137 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b94np\" (UniqueName: \"kubernetes.io/projected/383b1abe-3796-4b98-bb28-515ce7eafd6b-kube-api-access-b94np\") pod \"controller-manager-7b574fff6d-mnxtv\" (UID: \"383b1abe-3796-4b98-bb28-515ce7eafd6b\") " pod="openshift-controller-manager/controller-manager-7b574fff6d-mnxtv" Jan 20 18:33:13 crc kubenswrapper[4773]: I0120 18:33:13.927414 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/383b1abe-3796-4b98-bb28-515ce7eafd6b-serving-cert\") pod \"controller-manager-7b574fff6d-mnxtv\" (UID: \"383b1abe-3796-4b98-bb28-515ce7eafd6b\") " pod="openshift-controller-manager/controller-manager-7b574fff6d-mnxtv" Jan 20 18:33:13 crc kubenswrapper[4773]: I0120 18:33:13.927582 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/383b1abe-3796-4b98-bb28-515ce7eafd6b-config\") pod \"controller-manager-7b574fff6d-mnxtv\" (UID: \"383b1abe-3796-4b98-bb28-515ce7eafd6b\") " pod="openshift-controller-manager/controller-manager-7b574fff6d-mnxtv" Jan 20 18:33:13 crc kubenswrapper[4773]: I0120 18:33:13.927692 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/383b1abe-3796-4b98-bb28-515ce7eafd6b-client-ca\") pod \"controller-manager-7b574fff6d-mnxtv\" (UID: \"383b1abe-3796-4b98-bb28-515ce7eafd6b\") " 
pod="openshift-controller-manager/controller-manager-7b574fff6d-mnxtv" Jan 20 18:33:13 crc kubenswrapper[4773]: I0120 18:33:13.927801 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/383b1abe-3796-4b98-bb28-515ce7eafd6b-proxy-ca-bundles\") pod \"controller-manager-7b574fff6d-mnxtv\" (UID: \"383b1abe-3796-4b98-bb28-515ce7eafd6b\") " pod="openshift-controller-manager/controller-manager-7b574fff6d-mnxtv" Jan 20 18:33:14 crc kubenswrapper[4773]: I0120 18:33:14.028988 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/383b1abe-3796-4b98-bb28-515ce7eafd6b-config\") pod \"controller-manager-7b574fff6d-mnxtv\" (UID: \"383b1abe-3796-4b98-bb28-515ce7eafd6b\") " pod="openshift-controller-manager/controller-manager-7b574fff6d-mnxtv" Jan 20 18:33:14 crc kubenswrapper[4773]: I0120 18:33:14.029054 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/383b1abe-3796-4b98-bb28-515ce7eafd6b-client-ca\") pod \"controller-manager-7b574fff6d-mnxtv\" (UID: \"383b1abe-3796-4b98-bb28-515ce7eafd6b\") " pod="openshift-controller-manager/controller-manager-7b574fff6d-mnxtv" Jan 20 18:33:14 crc kubenswrapper[4773]: I0120 18:33:14.029094 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/383b1abe-3796-4b98-bb28-515ce7eafd6b-proxy-ca-bundles\") pod \"controller-manager-7b574fff6d-mnxtv\" (UID: \"383b1abe-3796-4b98-bb28-515ce7eafd6b\") " pod="openshift-controller-manager/controller-manager-7b574fff6d-mnxtv" Jan 20 18:33:14 crc kubenswrapper[4773]: I0120 18:33:14.029136 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b94np\" (UniqueName: 
\"kubernetes.io/projected/383b1abe-3796-4b98-bb28-515ce7eafd6b-kube-api-access-b94np\") pod \"controller-manager-7b574fff6d-mnxtv\" (UID: \"383b1abe-3796-4b98-bb28-515ce7eafd6b\") " pod="openshift-controller-manager/controller-manager-7b574fff6d-mnxtv" Jan 20 18:33:14 crc kubenswrapper[4773]: I0120 18:33:14.029167 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/383b1abe-3796-4b98-bb28-515ce7eafd6b-serving-cert\") pod \"controller-manager-7b574fff6d-mnxtv\" (UID: \"383b1abe-3796-4b98-bb28-515ce7eafd6b\") " pod="openshift-controller-manager/controller-manager-7b574fff6d-mnxtv" Jan 20 18:33:14 crc kubenswrapper[4773]: I0120 18:33:14.032630 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/383b1abe-3796-4b98-bb28-515ce7eafd6b-proxy-ca-bundles\") pod \"controller-manager-7b574fff6d-mnxtv\" (UID: \"383b1abe-3796-4b98-bb28-515ce7eafd6b\") " pod="openshift-controller-manager/controller-manager-7b574fff6d-mnxtv" Jan 20 18:33:14 crc kubenswrapper[4773]: I0120 18:33:14.034261 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/383b1abe-3796-4b98-bb28-515ce7eafd6b-client-ca\") pod \"controller-manager-7b574fff6d-mnxtv\" (UID: \"383b1abe-3796-4b98-bb28-515ce7eafd6b\") " pod="openshift-controller-manager/controller-manager-7b574fff6d-mnxtv" Jan 20 18:33:14 crc kubenswrapper[4773]: I0120 18:33:14.039666 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/383b1abe-3796-4b98-bb28-515ce7eafd6b-serving-cert\") pod \"controller-manager-7b574fff6d-mnxtv\" (UID: \"383b1abe-3796-4b98-bb28-515ce7eafd6b\") " pod="openshift-controller-manager/controller-manager-7b574fff6d-mnxtv" Jan 20 18:33:14 crc kubenswrapper[4773]: I0120 18:33:14.039690 4773 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/383b1abe-3796-4b98-bb28-515ce7eafd6b-config\") pod \"controller-manager-7b574fff6d-mnxtv\" (UID: \"383b1abe-3796-4b98-bb28-515ce7eafd6b\") " pod="openshift-controller-manager/controller-manager-7b574fff6d-mnxtv" Jan 20 18:33:14 crc kubenswrapper[4773]: I0120 18:33:14.045321 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b94np\" (UniqueName: \"kubernetes.io/projected/383b1abe-3796-4b98-bb28-515ce7eafd6b-kube-api-access-b94np\") pod \"controller-manager-7b574fff6d-mnxtv\" (UID: \"383b1abe-3796-4b98-bb28-515ce7eafd6b\") " pod="openshift-controller-manager/controller-manager-7b574fff6d-mnxtv" Jan 20 18:33:14 crc kubenswrapper[4773]: I0120 18:33:14.135983 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7b574fff6d-mnxtv" Jan 20 18:33:14 crc kubenswrapper[4773]: I0120 18:33:14.691199 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 20 18:33:14 crc kubenswrapper[4773]: I0120 18:33:14.835121 4773 scope.go:117] "RemoveContainer" containerID="bdc540ba6443004408ae797654a9bce71e44b1154765597934dc3ae831e5cade" Jan 20 18:33:14 crc kubenswrapper[4773]: E0120 18:33:14.903066 4773 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Jan 20 18:33:14 crc kubenswrapper[4773]: E0120 18:33:14.903498 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s9wnc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-wmbvt_openshift-marketplace(86872811-c0ef-45cc-949a-f88b07fca9b3): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 20 18:33:14 crc kubenswrapper[4773]: E0120 18:33:14.904651 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-wmbvt" podUID="86872811-c0ef-45cc-949a-f88b07fca9b3" Jan 20 18:33:15 crc 
kubenswrapper[4773]: I0120 18:33:15.086453 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-588997d685-k4wmn"] Jan 20 18:33:15 crc kubenswrapper[4773]: I0120 18:33:15.186421 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7b574fff6d-mnxtv"] Jan 20 18:33:15 crc kubenswrapper[4773]: I0120 18:33:15.505778 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2qdpl" event={"ID":"8923f3c0-0b58-4097-aa87-9df34cf90e41","Type":"ContainerStarted","Data":"1f287c2574683f5354d74f9901af35491b27963de1526d87ad8eff7eb251368c"} Jan 20 18:33:15 crc kubenswrapper[4773]: I0120 18:33:15.508302 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kxsfk" event={"ID":"c19dbd84-8fec-4998-b2ae-65c68dee6b17","Type":"ContainerStarted","Data":"2224e4d25524fddc1486e82299478cff3e69b71640d782d27370469785e93088"} Jan 20 18:33:15 crc kubenswrapper[4773]: I0120 18:33:15.510347 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fm4ln" event={"ID":"e5fd624a-2fa6-4887-83e0-779057846c71","Type":"ContainerStarted","Data":"3e00fbd0f11ff917989561c509f8b12319467494c08116a3e2715bb0829e11db"} Jan 20 18:33:15 crc kubenswrapper[4773]: I0120 18:33:15.513438 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7b574fff6d-mnxtv" event={"ID":"383b1abe-3796-4b98-bb28-515ce7eafd6b","Type":"ContainerStarted","Data":"90446c16d64bd4019cc2e68d6478f0e637b6dc8de10aa866b35345c2046c26a3"} Jan 20 18:33:15 crc kubenswrapper[4773]: I0120 18:33:15.520070 4773 generic.go:334] "Generic (PLEG): container finished" podID="48e32f25-29eb-4ef0-892b-0da316c47e3d" containerID="7365c0ca8367ca905e8c8072eaa82c4d3a22f3e79b23135efcc97f413135c4e2" exitCode=0 Jan 20 18:33:15 crc kubenswrapper[4773]: I0120 18:33:15.520152 
4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c8jjn" event={"ID":"48e32f25-29eb-4ef0-892b-0da316c47e3d","Type":"ContainerDied","Data":"7365c0ca8367ca905e8c8072eaa82c4d3a22f3e79b23135efcc97f413135c4e2"} Jan 20 18:33:15 crc kubenswrapper[4773]: I0120 18:33:15.524959 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4jpbd" event={"ID":"3791c4b7-dcef-470d-a67e-a2c0bb004436","Type":"ContainerStarted","Data":"3eca4f59eed7f9fc182e2dae90196776347430effa57714afcf7767e9e086b4a"} Jan 20 18:33:15 crc kubenswrapper[4773]: I0120 18:33:15.534699 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-588997d685-k4wmn" event={"ID":"7efe70f9-78a3-4abf-b920-03868c3f9041","Type":"ContainerStarted","Data":"12809ae984dc039517fe1d4003dbfb41a5b5900acaed078c32869d8cdfb24334"} Jan 20 18:33:15 crc kubenswrapper[4773]: I0120 18:33:15.534758 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-588997d685-k4wmn" event={"ID":"7efe70f9-78a3-4abf-b920-03868c3f9041","Type":"ContainerStarted","Data":"33124df33592983badc5fec6c563eafa6993c0b70e61eb8905604ffbda8d39c7"} Jan 20 18:33:15 crc kubenswrapper[4773]: I0120 18:33:15.534963 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-588997d685-k4wmn" podUID="7efe70f9-78a3-4abf-b920-03868c3f9041" containerName="route-controller-manager" containerID="cri-o://12809ae984dc039517fe1d4003dbfb41a5b5900acaed078c32869d8cdfb24334" gracePeriod=30 Jan 20 18:33:15 crc kubenswrapper[4773]: I0120 18:33:15.535456 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-588997d685-k4wmn" Jan 20 18:33:15 crc kubenswrapper[4773]: I0120 18:33:15.546388 4773 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6lqws" event={"ID":"a6759422-151d-4228-b7c7-848c3008fb52","Type":"ContainerStarted","Data":"884fc85a8c28dd6ef2b5723787900c9ee03eec9f0e79faf98b5af3174b512138"} Jan 20 18:33:15 crc kubenswrapper[4773]: E0120 18:33:15.549905 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-wmbvt" podUID="86872811-c0ef-45cc-949a-f88b07fca9b3" Jan 20 18:33:15 crc kubenswrapper[4773]: I0120 18:33:15.643533 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-588997d685-k4wmn" podStartSLOduration=21.643511453 podStartE2EDuration="21.643511453s" podCreationTimestamp="2026-01-20 18:32:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:33:15.64300614 +0000 UTC m=+188.564819164" watchObservedRunningTime="2026-01-20 18:33:15.643511453 +0000 UTC m=+188.565324477" Jan 20 18:33:15 crc kubenswrapper[4773]: I0120 18:33:15.728487 4773 patch_prober.go:28] interesting pod/route-controller-manager-588997d685-k4wmn container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.54:8443/healthz\": read tcp 10.217.0.2:50614->10.217.0.54:8443: read: connection reset by peer" start-of-body= Jan 20 18:33:15 crc kubenswrapper[4773]: I0120 18:33:15.731150 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-588997d685-k4wmn" podUID="7efe70f9-78a3-4abf-b920-03868c3f9041" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.54:8443/healthz\": 
read tcp 10.217.0.2:50614->10.217.0.54:8443: read: connection reset by peer" Jan 20 18:33:16 crc kubenswrapper[4773]: I0120 18:33:16.554430 4773 generic.go:334] "Generic (PLEG): container finished" podID="e5fd624a-2fa6-4887-83e0-779057846c71" containerID="3e00fbd0f11ff917989561c509f8b12319467494c08116a3e2715bb0829e11db" exitCode=0 Jan 20 18:33:16 crc kubenswrapper[4773]: I0120 18:33:16.555048 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fm4ln" event={"ID":"e5fd624a-2fa6-4887-83e0-779057846c71","Type":"ContainerDied","Data":"3e00fbd0f11ff917989561c509f8b12319467494c08116a3e2715bb0829e11db"} Jan 20 18:33:16 crc kubenswrapper[4773]: I0120 18:33:16.557174 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-588997d685-k4wmn_7efe70f9-78a3-4abf-b920-03868c3f9041/route-controller-manager/0.log" Jan 20 18:33:16 crc kubenswrapper[4773]: I0120 18:33:16.557225 4773 generic.go:334] "Generic (PLEG): container finished" podID="7efe70f9-78a3-4abf-b920-03868c3f9041" containerID="12809ae984dc039517fe1d4003dbfb41a5b5900acaed078c32869d8cdfb24334" exitCode=255 Jan 20 18:33:16 crc kubenswrapper[4773]: I0120 18:33:16.557343 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-588997d685-k4wmn" event={"ID":"7efe70f9-78a3-4abf-b920-03868c3f9041","Type":"ContainerDied","Data":"12809ae984dc039517fe1d4003dbfb41a5b5900acaed078c32869d8cdfb24334"} Jan 20 18:33:16 crc kubenswrapper[4773]: I0120 18:33:16.557398 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-588997d685-k4wmn" event={"ID":"7efe70f9-78a3-4abf-b920-03868c3f9041","Type":"ContainerDied","Data":"33124df33592983badc5fec6c563eafa6993c0b70e61eb8905604ffbda8d39c7"} Jan 20 18:33:16 crc kubenswrapper[4773]: I0120 18:33:16.557410 4773 pod_container_deletor.go:80] "Container not found 
in pod's containers" containerID="33124df33592983badc5fec6c563eafa6993c0b70e61eb8905604ffbda8d39c7" Jan 20 18:33:16 crc kubenswrapper[4773]: I0120 18:33:16.560766 4773 generic.go:334] "Generic (PLEG): container finished" podID="a6759422-151d-4228-b7c7-848c3008fb52" containerID="884fc85a8c28dd6ef2b5723787900c9ee03eec9f0e79faf98b5af3174b512138" exitCode=0 Jan 20 18:33:16 crc kubenswrapper[4773]: I0120 18:33:16.560887 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6lqws" event={"ID":"a6759422-151d-4228-b7c7-848c3008fb52","Type":"ContainerDied","Data":"884fc85a8c28dd6ef2b5723787900c9ee03eec9f0e79faf98b5af3174b512138"} Jan 20 18:33:16 crc kubenswrapper[4773]: I0120 18:33:16.564367 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-4jpbd" event={"ID":"3791c4b7-dcef-470d-a67e-a2c0bb004436","Type":"ContainerStarted","Data":"038e90b2feb4b20bafc6c25c832f9bbbc4b1cda068561d44d6a6b1c77e3b7d7f"} Jan 20 18:33:16 crc kubenswrapper[4773]: I0120 18:33:16.571347 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7b574fff6d-mnxtv" event={"ID":"383b1abe-3796-4b98-bb28-515ce7eafd6b","Type":"ContainerStarted","Data":"919c3dc554513c2d70b1c0612b8d4f1f24089e976bf7e534847570d20270159d"} Jan 20 18:33:16 crc kubenswrapper[4773]: I0120 18:33:16.571630 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7b574fff6d-mnxtv" Jan 20 18:33:16 crc kubenswrapper[4773]: I0120 18:33:16.576004 4773 generic.go:334] "Generic (PLEG): container finished" podID="8923f3c0-0b58-4097-aa87-9df34cf90e41" containerID="1f287c2574683f5354d74f9901af35491b27963de1526d87ad8eff7eb251368c" exitCode=0 Jan 20 18:33:16 crc kubenswrapper[4773]: I0120 18:33:16.576099 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2qdpl" 
event={"ID":"8923f3c0-0b58-4097-aa87-9df34cf90e41","Type":"ContainerDied","Data":"1f287c2574683f5354d74f9901af35491b27963de1526d87ad8eff7eb251368c"} Jan 20 18:33:16 crc kubenswrapper[4773]: I0120 18:33:16.579269 4773 generic.go:334] "Generic (PLEG): container finished" podID="c19dbd84-8fec-4998-b2ae-65c68dee6b17" containerID="2224e4d25524fddc1486e82299478cff3e69b71640d782d27370469785e93088" exitCode=0 Jan 20 18:33:16 crc kubenswrapper[4773]: I0120 18:33:16.579327 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kxsfk" event={"ID":"c19dbd84-8fec-4998-b2ae-65c68dee6b17","Type":"ContainerDied","Data":"2224e4d25524fddc1486e82299478cff3e69b71640d782d27370469785e93088"} Jan 20 18:33:16 crc kubenswrapper[4773]: I0120 18:33:16.579726 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7b574fff6d-mnxtv" Jan 20 18:33:16 crc kubenswrapper[4773]: I0120 18:33:16.593908 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-588997d685-k4wmn_7efe70f9-78a3-4abf-b920-03868c3f9041/route-controller-manager/0.log" Jan 20 18:33:16 crc kubenswrapper[4773]: I0120 18:33:16.594011 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-588997d685-k4wmn" Jan 20 18:33:16 crc kubenswrapper[4773]: I0120 18:33:16.609610 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7b574fff6d-mnxtv" podStartSLOduration=3.609577256 podStartE2EDuration="3.609577256s" podCreationTimestamp="2026-01-20 18:33:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:33:16.605167357 +0000 UTC m=+189.526980381" watchObservedRunningTime="2026-01-20 18:33:16.609577256 +0000 UTC m=+189.531390280" Jan 20 18:33:16 crc kubenswrapper[4773]: I0120 18:33:16.624756 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-4jpbd" podStartSLOduration=170.624738409 podStartE2EDuration="2m50.624738409s" podCreationTimestamp="2026-01-20 18:30:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:33:16.620750842 +0000 UTC m=+189.542563866" watchObservedRunningTime="2026-01-20 18:33:16.624738409 +0000 UTC m=+189.546551433" Jan 20 18:33:16 crc kubenswrapper[4773]: I0120 18:33:16.708675 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7fcf879f88-b2ff4"] Jan 20 18:33:16 crc kubenswrapper[4773]: E0120 18:33:16.708900 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7efe70f9-78a3-4abf-b920-03868c3f9041" containerName="route-controller-manager" Jan 20 18:33:16 crc kubenswrapper[4773]: I0120 18:33:16.708916 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="7efe70f9-78a3-4abf-b920-03868c3f9041" containerName="route-controller-manager" Jan 20 18:33:16 crc kubenswrapper[4773]: I0120 18:33:16.709037 4773 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="7efe70f9-78a3-4abf-b920-03868c3f9041" containerName="route-controller-manager" Jan 20 18:33:16 crc kubenswrapper[4773]: I0120 18:33:16.709397 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7fcf879f88-b2ff4" Jan 20 18:33:16 crc kubenswrapper[4773]: I0120 18:33:16.788253 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8bfqp\" (UniqueName: \"kubernetes.io/projected/7efe70f9-78a3-4abf-b920-03868c3f9041-kube-api-access-8bfqp\") pod \"7efe70f9-78a3-4abf-b920-03868c3f9041\" (UID: \"7efe70f9-78a3-4abf-b920-03868c3f9041\") " Jan 20 18:33:16 crc kubenswrapper[4773]: I0120 18:33:16.788325 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7efe70f9-78a3-4abf-b920-03868c3f9041-config\") pod \"7efe70f9-78a3-4abf-b920-03868c3f9041\" (UID: \"7efe70f9-78a3-4abf-b920-03868c3f9041\") " Jan 20 18:33:16 crc kubenswrapper[4773]: I0120 18:33:16.788472 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7efe70f9-78a3-4abf-b920-03868c3f9041-serving-cert\") pod \"7efe70f9-78a3-4abf-b920-03868c3f9041\" (UID: \"7efe70f9-78a3-4abf-b920-03868c3f9041\") " Jan 20 18:33:16 crc kubenswrapper[4773]: I0120 18:33:16.788555 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7efe70f9-78a3-4abf-b920-03868c3f9041-client-ca\") pod \"7efe70f9-78a3-4abf-b920-03868c3f9041\" (UID: \"7efe70f9-78a3-4abf-b920-03868c3f9041\") " Jan 20 18:33:16 crc kubenswrapper[4773]: I0120 18:33:16.789421 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7efe70f9-78a3-4abf-b920-03868c3f9041-client-ca" (OuterVolumeSpecName: "client-ca") pod 
"7efe70f9-78a3-4abf-b920-03868c3f9041" (UID: "7efe70f9-78a3-4abf-b920-03868c3f9041"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:33:16 crc kubenswrapper[4773]: I0120 18:33:16.789761 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7efe70f9-78a3-4abf-b920-03868c3f9041-config" (OuterVolumeSpecName: "config") pod "7efe70f9-78a3-4abf-b920-03868c3f9041" (UID: "7efe70f9-78a3-4abf-b920-03868c3f9041"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:33:16 crc kubenswrapper[4773]: I0120 18:33:16.890140 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4033f4d4-e3aa-40fd-b5f3-558833a6846d-client-ca\") pod \"route-controller-manager-7fcf879f88-b2ff4\" (UID: \"4033f4d4-e3aa-40fd-b5f3-558833a6846d\") " pod="openshift-route-controller-manager/route-controller-manager-7fcf879f88-b2ff4" Jan 20 18:33:16 crc kubenswrapper[4773]: I0120 18:33:16.890211 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4033f4d4-e3aa-40fd-b5f3-558833a6846d-config\") pod \"route-controller-manager-7fcf879f88-b2ff4\" (UID: \"4033f4d4-e3aa-40fd-b5f3-558833a6846d\") " pod="openshift-route-controller-manager/route-controller-manager-7fcf879f88-b2ff4" Jan 20 18:33:16 crc kubenswrapper[4773]: I0120 18:33:16.890241 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8jhp5\" (UniqueName: \"kubernetes.io/projected/4033f4d4-e3aa-40fd-b5f3-558833a6846d-kube-api-access-8jhp5\") pod \"route-controller-manager-7fcf879f88-b2ff4\" (UID: \"4033f4d4-e3aa-40fd-b5f3-558833a6846d\") " pod="openshift-route-controller-manager/route-controller-manager-7fcf879f88-b2ff4" Jan 20 18:33:16 crc kubenswrapper[4773]: 
I0120 18:33:16.890430 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4033f4d4-e3aa-40fd-b5f3-558833a6846d-serving-cert\") pod \"route-controller-manager-7fcf879f88-b2ff4\" (UID: \"4033f4d4-e3aa-40fd-b5f3-558833a6846d\") " pod="openshift-route-controller-manager/route-controller-manager-7fcf879f88-b2ff4" Jan 20 18:33:16 crc kubenswrapper[4773]: I0120 18:33:16.890713 4773 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7efe70f9-78a3-4abf-b920-03868c3f9041-client-ca\") on node \"crc\" DevicePath \"\"" Jan 20 18:33:16 crc kubenswrapper[4773]: I0120 18:33:16.890743 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7efe70f9-78a3-4abf-b920-03868c3f9041-config\") on node \"crc\" DevicePath \"\"" Jan 20 18:33:16 crc kubenswrapper[4773]: I0120 18:33:16.992006 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4033f4d4-e3aa-40fd-b5f3-558833a6846d-config\") pod \"route-controller-manager-7fcf879f88-b2ff4\" (UID: \"4033f4d4-e3aa-40fd-b5f3-558833a6846d\") " pod="openshift-route-controller-manager/route-controller-manager-7fcf879f88-b2ff4" Jan 20 18:33:16 crc kubenswrapper[4773]: I0120 18:33:16.992059 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8jhp5\" (UniqueName: \"kubernetes.io/projected/4033f4d4-e3aa-40fd-b5f3-558833a6846d-kube-api-access-8jhp5\") pod \"route-controller-manager-7fcf879f88-b2ff4\" (UID: \"4033f4d4-e3aa-40fd-b5f3-558833a6846d\") " pod="openshift-route-controller-manager/route-controller-manager-7fcf879f88-b2ff4" Jan 20 18:33:16 crc kubenswrapper[4773]: I0120 18:33:16.992127 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/4033f4d4-e3aa-40fd-b5f3-558833a6846d-serving-cert\") pod \"route-controller-manager-7fcf879f88-b2ff4\" (UID: \"4033f4d4-e3aa-40fd-b5f3-558833a6846d\") " pod="openshift-route-controller-manager/route-controller-manager-7fcf879f88-b2ff4" Jan 20 18:33:16 crc kubenswrapper[4773]: I0120 18:33:16.992198 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4033f4d4-e3aa-40fd-b5f3-558833a6846d-client-ca\") pod \"route-controller-manager-7fcf879f88-b2ff4\" (UID: \"4033f4d4-e3aa-40fd-b5f3-558833a6846d\") " pod="openshift-route-controller-manager/route-controller-manager-7fcf879f88-b2ff4" Jan 20 18:33:16 crc kubenswrapper[4773]: I0120 18:33:16.993283 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4033f4d4-e3aa-40fd-b5f3-558833a6846d-config\") pod \"route-controller-manager-7fcf879f88-b2ff4\" (UID: \"4033f4d4-e3aa-40fd-b5f3-558833a6846d\") " pod="openshift-route-controller-manager/route-controller-manager-7fcf879f88-b2ff4" Jan 20 18:33:16 crc kubenswrapper[4773]: I0120 18:33:16.994279 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4033f4d4-e3aa-40fd-b5f3-558833a6846d-client-ca\") pod \"route-controller-manager-7fcf879f88-b2ff4\" (UID: \"4033f4d4-e3aa-40fd-b5f3-558833a6846d\") " pod="openshift-route-controller-manager/route-controller-manager-7fcf879f88-b2ff4" Jan 20 18:33:17 crc kubenswrapper[4773]: I0120 18:33:17.008077 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4033f4d4-e3aa-40fd-b5f3-558833a6846d-serving-cert\") pod \"route-controller-manager-7fcf879f88-b2ff4\" (UID: \"4033f4d4-e3aa-40fd-b5f3-558833a6846d\") " pod="openshift-route-controller-manager/route-controller-manager-7fcf879f88-b2ff4" Jan 20 18:33:17 crc kubenswrapper[4773]: I0120 
18:33:17.676981 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7efe70f9-78a3-4abf-b920-03868c3f9041-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7efe70f9-78a3-4abf-b920-03868c3f9041" (UID: "7efe70f9-78a3-4abf-b920-03868c3f9041"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:33:17 crc kubenswrapper[4773]: I0120 18:33:17.679345 4773 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7efe70f9-78a3-4abf-b920-03868c3f9041-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 20 18:33:17 crc kubenswrapper[4773]: I0120 18:33:17.680001 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7efe70f9-78a3-4abf-b920-03868c3f9041-kube-api-access-8bfqp" (OuterVolumeSpecName: "kube-api-access-8bfqp") pod "7efe70f9-78a3-4abf-b920-03868c3f9041" (UID: "7efe70f9-78a3-4abf-b920-03868c3f9041"). InnerVolumeSpecName "kube-api-access-8bfqp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:33:17 crc kubenswrapper[4773]: I0120 18:33:17.698169 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-588997d685-k4wmn" Jan 20 18:33:17 crc kubenswrapper[4773]: I0120 18:33:17.748363 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8jhp5\" (UniqueName: \"kubernetes.io/projected/4033f4d4-e3aa-40fd-b5f3-558833a6846d-kube-api-access-8jhp5\") pod \"route-controller-manager-7fcf879f88-b2ff4\" (UID: \"4033f4d4-e3aa-40fd-b5f3-558833a6846d\") " pod="openshift-route-controller-manager/route-controller-manager-7fcf879f88-b2ff4" Jan 20 18:33:17 crc kubenswrapper[4773]: I0120 18:33:17.752767 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7fcf879f88-b2ff4"] Jan 20 18:33:17 crc kubenswrapper[4773]: I0120 18:33:17.780591 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8bfqp\" (UniqueName: \"kubernetes.io/projected/7efe70f9-78a3-4abf-b920-03868c3f9041-kube-api-access-8bfqp\") on node \"crc\" DevicePath \"\"" Jan 20 18:33:17 crc kubenswrapper[4773]: I0120 18:33:17.789905 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-588997d685-k4wmn"] Jan 20 18:33:17 crc kubenswrapper[4773]: I0120 18:33:17.822338 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-588997d685-k4wmn"] Jan 20 18:33:17 crc kubenswrapper[4773]: I0120 18:33:17.827495 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-xpsls"] Jan 20 18:33:17 crc kubenswrapper[4773]: I0120 18:33:17.929813 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7fcf879f88-b2ff4" Jan 20 18:33:18 crc kubenswrapper[4773]: I0120 18:33:18.025142 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Jan 20 18:33:18 crc kubenswrapper[4773]: I0120 18:33:18.027276 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 20 18:33:18 crc kubenswrapper[4773]: I0120 18:33:18.041419 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Jan 20 18:33:18 crc kubenswrapper[4773]: I0120 18:33:18.043236 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Jan 20 18:33:18 crc kubenswrapper[4773]: I0120 18:33:18.048144 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Jan 20 18:33:18 crc kubenswrapper[4773]: I0120 18:33:18.085675 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0e6b5555-e3e7-43c5-8fdc-090dcfda09bc-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"0e6b5555-e3e7-43c5-8fdc-090dcfda09bc\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 20 18:33:18 crc kubenswrapper[4773]: I0120 18:33:18.085805 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0e6b5555-e3e7-43c5-8fdc-090dcfda09bc-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"0e6b5555-e3e7-43c5-8fdc-090dcfda09bc\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 20 18:33:18 crc kubenswrapper[4773]: I0120 18:33:18.187108 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/0e6b5555-e3e7-43c5-8fdc-090dcfda09bc-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"0e6b5555-e3e7-43c5-8fdc-090dcfda09bc\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 20 18:33:18 crc kubenswrapper[4773]: I0120 18:33:18.187180 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0e6b5555-e3e7-43c5-8fdc-090dcfda09bc-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"0e6b5555-e3e7-43c5-8fdc-090dcfda09bc\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 20 18:33:18 crc kubenswrapper[4773]: I0120 18:33:18.187301 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0e6b5555-e3e7-43c5-8fdc-090dcfda09bc-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"0e6b5555-e3e7-43c5-8fdc-090dcfda09bc\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 20 18:33:18 crc kubenswrapper[4773]: I0120 18:33:18.237682 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0e6b5555-e3e7-43c5-8fdc-090dcfda09bc-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"0e6b5555-e3e7-43c5-8fdc-090dcfda09bc\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 20 18:33:18 crc kubenswrapper[4773]: I0120 18:33:18.366211 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 20 18:33:18 crc kubenswrapper[4773]: I0120 18:33:18.612086 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Jan 20 18:33:18 crc kubenswrapper[4773]: W0120 18:33:18.622180 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod0e6b5555_e3e7_43c5_8fdc_090dcfda09bc.slice/crio-296da45aadcbb1bf8c349035858ff7b4e1059b1b8a266097feb34aadaf676510 WatchSource:0}: Error finding container 296da45aadcbb1bf8c349035858ff7b4e1059b1b8a266097feb34aadaf676510: Status 404 returned error can't find the container with id 296da45aadcbb1bf8c349035858ff7b4e1059b1b8a266097feb34aadaf676510 Jan 20 18:33:18 crc kubenswrapper[4773]: I0120 18:33:18.701277 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7fcf879f88-b2ff4"] Jan 20 18:33:18 crc kubenswrapper[4773]: W0120 18:33:18.709913 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4033f4d4_e3aa_40fd_b5f3_558833a6846d.slice/crio-90f66662d4a9236eb6999aa5e1e89c5b8f25f88adee39e793659bbf3a133e296 WatchSource:0}: Error finding container 90f66662d4a9236eb6999aa5e1e89c5b8f25f88adee39e793659bbf3a133e296: Status 404 returned error can't find the container with id 90f66662d4a9236eb6999aa5e1e89c5b8f25f88adee39e793659bbf3a133e296 Jan 20 18:33:18 crc kubenswrapper[4773]: I0120 18:33:18.711596 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"0e6b5555-e3e7-43c5-8fdc-090dcfda09bc","Type":"ContainerStarted","Data":"296da45aadcbb1bf8c349035858ff7b4e1059b1b8a266097feb34aadaf676510"} Jan 20 18:33:19 crc kubenswrapper[4773]: I0120 18:33:19.455445 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7efe70f9-78a3-4abf-b920-03868c3f9041" 
path="/var/lib/kubelet/pods/7efe70f9-78a3-4abf-b920-03868c3f9041/volumes" Jan 20 18:33:19 crc kubenswrapper[4773]: I0120 18:33:19.721783 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c8jjn" event={"ID":"48e32f25-29eb-4ef0-892b-0da316c47e3d","Type":"ContainerStarted","Data":"17d9f67433a276b33dd0f9ec1716d52105d5c885fdd1e9c3212c3baefad6b560"} Jan 20 18:33:19 crc kubenswrapper[4773]: I0120 18:33:19.725277 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"0e6b5555-e3e7-43c5-8fdc-090dcfda09bc","Type":"ContainerStarted","Data":"6c8071cb5501b88f60518fbf4d2fbb0ad26f5f049b5a5f7f02f1f0c839540f17"} Jan 20 18:33:19 crc kubenswrapper[4773]: I0120 18:33:19.727308 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7fcf879f88-b2ff4" event={"ID":"4033f4d4-e3aa-40fd-b5f3-558833a6846d","Type":"ContainerStarted","Data":"9759082e36dae7a1d2a9627cdbf59f5c2115171e900129b0eb79cd119b12b6d7"} Jan 20 18:33:19 crc kubenswrapper[4773]: I0120 18:33:19.727352 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7fcf879f88-b2ff4" event={"ID":"4033f4d4-e3aa-40fd-b5f3-558833a6846d","Type":"ContainerStarted","Data":"90f66662d4a9236eb6999aa5e1e89c5b8f25f88adee39e793659bbf3a133e296"} Jan 20 18:33:19 crc kubenswrapper[4773]: I0120 18:33:19.727691 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7fcf879f88-b2ff4" Jan 20 18:33:19 crc kubenswrapper[4773]: I0120 18:33:19.741946 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-c8jjn" podStartSLOduration=3.620695945 podStartE2EDuration="42.741897325s" podCreationTimestamp="2026-01-20 18:32:37 +0000 UTC" firstStartedPulling="2026-01-20 
18:32:39.052839029 +0000 UTC m=+151.974652053" lastFinishedPulling="2026-01-20 18:33:18.174040409 +0000 UTC m=+191.095853433" observedRunningTime="2026-01-20 18:33:19.741021853 +0000 UTC m=+192.662834877" watchObservedRunningTime="2026-01-20 18:33:19.741897325 +0000 UTC m=+192.663710349" Jan 20 18:33:19 crc kubenswrapper[4773]: I0120 18:33:19.762278 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7fcf879f88-b2ff4" podStartSLOduration=6.762257446 podStartE2EDuration="6.762257446s" podCreationTimestamp="2026-01-20 18:33:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:33:19.760914793 +0000 UTC m=+192.682727817" watchObservedRunningTime="2026-01-20 18:33:19.762257446 +0000 UTC m=+192.684070470" Jan 20 18:33:19 crc kubenswrapper[4773]: I0120 18:33:19.778323 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=1.778305242 podStartE2EDuration="1.778305242s" podCreationTimestamp="2026-01-20 18:33:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:33:19.776288412 +0000 UTC m=+192.698101436" watchObservedRunningTime="2026-01-20 18:33:19.778305242 +0000 UTC m=+192.700118256" Jan 20 18:33:19 crc kubenswrapper[4773]: I0120 18:33:19.884997 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7fcf879f88-b2ff4" Jan 20 18:33:20 crc kubenswrapper[4773]: I0120 18:33:20.734820 4773 generic.go:334] "Generic (PLEG): container finished" podID="0e6b5555-e3e7-43c5-8fdc-090dcfda09bc" containerID="6c8071cb5501b88f60518fbf4d2fbb0ad26f5f049b5a5f7f02f1f0c839540f17" exitCode=0 Jan 20 18:33:20 crc kubenswrapper[4773]: I0120 
18:33:20.734959 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"0e6b5555-e3e7-43c5-8fdc-090dcfda09bc","Type":"ContainerDied","Data":"6c8071cb5501b88f60518fbf4d2fbb0ad26f5f049b5a5f7f02f1f0c839540f17"} Jan 20 18:33:21 crc kubenswrapper[4773]: I0120 18:33:21.742502 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2qdpl" event={"ID":"8923f3c0-0b58-4097-aa87-9df34cf90e41","Type":"ContainerStarted","Data":"042139096fac743c4bffa6cf49536d998523d940f380114bd1fdad3392d0743d"} Jan 20 18:33:21 crc kubenswrapper[4773]: I0120 18:33:21.765640 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-2qdpl" podStartSLOduration=3.336568662 podStartE2EDuration="45.765613998s" podCreationTimestamp="2026-01-20 18:32:36 +0000 UTC" firstStartedPulling="2026-01-20 18:32:38.048052181 +0000 UTC m=+150.969865205" lastFinishedPulling="2026-01-20 18:33:20.477097517 +0000 UTC m=+193.398910541" observedRunningTime="2026-01-20 18:33:21.762625895 +0000 UTC m=+194.684438919" watchObservedRunningTime="2026-01-20 18:33:21.765613998 +0000 UTC m=+194.687427022" Jan 20 18:33:23 crc kubenswrapper[4773]: I0120 18:33:23.507380 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 20 18:33:23 crc kubenswrapper[4773]: I0120 18:33:23.694793 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0e6b5555-e3e7-43c5-8fdc-090dcfda09bc-kubelet-dir\") pod \"0e6b5555-e3e7-43c5-8fdc-090dcfda09bc\" (UID: \"0e6b5555-e3e7-43c5-8fdc-090dcfda09bc\") " Jan 20 18:33:23 crc kubenswrapper[4773]: I0120 18:33:23.695085 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0e6b5555-e3e7-43c5-8fdc-090dcfda09bc-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "0e6b5555-e3e7-43c5-8fdc-090dcfda09bc" (UID: "0e6b5555-e3e7-43c5-8fdc-090dcfda09bc"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 18:33:23 crc kubenswrapper[4773]: I0120 18:33:23.695800 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0e6b5555-e3e7-43c5-8fdc-090dcfda09bc-kube-api-access\") pod \"0e6b5555-e3e7-43c5-8fdc-090dcfda09bc\" (UID: \"0e6b5555-e3e7-43c5-8fdc-090dcfda09bc\") " Jan 20 18:33:23 crc kubenswrapper[4773]: I0120 18:33:23.696075 4773 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0e6b5555-e3e7-43c5-8fdc-090dcfda09bc-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 20 18:33:23 crc kubenswrapper[4773]: I0120 18:33:23.699517 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e6b5555-e3e7-43c5-8fdc-090dcfda09bc-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0e6b5555-e3e7-43c5-8fdc-090dcfda09bc" (UID: "0e6b5555-e3e7-43c5-8fdc-090dcfda09bc"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:33:23 crc kubenswrapper[4773]: I0120 18:33:23.753630 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"0e6b5555-e3e7-43c5-8fdc-090dcfda09bc","Type":"ContainerDied","Data":"296da45aadcbb1bf8c349035858ff7b4e1059b1b8a266097feb34aadaf676510"} Jan 20 18:33:23 crc kubenswrapper[4773]: I0120 18:33:23.753674 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="296da45aadcbb1bf8c349035858ff7b4e1059b1b8a266097feb34aadaf676510" Jan 20 18:33:23 crc kubenswrapper[4773]: I0120 18:33:23.753684 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 20 18:33:23 crc kubenswrapper[4773]: I0120 18:33:23.797272 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0e6b5555-e3e7-43c5-8fdc-090dcfda09bc-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 20 18:33:26 crc kubenswrapper[4773]: I0120 18:33:26.639776 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Jan 20 18:33:26 crc kubenswrapper[4773]: E0120 18:33:26.640548 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e6b5555-e3e7-43c5-8fdc-090dcfda09bc" containerName="pruner" Jan 20 18:33:26 crc kubenswrapper[4773]: I0120 18:33:26.640571 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e6b5555-e3e7-43c5-8fdc-090dcfda09bc" containerName="pruner" Jan 20 18:33:26 crc kubenswrapper[4773]: I0120 18:33:26.640774 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e6b5555-e3e7-43c5-8fdc-090dcfda09bc" containerName="pruner" Jan 20 18:33:26 crc kubenswrapper[4773]: I0120 18:33:26.641487 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 20 18:33:26 crc kubenswrapper[4773]: I0120 18:33:26.645088 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Jan 20 18:33:26 crc kubenswrapper[4773]: I0120 18:33:26.645415 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Jan 20 18:33:26 crc kubenswrapper[4773]: I0120 18:33:26.646305 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Jan 20 18:33:26 crc kubenswrapper[4773]: I0120 18:33:26.737016 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/29920243-d87d-49b3-9215-680935300c6e-kubelet-dir\") pod \"installer-9-crc\" (UID: \"29920243-d87d-49b3-9215-680935300c6e\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 20 18:33:26 crc kubenswrapper[4773]: I0120 18:33:26.737361 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/29920243-d87d-49b3-9215-680935300c6e-kube-api-access\") pod \"installer-9-crc\" (UID: \"29920243-d87d-49b3-9215-680935300c6e\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 20 18:33:26 crc kubenswrapper[4773]: I0120 18:33:26.737511 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/29920243-d87d-49b3-9215-680935300c6e-var-lock\") pod \"installer-9-crc\" (UID: \"29920243-d87d-49b3-9215-680935300c6e\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 20 18:33:26 crc kubenswrapper[4773]: I0120 18:33:26.838308 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/29920243-d87d-49b3-9215-680935300c6e-kubelet-dir\") pod \"installer-9-crc\" (UID: \"29920243-d87d-49b3-9215-680935300c6e\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 20 18:33:26 crc kubenswrapper[4773]: I0120 18:33:26.838565 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/29920243-d87d-49b3-9215-680935300c6e-kube-api-access\") pod \"installer-9-crc\" (UID: \"29920243-d87d-49b3-9215-680935300c6e\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 20 18:33:26 crc kubenswrapper[4773]: I0120 18:33:26.838665 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/29920243-d87d-49b3-9215-680935300c6e-var-lock\") pod \"installer-9-crc\" (UID: \"29920243-d87d-49b3-9215-680935300c6e\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 20 18:33:26 crc kubenswrapper[4773]: I0120 18:33:26.838417 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/29920243-d87d-49b3-9215-680935300c6e-kubelet-dir\") pod \"installer-9-crc\" (UID: \"29920243-d87d-49b3-9215-680935300c6e\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 20 18:33:26 crc kubenswrapper[4773]: I0120 18:33:26.838816 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/29920243-d87d-49b3-9215-680935300c6e-var-lock\") pod \"installer-9-crc\" (UID: \"29920243-d87d-49b3-9215-680935300c6e\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 20 18:33:26 crc kubenswrapper[4773]: I0120 18:33:26.860806 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/29920243-d87d-49b3-9215-680935300c6e-kube-api-access\") pod \"installer-9-crc\" (UID: \"29920243-d87d-49b3-9215-680935300c6e\") " 
pod="openshift-kube-apiserver/installer-9-crc" Jan 20 18:33:26 crc kubenswrapper[4773]: I0120 18:33:26.976602 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 20 18:33:27 crc kubenswrapper[4773]: I0120 18:33:27.129212 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-2qdpl" Jan 20 18:33:27 crc kubenswrapper[4773]: I0120 18:33:27.129491 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-2qdpl" Jan 20 18:33:27 crc kubenswrapper[4773]: I0120 18:33:27.223009 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-2qdpl" Jan 20 18:33:27 crc kubenswrapper[4773]: I0120 18:33:27.752352 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-c8jjn" Jan 20 18:33:27 crc kubenswrapper[4773]: I0120 18:33:27.752979 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-c8jjn" Jan 20 18:33:27 crc kubenswrapper[4773]: I0120 18:33:27.820452 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-2qdpl" Jan 20 18:33:27 crc kubenswrapper[4773]: I0120 18:33:27.823768 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-c8jjn" Jan 20 18:33:27 crc kubenswrapper[4773]: I0120 18:33:27.882513 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-c8jjn" Jan 20 18:33:28 crc kubenswrapper[4773]: I0120 18:33:28.170136 4773 patch_prober.go:28] interesting pod/machine-config-daemon-sq4x7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 18:33:28 crc kubenswrapper[4773]: I0120 18:33:28.170259 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 18:33:28 crc kubenswrapper[4773]: I0120 18:33:28.170403 4773 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" Jan 20 18:33:28 crc kubenswrapper[4773]: I0120 18:33:28.171296 4773 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3e8755cb7894ae2c1832f3b0b8d8a6e33d3d336d00ce68a2afe004d82304dd24"} pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 20 18:33:28 crc kubenswrapper[4773]: I0120 18:33:28.171566 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" containerID="cri-o://3e8755cb7894ae2c1832f3b0b8d8a6e33d3d336d00ce68a2afe004d82304dd24" gracePeriod=600 Jan 20 18:33:28 crc kubenswrapper[4773]: I0120 18:33:28.453087 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-c8jjn"] Jan 20 18:33:28 crc kubenswrapper[4773]: I0120 18:33:28.604469 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Jan 20 18:33:28 crc kubenswrapper[4773]: W0120 18:33:28.614012 4773 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-pod29920243_d87d_49b3_9215_680935300c6e.slice/crio-425d04b8ab5f331609cc0a966bf16d14189893447909935dd72ec164f756a917 WatchSource:0}: Error finding container 425d04b8ab5f331609cc0a966bf16d14189893447909935dd72ec164f756a917: Status 404 returned error can't find the container with id 425d04b8ab5f331609cc0a966bf16d14189893447909935dd72ec164f756a917 Jan 20 18:33:28 crc kubenswrapper[4773]: I0120 18:33:28.781330 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"29920243-d87d-49b3-9215-680935300c6e","Type":"ContainerStarted","Data":"425d04b8ab5f331609cc0a966bf16d14189893447909935dd72ec164f756a917"} Jan 20 18:33:29 crc kubenswrapper[4773]: I0120 18:33:29.788846 4773 generic.go:334] "Generic (PLEG): container finished" podID="1ddd934f-f012-4083-b5e6-b99711071621" containerID="3e8755cb7894ae2c1832f3b0b8d8a6e33d3d336d00ce68a2afe004d82304dd24" exitCode=0 Jan 20 18:33:29 crc kubenswrapper[4773]: I0120 18:33:29.788914 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" event={"ID":"1ddd934f-f012-4083-b5e6-b99711071621","Type":"ContainerDied","Data":"3e8755cb7894ae2c1832f3b0b8d8a6e33d3d336d00ce68a2afe004d82304dd24"} Jan 20 18:33:29 crc kubenswrapper[4773]: I0120 18:33:29.790662 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-c8jjn" podUID="48e32f25-29eb-4ef0-892b-0da316c47e3d" containerName="registry-server" containerID="cri-o://17d9f67433a276b33dd0f9ec1716d52105d5c885fdd1e9c3212c3baefad6b560" gracePeriod=2 Jan 20 18:33:30 crc kubenswrapper[4773]: I0120 18:33:30.797578 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kxsfk" event={"ID":"c19dbd84-8fec-4998-b2ae-65c68dee6b17","Type":"ContainerStarted","Data":"2dea0ae94ea0def3f88197b36ec9f2f15750ea1a3dac4ecfaba57a973830e745"} Jan 20 
18:33:30 crc kubenswrapper[4773]: I0120 18:33:30.799876 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fm4ln" event={"ID":"e5fd624a-2fa6-4887-83e0-779057846c71","Type":"ContainerStarted","Data":"405b8c5d9a4e0715743edb3ba54972440345c5dabbfc9ae56cb5cc61330344b1"} Jan 20 18:33:30 crc kubenswrapper[4773]: I0120 18:33:30.803337 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6lqws" event={"ID":"a6759422-151d-4228-b7c7-848c3008fb52","Type":"ContainerStarted","Data":"cd1e29ae1efbd4411fe4beb5e062e802fa9108e66c529ebbc0ee76daeb99b70b"} Jan 20 18:33:30 crc kubenswrapper[4773]: I0120 18:33:30.804803 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"29920243-d87d-49b3-9215-680935300c6e","Type":"ContainerStarted","Data":"87629fae04c4c3f2e756aceedc76b8a1d45cb15dd8a81bfaef042af220c86cad"} Jan 20 18:33:30 crc kubenswrapper[4773]: I0120 18:33:30.806783 4773 generic.go:334] "Generic (PLEG): container finished" podID="48e32f25-29eb-4ef0-892b-0da316c47e3d" containerID="17d9f67433a276b33dd0f9ec1716d52105d5c885fdd1e9c3212c3baefad6b560" exitCode=0 Jan 20 18:33:30 crc kubenswrapper[4773]: I0120 18:33:30.806838 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c8jjn" event={"ID":"48e32f25-29eb-4ef0-892b-0da316c47e3d","Type":"ContainerDied","Data":"17d9f67433a276b33dd0f9ec1716d52105d5c885fdd1e9c3212c3baefad6b560"} Jan 20 18:33:30 crc kubenswrapper[4773]: I0120 18:33:30.808544 4773 generic.go:334] "Generic (PLEG): container finished" podID="c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2" containerID="93be4aeabcc19519cd8451ce33f2117a534a92e9ae0b9b81378e69932d400b91" exitCode=0 Jan 20 18:33:30 crc kubenswrapper[4773]: I0120 18:33:30.808575 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w4bcd" 
event={"ID":"c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2","Type":"ContainerDied","Data":"93be4aeabcc19519cd8451ce33f2117a534a92e9ae0b9b81378e69932d400b91"} Jan 20 18:33:30 crc kubenswrapper[4773]: I0120 18:33:30.821306 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-fm4ln" podStartSLOduration=9.12255177 podStartE2EDuration="51.821259024s" podCreationTimestamp="2026-01-20 18:32:39 +0000 UTC" firstStartedPulling="2026-01-20 18:32:42.175419128 +0000 UTC m=+155.097232152" lastFinishedPulling="2026-01-20 18:33:24.874126392 +0000 UTC m=+197.795939406" observedRunningTime="2026-01-20 18:33:30.817488887 +0000 UTC m=+203.739301911" watchObservedRunningTime="2026-01-20 18:33:30.821259024 +0000 UTC m=+203.743072048" Jan 20 18:33:31 crc kubenswrapper[4773]: I0120 18:33:31.384680 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-c8jjn" Jan 20 18:33:31 crc kubenswrapper[4773]: I0120 18:33:31.404543 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-6lqws" podStartSLOduration=4.3001280170000005 podStartE2EDuration="54.404522848s" podCreationTimestamp="2026-01-20 18:32:37 +0000 UTC" firstStartedPulling="2026-01-20 18:32:38.030511719 +0000 UTC m=+150.952324733" lastFinishedPulling="2026-01-20 18:33:28.13490651 +0000 UTC m=+201.056719564" observedRunningTime="2026-01-20 18:33:30.839746086 +0000 UTC m=+203.761559110" watchObservedRunningTime="2026-01-20 18:33:31.404522848 +0000 UTC m=+204.326335872" Jan 20 18:33:31 crc kubenswrapper[4773]: I0120 18:33:31.524867 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48e32f25-29eb-4ef0-892b-0da316c47e3d-catalog-content\") pod \"48e32f25-29eb-4ef0-892b-0da316c47e3d\" (UID: \"48e32f25-29eb-4ef0-892b-0da316c47e3d\") " Jan 20 18:33:31 crc 
kubenswrapper[4773]: I0120 18:33:31.524939 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48e32f25-29eb-4ef0-892b-0da316c47e3d-utilities\") pod \"48e32f25-29eb-4ef0-892b-0da316c47e3d\" (UID: \"48e32f25-29eb-4ef0-892b-0da316c47e3d\") " Jan 20 18:33:31 crc kubenswrapper[4773]: I0120 18:33:31.524984 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gt4kw\" (UniqueName: \"kubernetes.io/projected/48e32f25-29eb-4ef0-892b-0da316c47e3d-kube-api-access-gt4kw\") pod \"48e32f25-29eb-4ef0-892b-0da316c47e3d\" (UID: \"48e32f25-29eb-4ef0-892b-0da316c47e3d\") " Jan 20 18:33:31 crc kubenswrapper[4773]: I0120 18:33:31.549473 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48e32f25-29eb-4ef0-892b-0da316c47e3d-utilities" (OuterVolumeSpecName: "utilities") pod "48e32f25-29eb-4ef0-892b-0da316c47e3d" (UID: "48e32f25-29eb-4ef0-892b-0da316c47e3d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:33:31 crc kubenswrapper[4773]: I0120 18:33:31.560205 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48e32f25-29eb-4ef0-892b-0da316c47e3d-kube-api-access-gt4kw" (OuterVolumeSpecName: "kube-api-access-gt4kw") pod "48e32f25-29eb-4ef0-892b-0da316c47e3d" (UID: "48e32f25-29eb-4ef0-892b-0da316c47e3d"). InnerVolumeSpecName "kube-api-access-gt4kw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:33:31 crc kubenswrapper[4773]: I0120 18:33:31.592698 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/48e32f25-29eb-4ef0-892b-0da316c47e3d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "48e32f25-29eb-4ef0-892b-0da316c47e3d" (UID: "48e32f25-29eb-4ef0-892b-0da316c47e3d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:33:31 crc kubenswrapper[4773]: I0120 18:33:31.626201 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gt4kw\" (UniqueName: \"kubernetes.io/projected/48e32f25-29eb-4ef0-892b-0da316c47e3d-kube-api-access-gt4kw\") on node \"crc\" DevicePath \"\"" Jan 20 18:33:31 crc kubenswrapper[4773]: I0120 18:33:31.626232 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/48e32f25-29eb-4ef0-892b-0da316c47e3d-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 20 18:33:31 crc kubenswrapper[4773]: I0120 18:33:31.626242 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/48e32f25-29eb-4ef0-892b-0da316c47e3d-utilities\") on node \"crc\" DevicePath \"\"" Jan 20 18:33:31 crc kubenswrapper[4773]: I0120 18:33:31.817266 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" event={"ID":"1ddd934f-f012-4083-b5e6-b99711071621","Type":"ContainerStarted","Data":"6fc4791a7cdd1fbda5ec9ded78a8be5e2c44fe7359d840cfe4a8ada84728d5d6"} Jan 20 18:33:31 crc kubenswrapper[4773]: I0120 18:33:31.819221 4773 generic.go:334] "Generic (PLEG): container finished" podID="074f367d-7a48-4046-a679-9a2d38111b8a" containerID="639cb92015efe00db0ab47ee3303403c16b478767a9030340d303516dfaf2e8d" exitCode=0 Jan 20 18:33:31 crc kubenswrapper[4773]: I0120 18:33:31.819276 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v75d6" event={"ID":"074f367d-7a48-4046-a679-9a2d38111b8a","Type":"ContainerDied","Data":"639cb92015efe00db0ab47ee3303403c16b478767a9030340d303516dfaf2e8d"} Jan 20 18:33:31 crc kubenswrapper[4773]: I0120 18:33:31.822009 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-c8jjn" 
event={"ID":"48e32f25-29eb-4ef0-892b-0da316c47e3d","Type":"ContainerDied","Data":"6b054655660c22424d85e72874df79577e584d37d89d639cfdd49a06b904b839"} Jan 20 18:33:31 crc kubenswrapper[4773]: I0120 18:33:31.822032 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-c8jjn" Jan 20 18:33:31 crc kubenswrapper[4773]: I0120 18:33:31.822088 4773 scope.go:117] "RemoveContainer" containerID="17d9f67433a276b33dd0f9ec1716d52105d5c885fdd1e9c3212c3baefad6b560" Jan 20 18:33:31 crc kubenswrapper[4773]: I0120 18:33:31.824392 4773 generic.go:334] "Generic (PLEG): container finished" podID="86872811-c0ef-45cc-949a-f88b07fca9b3" containerID="cb2e18dd484e64c198ef9b647ddbbc153979cecb8616ee0286eaae28b1264df8" exitCode=0 Jan 20 18:33:31 crc kubenswrapper[4773]: I0120 18:33:31.825085 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wmbvt" event={"ID":"86872811-c0ef-45cc-949a-f88b07fca9b3","Type":"ContainerDied","Data":"cb2e18dd484e64c198ef9b647ddbbc153979cecb8616ee0286eaae28b1264df8"} Jan 20 18:33:31 crc kubenswrapper[4773]: I0120 18:33:31.849493 4773 scope.go:117] "RemoveContainer" containerID="7365c0ca8367ca905e8c8072eaa82c4d3a22f3e79b23135efcc97f413135c4e2" Jan 20 18:33:31 crc kubenswrapper[4773]: I0120 18:33:31.897165 4773 scope.go:117] "RemoveContainer" containerID="40513f62f7dee095227f397c99bc022e2d2616f999a31014e1b18abdc02ed257" Jan 20 18:33:31 crc kubenswrapper[4773]: I0120 18:33:31.972146 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=5.972128104 podStartE2EDuration="5.972128104s" podCreationTimestamp="2026-01-20 18:33:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:33:31.96846414 +0000 UTC m=+204.890277164" watchObservedRunningTime="2026-01-20 
18:33:31.972128104 +0000 UTC m=+204.893941128" Jan 20 18:33:32 crc kubenswrapper[4773]: I0120 18:33:32.025590 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-kxsfk" podStartSLOduration=6.435064255 podStartE2EDuration="52.025570741s" podCreationTimestamp="2026-01-20 18:32:40 +0000 UTC" firstStartedPulling="2026-01-20 18:32:42.149969251 +0000 UTC m=+155.071782275" lastFinishedPulling="2026-01-20 18:33:27.740475697 +0000 UTC m=+200.662288761" observedRunningTime="2026-01-20 18:33:32.022140223 +0000 UTC m=+204.943953247" watchObservedRunningTime="2026-01-20 18:33:32.025570741 +0000 UTC m=+204.947383765" Jan 20 18:33:32 crc kubenswrapper[4773]: I0120 18:33:32.050971 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-c8jjn"] Jan 20 18:33:32 crc kubenswrapper[4773]: I0120 18:33:32.053576 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-c8jjn"] Jan 20 18:33:32 crc kubenswrapper[4773]: I0120 18:33:32.831179 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w4bcd" event={"ID":"c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2","Type":"ContainerStarted","Data":"f73bf4550870764bd71e43342e3a9b1626bd2febed0bfc8cfab7c1eb1fcc0b07"} Jan 20 18:33:32 crc kubenswrapper[4773]: I0120 18:33:32.835901 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v75d6" event={"ID":"074f367d-7a48-4046-a679-9a2d38111b8a","Type":"ContainerStarted","Data":"20afc5cca64346d7cedcc97043ec4130c07cbb4a3ad2fa66f0da171dcc8edd41"} Jan 20 18:33:32 crc kubenswrapper[4773]: I0120 18:33:32.839142 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wmbvt" event={"ID":"86872811-c0ef-45cc-949a-f88b07fca9b3","Type":"ContainerStarted","Data":"04e96ab730b30d52cb9070c2d14b8e4e2ea6e55ed01b501766b14e474b97b81f"} Jan 20 
18:33:32 crc kubenswrapper[4773]: I0120 18:33:32.856229 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-w4bcd" podStartSLOduration=3.749818164 podStartE2EDuration="54.856212287s" podCreationTimestamp="2026-01-20 18:32:38 +0000 UTC" firstStartedPulling="2026-01-20 18:32:41.147165033 +0000 UTC m=+154.068978057" lastFinishedPulling="2026-01-20 18:33:32.253559156 +0000 UTC m=+205.175372180" observedRunningTime="2026-01-20 18:33:32.854806821 +0000 UTC m=+205.776619845" watchObservedRunningTime="2026-01-20 18:33:32.856212287 +0000 UTC m=+205.778025311" Jan 20 18:33:32 crc kubenswrapper[4773]: I0120 18:33:32.875318 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-wmbvt" podStartSLOduration=2.684900787 podStartE2EDuration="53.875298536s" podCreationTimestamp="2026-01-20 18:32:39 +0000 UTC" firstStartedPulling="2026-01-20 18:32:41.100772039 +0000 UTC m=+154.022585063" lastFinishedPulling="2026-01-20 18:33:32.291169768 +0000 UTC m=+205.212982812" observedRunningTime="2026-01-20 18:33:32.873196681 +0000 UTC m=+205.795009715" watchObservedRunningTime="2026-01-20 18:33:32.875298536 +0000 UTC m=+205.797111560" Jan 20 18:33:32 crc kubenswrapper[4773]: I0120 18:33:32.891603 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-v75d6" podStartSLOduration=2.652243463 podStartE2EDuration="56.891579262s" podCreationTimestamp="2026-01-20 18:32:36 +0000 UTC" firstStartedPulling="2026-01-20 18:32:38.026421368 +0000 UTC m=+150.948234392" lastFinishedPulling="2026-01-20 18:33:32.265757167 +0000 UTC m=+205.187570191" observedRunningTime="2026-01-20 18:33:32.891259674 +0000 UTC m=+205.813072708" watchObservedRunningTime="2026-01-20 18:33:32.891579262 +0000 UTC m=+205.813392286" Jan 20 18:33:33 crc kubenswrapper[4773]: I0120 18:33:33.454286 4773 kubelet_volumes.go:163] "Cleaned up orphaned 
pod volumes dir" podUID="48e32f25-29eb-4ef0-892b-0da316c47e3d" path="/var/lib/kubelet/pods/48e32f25-29eb-4ef0-892b-0da316c47e3d/volumes" Jan 20 18:33:33 crc kubenswrapper[4773]: I0120 18:33:33.782241 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7b574fff6d-mnxtv"] Jan 20 18:33:33 crc kubenswrapper[4773]: I0120 18:33:33.782681 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7b574fff6d-mnxtv" podUID="383b1abe-3796-4b98-bb28-515ce7eafd6b" containerName="controller-manager" containerID="cri-o://919c3dc554513c2d70b1c0612b8d4f1f24089e976bf7e534847570d20270159d" gracePeriod=30 Jan 20 18:33:33 crc kubenswrapper[4773]: I0120 18:33:33.819203 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7fcf879f88-b2ff4"] Jan 20 18:33:33 crc kubenswrapper[4773]: I0120 18:33:33.819425 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-7fcf879f88-b2ff4" podUID="4033f4d4-e3aa-40fd-b5f3-558833a6846d" containerName="route-controller-manager" containerID="cri-o://9759082e36dae7a1d2a9627cdbf59f5c2115171e900129b0eb79cd119b12b6d7" gracePeriod=30 Jan 20 18:33:34 crc kubenswrapper[4773]: I0120 18:33:34.355563 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7fcf879f88-b2ff4" Jan 20 18:33:34 crc kubenswrapper[4773]: I0120 18:33:34.429418 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7b574fff6d-mnxtv" Jan 20 18:33:34 crc kubenswrapper[4773]: I0120 18:33:34.460787 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4033f4d4-e3aa-40fd-b5f3-558833a6846d-client-ca\") pod \"4033f4d4-e3aa-40fd-b5f3-558833a6846d\" (UID: \"4033f4d4-e3aa-40fd-b5f3-558833a6846d\") " Jan 20 18:33:34 crc kubenswrapper[4773]: I0120 18:33:34.460854 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4033f4d4-e3aa-40fd-b5f3-558833a6846d-config\") pod \"4033f4d4-e3aa-40fd-b5f3-558833a6846d\" (UID: \"4033f4d4-e3aa-40fd-b5f3-558833a6846d\") " Jan 20 18:33:34 crc kubenswrapper[4773]: I0120 18:33:34.461198 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8jhp5\" (UniqueName: \"kubernetes.io/projected/4033f4d4-e3aa-40fd-b5f3-558833a6846d-kube-api-access-8jhp5\") pod \"4033f4d4-e3aa-40fd-b5f3-558833a6846d\" (UID: \"4033f4d4-e3aa-40fd-b5f3-558833a6846d\") " Jan 20 18:33:34 crc kubenswrapper[4773]: I0120 18:33:34.461243 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/383b1abe-3796-4b98-bb28-515ce7eafd6b-client-ca\") pod \"383b1abe-3796-4b98-bb28-515ce7eafd6b\" (UID: \"383b1abe-3796-4b98-bb28-515ce7eafd6b\") " Jan 20 18:33:34 crc kubenswrapper[4773]: I0120 18:33:34.461270 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/383b1abe-3796-4b98-bb28-515ce7eafd6b-proxy-ca-bundles\") pod \"383b1abe-3796-4b98-bb28-515ce7eafd6b\" (UID: \"383b1abe-3796-4b98-bb28-515ce7eafd6b\") " Jan 20 18:33:34 crc kubenswrapper[4773]: I0120 18:33:34.461291 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/4033f4d4-e3aa-40fd-b5f3-558833a6846d-serving-cert\") pod \"4033f4d4-e3aa-40fd-b5f3-558833a6846d\" (UID: \"4033f4d4-e3aa-40fd-b5f3-558833a6846d\") " Jan 20 18:33:34 crc kubenswrapper[4773]: I0120 18:33:34.461322 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/383b1abe-3796-4b98-bb28-515ce7eafd6b-config\") pod \"383b1abe-3796-4b98-bb28-515ce7eafd6b\" (UID: \"383b1abe-3796-4b98-bb28-515ce7eafd6b\") " Jan 20 18:33:34 crc kubenswrapper[4773]: I0120 18:33:34.461794 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4033f4d4-e3aa-40fd-b5f3-558833a6846d-config" (OuterVolumeSpecName: "config") pod "4033f4d4-e3aa-40fd-b5f3-558833a6846d" (UID: "4033f4d4-e3aa-40fd-b5f3-558833a6846d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:33:34 crc kubenswrapper[4773]: I0120 18:33:34.461855 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/383b1abe-3796-4b98-bb28-515ce7eafd6b-client-ca" (OuterVolumeSpecName: "client-ca") pod "383b1abe-3796-4b98-bb28-515ce7eafd6b" (UID: "383b1abe-3796-4b98-bb28-515ce7eafd6b"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:33:34 crc kubenswrapper[4773]: I0120 18:33:34.461844 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/383b1abe-3796-4b98-bb28-515ce7eafd6b-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "383b1abe-3796-4b98-bb28-515ce7eafd6b" (UID: "383b1abe-3796-4b98-bb28-515ce7eafd6b"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:33:34 crc kubenswrapper[4773]: I0120 18:33:34.462064 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4033f4d4-e3aa-40fd-b5f3-558833a6846d-client-ca" (OuterVolumeSpecName: "client-ca") pod "4033f4d4-e3aa-40fd-b5f3-558833a6846d" (UID: "4033f4d4-e3aa-40fd-b5f3-558833a6846d"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:33:34 crc kubenswrapper[4773]: I0120 18:33:34.462092 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/383b1abe-3796-4b98-bb28-515ce7eafd6b-config" (OuterVolumeSpecName: "config") pod "383b1abe-3796-4b98-bb28-515ce7eafd6b" (UID: "383b1abe-3796-4b98-bb28-515ce7eafd6b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:33:34 crc kubenswrapper[4773]: I0120 18:33:34.462265 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b94np\" (UniqueName: \"kubernetes.io/projected/383b1abe-3796-4b98-bb28-515ce7eafd6b-kube-api-access-b94np\") pod \"383b1abe-3796-4b98-bb28-515ce7eafd6b\" (UID: \"383b1abe-3796-4b98-bb28-515ce7eafd6b\") " Jan 20 18:33:34 crc kubenswrapper[4773]: I0120 18:33:34.462655 4773 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4033f4d4-e3aa-40fd-b5f3-558833a6846d-client-ca\") on node \"crc\" DevicePath \"\"" Jan 20 18:33:34 crc kubenswrapper[4773]: I0120 18:33:34.462672 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4033f4d4-e3aa-40fd-b5f3-558833a6846d-config\") on node \"crc\" DevicePath \"\"" Jan 20 18:33:34 crc kubenswrapper[4773]: I0120 18:33:34.462681 4773 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/383b1abe-3796-4b98-bb28-515ce7eafd6b-client-ca\") on node \"crc\" DevicePath \"\"" Jan 20 18:33:34 crc kubenswrapper[4773]: I0120 18:33:34.462690 4773 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/383b1abe-3796-4b98-bb28-515ce7eafd6b-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 20 18:33:34 crc kubenswrapper[4773]: I0120 18:33:34.462702 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/383b1abe-3796-4b98-bb28-515ce7eafd6b-config\") on node \"crc\" DevicePath \"\"" Jan 20 18:33:34 crc kubenswrapper[4773]: I0120 18:33:34.466705 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/383b1abe-3796-4b98-bb28-515ce7eafd6b-kube-api-access-b94np" (OuterVolumeSpecName: "kube-api-access-b94np") pod "383b1abe-3796-4b98-bb28-515ce7eafd6b" (UID: "383b1abe-3796-4b98-bb28-515ce7eafd6b"). InnerVolumeSpecName "kube-api-access-b94np". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:33:34 crc kubenswrapper[4773]: I0120 18:33:34.466719 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4033f4d4-e3aa-40fd-b5f3-558833a6846d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "4033f4d4-e3aa-40fd-b5f3-558833a6846d" (UID: "4033f4d4-e3aa-40fd-b5f3-558833a6846d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:33:34 crc kubenswrapper[4773]: I0120 18:33:34.466744 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4033f4d4-e3aa-40fd-b5f3-558833a6846d-kube-api-access-8jhp5" (OuterVolumeSpecName: "kube-api-access-8jhp5") pod "4033f4d4-e3aa-40fd-b5f3-558833a6846d" (UID: "4033f4d4-e3aa-40fd-b5f3-558833a6846d"). InnerVolumeSpecName "kube-api-access-8jhp5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:33:34 crc kubenswrapper[4773]: I0120 18:33:34.563358 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/383b1abe-3796-4b98-bb28-515ce7eafd6b-serving-cert\") pod \"383b1abe-3796-4b98-bb28-515ce7eafd6b\" (UID: \"383b1abe-3796-4b98-bb28-515ce7eafd6b\") " Jan 20 18:33:34 crc kubenswrapper[4773]: I0120 18:33:34.563592 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b94np\" (UniqueName: \"kubernetes.io/projected/383b1abe-3796-4b98-bb28-515ce7eafd6b-kube-api-access-b94np\") on node \"crc\" DevicePath \"\"" Jan 20 18:33:34 crc kubenswrapper[4773]: I0120 18:33:34.563613 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8jhp5\" (UniqueName: \"kubernetes.io/projected/4033f4d4-e3aa-40fd-b5f3-558833a6846d-kube-api-access-8jhp5\") on node \"crc\" DevicePath \"\"" Jan 20 18:33:34 crc kubenswrapper[4773]: I0120 18:33:34.563627 4773 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4033f4d4-e3aa-40fd-b5f3-558833a6846d-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 20 18:33:34 crc kubenswrapper[4773]: I0120 18:33:34.566116 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/383b1abe-3796-4b98-bb28-515ce7eafd6b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "383b1abe-3796-4b98-bb28-515ce7eafd6b" (UID: "383b1abe-3796-4b98-bb28-515ce7eafd6b"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:33:34 crc kubenswrapper[4773]: I0120 18:33:34.664314 4773 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/383b1abe-3796-4b98-bb28-515ce7eafd6b-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 20 18:33:34 crc kubenswrapper[4773]: I0120 18:33:34.849422 4773 generic.go:334] "Generic (PLEG): container finished" podID="4033f4d4-e3aa-40fd-b5f3-558833a6846d" containerID="9759082e36dae7a1d2a9627cdbf59f5c2115171e900129b0eb79cd119b12b6d7" exitCode=0 Jan 20 18:33:34 crc kubenswrapper[4773]: I0120 18:33:34.849486 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7fcf879f88-b2ff4" event={"ID":"4033f4d4-e3aa-40fd-b5f3-558833a6846d","Type":"ContainerDied","Data":"9759082e36dae7a1d2a9627cdbf59f5c2115171e900129b0eb79cd119b12b6d7"} Jan 20 18:33:34 crc kubenswrapper[4773]: I0120 18:33:34.849729 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7fcf879f88-b2ff4" event={"ID":"4033f4d4-e3aa-40fd-b5f3-558833a6846d","Type":"ContainerDied","Data":"90f66662d4a9236eb6999aa5e1e89c5b8f25f88adee39e793659bbf3a133e296"} Jan 20 18:33:34 crc kubenswrapper[4773]: I0120 18:33:34.849492 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7fcf879f88-b2ff4" Jan 20 18:33:34 crc kubenswrapper[4773]: I0120 18:33:34.849761 4773 scope.go:117] "RemoveContainer" containerID="9759082e36dae7a1d2a9627cdbf59f5c2115171e900129b0eb79cd119b12b6d7" Jan 20 18:33:34 crc kubenswrapper[4773]: I0120 18:33:34.851880 4773 generic.go:334] "Generic (PLEG): container finished" podID="383b1abe-3796-4b98-bb28-515ce7eafd6b" containerID="919c3dc554513c2d70b1c0612b8d4f1f24089e976bf7e534847570d20270159d" exitCode=0 Jan 20 18:33:34 crc kubenswrapper[4773]: I0120 18:33:34.851919 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7b574fff6d-mnxtv" event={"ID":"383b1abe-3796-4b98-bb28-515ce7eafd6b","Type":"ContainerDied","Data":"919c3dc554513c2d70b1c0612b8d4f1f24089e976bf7e534847570d20270159d"} Jan 20 18:33:34 crc kubenswrapper[4773]: I0120 18:33:34.851962 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7b574fff6d-mnxtv" event={"ID":"383b1abe-3796-4b98-bb28-515ce7eafd6b","Type":"ContainerDied","Data":"90446c16d64bd4019cc2e68d6478f0e637b6dc8de10aa866b35345c2046c26a3"} Jan 20 18:33:34 crc kubenswrapper[4773]: I0120 18:33:34.852296 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7b574fff6d-mnxtv" Jan 20 18:33:34 crc kubenswrapper[4773]: I0120 18:33:34.864665 4773 scope.go:117] "RemoveContainer" containerID="9759082e36dae7a1d2a9627cdbf59f5c2115171e900129b0eb79cd119b12b6d7" Jan 20 18:33:34 crc kubenswrapper[4773]: E0120 18:33:34.865551 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9759082e36dae7a1d2a9627cdbf59f5c2115171e900129b0eb79cd119b12b6d7\": container with ID starting with 9759082e36dae7a1d2a9627cdbf59f5c2115171e900129b0eb79cd119b12b6d7 not found: ID does not exist" containerID="9759082e36dae7a1d2a9627cdbf59f5c2115171e900129b0eb79cd119b12b6d7" Jan 20 18:33:34 crc kubenswrapper[4773]: I0120 18:33:34.865610 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9759082e36dae7a1d2a9627cdbf59f5c2115171e900129b0eb79cd119b12b6d7"} err="failed to get container status \"9759082e36dae7a1d2a9627cdbf59f5c2115171e900129b0eb79cd119b12b6d7\": rpc error: code = NotFound desc = could not find container \"9759082e36dae7a1d2a9627cdbf59f5c2115171e900129b0eb79cd119b12b6d7\": container with ID starting with 9759082e36dae7a1d2a9627cdbf59f5c2115171e900129b0eb79cd119b12b6d7 not found: ID does not exist" Jan 20 18:33:34 crc kubenswrapper[4773]: I0120 18:33:34.865644 4773 scope.go:117] "RemoveContainer" containerID="919c3dc554513c2d70b1c0612b8d4f1f24089e976bf7e534847570d20270159d" Jan 20 18:33:34 crc kubenswrapper[4773]: I0120 18:33:34.881291 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7fcf879f88-b2ff4"] Jan 20 18:33:34 crc kubenswrapper[4773]: I0120 18:33:34.885761 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7fcf879f88-b2ff4"] Jan 20 18:33:34 crc kubenswrapper[4773]: I0120 18:33:34.886150 4773 scope.go:117] 
"RemoveContainer" containerID="919c3dc554513c2d70b1c0612b8d4f1f24089e976bf7e534847570d20270159d" Jan 20 18:33:34 crc kubenswrapper[4773]: E0120 18:33:34.887311 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"919c3dc554513c2d70b1c0612b8d4f1f24089e976bf7e534847570d20270159d\": container with ID starting with 919c3dc554513c2d70b1c0612b8d4f1f24089e976bf7e534847570d20270159d not found: ID does not exist" containerID="919c3dc554513c2d70b1c0612b8d4f1f24089e976bf7e534847570d20270159d" Jan 20 18:33:34 crc kubenswrapper[4773]: I0120 18:33:34.887356 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"919c3dc554513c2d70b1c0612b8d4f1f24089e976bf7e534847570d20270159d"} err="failed to get container status \"919c3dc554513c2d70b1c0612b8d4f1f24089e976bf7e534847570d20270159d\": rpc error: code = NotFound desc = could not find container \"919c3dc554513c2d70b1c0612b8d4f1f24089e976bf7e534847570d20270159d\": container with ID starting with 919c3dc554513c2d70b1c0612b8d4f1f24089e976bf7e534847570d20270159d not found: ID does not exist" Jan 20 18:33:34 crc kubenswrapper[4773]: I0120 18:33:34.893531 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7b574fff6d-mnxtv"] Jan 20 18:33:34 crc kubenswrapper[4773]: I0120 18:33:34.895782 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7b574fff6d-mnxtv"] Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.137352 4773 patch_prober.go:28] interesting pod/controller-manager-7b574fff6d-mnxtv container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.55:8443/healthz\": dial tcp 10.217.0.55:8443: i/o timeout" start-of-body= Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.137644 4773 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-controller-manager/controller-manager-7b574fff6d-mnxtv" podUID="383b1abe-3796-4b98-bb28-515ce7eafd6b" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.55:8443/healthz\": dial tcp 10.217.0.55:8443: i/o timeout" Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.454388 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="383b1abe-3796-4b98-bb28-515ce7eafd6b" path="/var/lib/kubelet/pods/383b1abe-3796-4b98-bb28-515ce7eafd6b/volumes" Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.455182 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4033f4d4-e3aa-40fd-b5f3-558833a6846d" path="/var/lib/kubelet/pods/4033f4d4-e3aa-40fd-b5f3-558833a6846d/volumes" Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.598413 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-9cd584848-gxtc2"] Jan 20 18:33:35 crc kubenswrapper[4773]: E0120 18:33:35.598643 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="383b1abe-3796-4b98-bb28-515ce7eafd6b" containerName="controller-manager" Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.598657 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="383b1abe-3796-4b98-bb28-515ce7eafd6b" containerName="controller-manager" Jan 20 18:33:35 crc kubenswrapper[4773]: E0120 18:33:35.598667 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48e32f25-29eb-4ef0-892b-0da316c47e3d" containerName="extract-content" Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.598673 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="48e32f25-29eb-4ef0-892b-0da316c47e3d" containerName="extract-content" Jan 20 18:33:35 crc kubenswrapper[4773]: E0120 18:33:35.598682 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48e32f25-29eb-4ef0-892b-0da316c47e3d" containerName="registry-server" Jan 20 18:33:35 crc 
kubenswrapper[4773]: I0120 18:33:35.598688 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="48e32f25-29eb-4ef0-892b-0da316c47e3d" containerName="registry-server" Jan 20 18:33:35 crc kubenswrapper[4773]: E0120 18:33:35.598707 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48e32f25-29eb-4ef0-892b-0da316c47e3d" containerName="extract-utilities" Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.598713 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="48e32f25-29eb-4ef0-892b-0da316c47e3d" containerName="extract-utilities" Jan 20 18:33:35 crc kubenswrapper[4773]: E0120 18:33:35.598724 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4033f4d4-e3aa-40fd-b5f3-558833a6846d" containerName="route-controller-manager" Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.598730 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="4033f4d4-e3aa-40fd-b5f3-558833a6846d" containerName="route-controller-manager" Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.598824 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="48e32f25-29eb-4ef0-892b-0da316c47e3d" containerName="registry-server" Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.598838 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="4033f4d4-e3aa-40fd-b5f3-558833a6846d" containerName="route-controller-manager" Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.598846 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="383b1abe-3796-4b98-bb28-515ce7eafd6b" containerName="controller-manager" Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.599277 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-9cd584848-gxtc2" Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.602061 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.602248 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.603052 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.603314 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.604872 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.606874 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.614792 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6549f94c47-dcb2l"] Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.615828 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6549f94c47-dcb2l" Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.618736 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.619202 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.619428 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.619651 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.620084 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.620277 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-9cd584848-gxtc2"] Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.629383 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.634615 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.646596 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6549f94c47-dcb2l"] Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.783313 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/12c12fbe-31ec-4b46-ace6-ac3451850070-proxy-ca-bundles\") pod \"controller-manager-6549f94c47-dcb2l\" (UID: \"12c12fbe-31ec-4b46-ace6-ac3451850070\") " pod="openshift-controller-manager/controller-manager-6549f94c47-dcb2l" Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.783478 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9wrq\" (UniqueName: \"kubernetes.io/projected/12c12fbe-31ec-4b46-ace6-ac3451850070-kube-api-access-h9wrq\") pod \"controller-manager-6549f94c47-dcb2l\" (UID: \"12c12fbe-31ec-4b46-ace6-ac3451850070\") " pod="openshift-controller-manager/controller-manager-6549f94c47-dcb2l" Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.783520 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckld2\" (UniqueName: \"kubernetes.io/projected/d65ab1d5-14b1-4f38-a627-ca6f00bb0b44-kube-api-access-ckld2\") pod \"route-controller-manager-9cd584848-gxtc2\" (UID: \"d65ab1d5-14b1-4f38-a627-ca6f00bb0b44\") " pod="openshift-route-controller-manager/route-controller-manager-9cd584848-gxtc2" Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.783546 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d65ab1d5-14b1-4f38-a627-ca6f00bb0b44-client-ca\") pod \"route-controller-manager-9cd584848-gxtc2\" (UID: \"d65ab1d5-14b1-4f38-a627-ca6f00bb0b44\") " pod="openshift-route-controller-manager/route-controller-manager-9cd584848-gxtc2" Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.783571 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/12c12fbe-31ec-4b46-ace6-ac3451850070-serving-cert\") pod \"controller-manager-6549f94c47-dcb2l\" (UID: \"12c12fbe-31ec-4b46-ace6-ac3451850070\") " 
pod="openshift-controller-manager/controller-manager-6549f94c47-dcb2l" Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.783594 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d65ab1d5-14b1-4f38-a627-ca6f00bb0b44-config\") pod \"route-controller-manager-9cd584848-gxtc2\" (UID: \"d65ab1d5-14b1-4f38-a627-ca6f00bb0b44\") " pod="openshift-route-controller-manager/route-controller-manager-9cd584848-gxtc2" Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.783618 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d65ab1d5-14b1-4f38-a627-ca6f00bb0b44-serving-cert\") pod \"route-controller-manager-9cd584848-gxtc2\" (UID: \"d65ab1d5-14b1-4f38-a627-ca6f00bb0b44\") " pod="openshift-route-controller-manager/route-controller-manager-9cd584848-gxtc2" Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.783747 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/12c12fbe-31ec-4b46-ace6-ac3451850070-client-ca\") pod \"controller-manager-6549f94c47-dcb2l\" (UID: \"12c12fbe-31ec-4b46-ace6-ac3451850070\") " pod="openshift-controller-manager/controller-manager-6549f94c47-dcb2l" Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.783859 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12c12fbe-31ec-4b46-ace6-ac3451850070-config\") pod \"controller-manager-6549f94c47-dcb2l\" (UID: \"12c12fbe-31ec-4b46-ace6-ac3451850070\") " pod="openshift-controller-manager/controller-manager-6549f94c47-dcb2l" Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.885193 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/d65ab1d5-14b1-4f38-a627-ca6f00bb0b44-config\") pod \"route-controller-manager-9cd584848-gxtc2\" (UID: \"d65ab1d5-14b1-4f38-a627-ca6f00bb0b44\") " pod="openshift-route-controller-manager/route-controller-manager-9cd584848-gxtc2" Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.885260 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d65ab1d5-14b1-4f38-a627-ca6f00bb0b44-serving-cert\") pod \"route-controller-manager-9cd584848-gxtc2\" (UID: \"d65ab1d5-14b1-4f38-a627-ca6f00bb0b44\") " pod="openshift-route-controller-manager/route-controller-manager-9cd584848-gxtc2" Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.885291 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/12c12fbe-31ec-4b46-ace6-ac3451850070-client-ca\") pod \"controller-manager-6549f94c47-dcb2l\" (UID: \"12c12fbe-31ec-4b46-ace6-ac3451850070\") " pod="openshift-controller-manager/controller-manager-6549f94c47-dcb2l" Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.885339 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12c12fbe-31ec-4b46-ace6-ac3451850070-config\") pod \"controller-manager-6549f94c47-dcb2l\" (UID: \"12c12fbe-31ec-4b46-ace6-ac3451850070\") " pod="openshift-controller-manager/controller-manager-6549f94c47-dcb2l" Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.885381 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/12c12fbe-31ec-4b46-ace6-ac3451850070-proxy-ca-bundles\") pod \"controller-manager-6549f94c47-dcb2l\" (UID: \"12c12fbe-31ec-4b46-ace6-ac3451850070\") " pod="openshift-controller-manager/controller-manager-6549f94c47-dcb2l" Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.885477 4773 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h9wrq\" (UniqueName: \"kubernetes.io/projected/12c12fbe-31ec-4b46-ace6-ac3451850070-kube-api-access-h9wrq\") pod \"controller-manager-6549f94c47-dcb2l\" (UID: \"12c12fbe-31ec-4b46-ace6-ac3451850070\") " pod="openshift-controller-manager/controller-manager-6549f94c47-dcb2l" Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.885514 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckld2\" (UniqueName: \"kubernetes.io/projected/d65ab1d5-14b1-4f38-a627-ca6f00bb0b44-kube-api-access-ckld2\") pod \"route-controller-manager-9cd584848-gxtc2\" (UID: \"d65ab1d5-14b1-4f38-a627-ca6f00bb0b44\") " pod="openshift-route-controller-manager/route-controller-manager-9cd584848-gxtc2" Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.885535 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d65ab1d5-14b1-4f38-a627-ca6f00bb0b44-client-ca\") pod \"route-controller-manager-9cd584848-gxtc2\" (UID: \"d65ab1d5-14b1-4f38-a627-ca6f00bb0b44\") " pod="openshift-route-controller-manager/route-controller-manager-9cd584848-gxtc2" Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.885552 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/12c12fbe-31ec-4b46-ace6-ac3451850070-serving-cert\") pod \"controller-manager-6549f94c47-dcb2l\" (UID: \"12c12fbe-31ec-4b46-ace6-ac3451850070\") " pod="openshift-controller-manager/controller-manager-6549f94c47-dcb2l" Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.886912 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/12c12fbe-31ec-4b46-ace6-ac3451850070-client-ca\") pod \"controller-manager-6549f94c47-dcb2l\" (UID: \"12c12fbe-31ec-4b46-ace6-ac3451850070\") " 
pod="openshift-controller-manager/controller-manager-6549f94c47-dcb2l" Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.886959 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d65ab1d5-14b1-4f38-a627-ca6f00bb0b44-config\") pod \"route-controller-manager-9cd584848-gxtc2\" (UID: \"d65ab1d5-14b1-4f38-a627-ca6f00bb0b44\") " pod="openshift-route-controller-manager/route-controller-manager-9cd584848-gxtc2" Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.887483 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d65ab1d5-14b1-4f38-a627-ca6f00bb0b44-client-ca\") pod \"route-controller-manager-9cd584848-gxtc2\" (UID: \"d65ab1d5-14b1-4f38-a627-ca6f00bb0b44\") " pod="openshift-route-controller-manager/route-controller-manager-9cd584848-gxtc2" Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.888265 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/12c12fbe-31ec-4b46-ace6-ac3451850070-proxy-ca-bundles\") pod \"controller-manager-6549f94c47-dcb2l\" (UID: \"12c12fbe-31ec-4b46-ace6-ac3451850070\") " pod="openshift-controller-manager/controller-manager-6549f94c47-dcb2l" Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.888300 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12c12fbe-31ec-4b46-ace6-ac3451850070-config\") pod \"controller-manager-6549f94c47-dcb2l\" (UID: \"12c12fbe-31ec-4b46-ace6-ac3451850070\") " pod="openshift-controller-manager/controller-manager-6549f94c47-dcb2l" Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.891560 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/12c12fbe-31ec-4b46-ace6-ac3451850070-serving-cert\") pod 
\"controller-manager-6549f94c47-dcb2l\" (UID: \"12c12fbe-31ec-4b46-ace6-ac3451850070\") " pod="openshift-controller-manager/controller-manager-6549f94c47-dcb2l" Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.891728 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d65ab1d5-14b1-4f38-a627-ca6f00bb0b44-serving-cert\") pod \"route-controller-manager-9cd584848-gxtc2\" (UID: \"d65ab1d5-14b1-4f38-a627-ca6f00bb0b44\") " pod="openshift-route-controller-manager/route-controller-manager-9cd584848-gxtc2" Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.900826 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckld2\" (UniqueName: \"kubernetes.io/projected/d65ab1d5-14b1-4f38-a627-ca6f00bb0b44-kube-api-access-ckld2\") pod \"route-controller-manager-9cd584848-gxtc2\" (UID: \"d65ab1d5-14b1-4f38-a627-ca6f00bb0b44\") " pod="openshift-route-controller-manager/route-controller-manager-9cd584848-gxtc2" Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.909610 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h9wrq\" (UniqueName: \"kubernetes.io/projected/12c12fbe-31ec-4b46-ace6-ac3451850070-kube-api-access-h9wrq\") pod \"controller-manager-6549f94c47-dcb2l\" (UID: \"12c12fbe-31ec-4b46-ace6-ac3451850070\") " pod="openshift-controller-manager/controller-manager-6549f94c47-dcb2l" Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.930652 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-9cd584848-gxtc2" Jan 20 18:33:35 crc kubenswrapper[4773]: I0120 18:33:35.935812 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6549f94c47-dcb2l" Jan 20 18:33:36 crc kubenswrapper[4773]: I0120 18:33:36.365969 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-9cd584848-gxtc2"] Jan 20 18:33:36 crc kubenswrapper[4773]: W0120 18:33:36.374591 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd65ab1d5_14b1_4f38_a627_ca6f00bb0b44.slice/crio-c1527878b5d63bc877a41560d037696931df58e57b3c0379b8a0a0fa203a1634 WatchSource:0}: Error finding container c1527878b5d63bc877a41560d037696931df58e57b3c0379b8a0a0fa203a1634: Status 404 returned error can't find the container with id c1527878b5d63bc877a41560d037696931df58e57b3c0379b8a0a0fa203a1634 Jan 20 18:33:36 crc kubenswrapper[4773]: I0120 18:33:36.423494 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6549f94c47-dcb2l"] Jan 20 18:33:36 crc kubenswrapper[4773]: I0120 18:33:36.875156 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6549f94c47-dcb2l" event={"ID":"12c12fbe-31ec-4b46-ace6-ac3451850070","Type":"ContainerStarted","Data":"0d0c05a7620d0b2d084778129f9283d19b008663b280d13e07e0f66abacb3b84"} Jan 20 18:33:36 crc kubenswrapper[4773]: I0120 18:33:36.876757 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-9cd584848-gxtc2" event={"ID":"d65ab1d5-14b1-4f38-a627-ca6f00bb0b44","Type":"ContainerStarted","Data":"c1527878b5d63bc877a41560d037696931df58e57b3c0379b8a0a0fa203a1634"} Jan 20 18:33:37 crc kubenswrapper[4773]: I0120 18:33:37.312879 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-v75d6" Jan 20 18:33:37 crc kubenswrapper[4773]: I0120 18:33:37.313477 4773 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-v75d6" Jan 20 18:33:37 crc kubenswrapper[4773]: I0120 18:33:37.350230 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-v75d6" Jan 20 18:33:37 crc kubenswrapper[4773]: I0120 18:33:37.572219 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-6lqws" Jan 20 18:33:37 crc kubenswrapper[4773]: I0120 18:33:37.572857 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-6lqws" Jan 20 18:33:37 crc kubenswrapper[4773]: I0120 18:33:37.610356 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-6lqws" Jan 20 18:33:37 crc kubenswrapper[4773]: I0120 18:33:37.883918 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-9cd584848-gxtc2" event={"ID":"d65ab1d5-14b1-4f38-a627-ca6f00bb0b44","Type":"ContainerStarted","Data":"6f53468db68f0afdfbd68160f14dc3314340b8339f575bbfbcd97e069ad06660"} Jan 20 18:33:37 crc kubenswrapper[4773]: I0120 18:33:37.884104 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-9cd584848-gxtc2" Jan 20 18:33:37 crc kubenswrapper[4773]: I0120 18:33:37.885891 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6549f94c47-dcb2l" event={"ID":"12c12fbe-31ec-4b46-ace6-ac3451850070","Type":"ContainerStarted","Data":"cbbb8755d258b67f8bd177e191acc5b63c630dfd818ae558026c74b2112af5e9"} Jan 20 18:33:37 crc kubenswrapper[4773]: I0120 18:33:37.891844 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-9cd584848-gxtc2" Jan 20 
18:33:37 crc kubenswrapper[4773]: I0120 18:33:37.906593 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-9cd584848-gxtc2" podStartSLOduration=4.906568064 podStartE2EDuration="4.906568064s" podCreationTimestamp="2026-01-20 18:33:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:33:37.900631892 +0000 UTC m=+210.822444916" watchObservedRunningTime="2026-01-20 18:33:37.906568064 +0000 UTC m=+210.828381098" Jan 20 18:33:37 crc kubenswrapper[4773]: I0120 18:33:37.932148 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-6lqws" Jan 20 18:33:37 crc kubenswrapper[4773]: I0120 18:33:37.937558 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-v75d6" Jan 20 18:33:37 crc kubenswrapper[4773]: I0120 18:33:37.949392 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6549f94c47-dcb2l" podStartSLOduration=4.949367949 podStartE2EDuration="4.949367949s" podCreationTimestamp="2026-01-20 18:33:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:33:37.947407849 +0000 UTC m=+210.869220893" watchObservedRunningTime="2026-01-20 18:33:37.949367949 +0000 UTC m=+210.871180973" Jan 20 18:33:38 crc kubenswrapper[4773]: I0120 18:33:38.891826 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6549f94c47-dcb2l" Jan 20 18:33:38 crc kubenswrapper[4773]: I0120 18:33:38.899813 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6549f94c47-dcb2l" Jan 20 
18:33:39 crc kubenswrapper[4773]: I0120 18:33:39.260530 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-w4bcd" Jan 20 18:33:39 crc kubenswrapper[4773]: I0120 18:33:39.260793 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-w4bcd" Jan 20 18:33:39 crc kubenswrapper[4773]: I0120 18:33:39.304551 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-w4bcd" Jan 20 18:33:39 crc kubenswrapper[4773]: I0120 18:33:39.456431 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6lqws"] Jan 20 18:33:39 crc kubenswrapper[4773]: I0120 18:33:39.714726 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-wmbvt" Jan 20 18:33:39 crc kubenswrapper[4773]: I0120 18:33:39.715133 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-wmbvt" Jan 20 18:33:39 crc kubenswrapper[4773]: I0120 18:33:39.754529 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-wmbvt" Jan 20 18:33:39 crc kubenswrapper[4773]: I0120 18:33:39.935064 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-w4bcd" Jan 20 18:33:39 crc kubenswrapper[4773]: I0120 18:33:39.942860 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-wmbvt" Jan 20 18:33:40 crc kubenswrapper[4773]: I0120 18:33:40.297491 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-fm4ln" Jan 20 18:33:40 crc kubenswrapper[4773]: I0120 18:33:40.298428 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/redhat-operators-fm4ln" Jan 20 18:33:40 crc kubenswrapper[4773]: I0120 18:33:40.343357 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-fm4ln" Jan 20 18:33:40 crc kubenswrapper[4773]: I0120 18:33:40.700289 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-kxsfk" Jan 20 18:33:40 crc kubenswrapper[4773]: I0120 18:33:40.700365 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-kxsfk" Jan 20 18:33:40 crc kubenswrapper[4773]: I0120 18:33:40.732568 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-kxsfk" Jan 20 18:33:40 crc kubenswrapper[4773]: I0120 18:33:40.901854 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-6lqws" podUID="a6759422-151d-4228-b7c7-848c3008fb52" containerName="registry-server" containerID="cri-o://cd1e29ae1efbd4411fe4beb5e062e802fa9108e66c529ebbc0ee76daeb99b70b" gracePeriod=2 Jan 20 18:33:40 crc kubenswrapper[4773]: I0120 18:33:40.938078 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-fm4ln" Jan 20 18:33:40 crc kubenswrapper[4773]: I0120 18:33:40.944041 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-kxsfk" Jan 20 18:33:41 crc kubenswrapper[4773]: I0120 18:33:41.791197 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-6lqws" Jan 20 18:33:41 crc kubenswrapper[4773]: I0120 18:33:41.850462 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wmbvt"] Jan 20 18:33:41 crc kubenswrapper[4773]: I0120 18:33:41.908672 4773 generic.go:334] "Generic (PLEG): container finished" podID="a6759422-151d-4228-b7c7-848c3008fb52" containerID="cd1e29ae1efbd4411fe4beb5e062e802fa9108e66c529ebbc0ee76daeb99b70b" exitCode=0 Jan 20 18:33:41 crc kubenswrapper[4773]: I0120 18:33:41.908711 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6lqws" Jan 20 18:33:41 crc kubenswrapper[4773]: I0120 18:33:41.908748 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6lqws" event={"ID":"a6759422-151d-4228-b7c7-848c3008fb52","Type":"ContainerDied","Data":"cd1e29ae1efbd4411fe4beb5e062e802fa9108e66c529ebbc0ee76daeb99b70b"} Jan 20 18:33:41 crc kubenswrapper[4773]: I0120 18:33:41.908829 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6lqws" event={"ID":"a6759422-151d-4228-b7c7-848c3008fb52","Type":"ContainerDied","Data":"c471e82ddefc652e5f04653639951233ffc24c82258453c65af9fd4352ec0b51"} Jan 20 18:33:41 crc kubenswrapper[4773]: I0120 18:33:41.908856 4773 scope.go:117] "RemoveContainer" containerID="cd1e29ae1efbd4411fe4beb5e062e802fa9108e66c529ebbc0ee76daeb99b70b" Jan 20 18:33:41 crc kubenswrapper[4773]: I0120 18:33:41.909011 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-wmbvt" podUID="86872811-c0ef-45cc-949a-f88b07fca9b3" containerName="registry-server" containerID="cri-o://04e96ab730b30d52cb9070c2d14b8e4e2ea6e55ed01b501766b14e474b97b81f" gracePeriod=2 Jan 20 18:33:41 crc kubenswrapper[4773]: I0120 18:33:41.927737 4773 scope.go:117] "RemoveContainer" 
containerID="884fc85a8c28dd6ef2b5723787900c9ee03eec9f0e79faf98b5af3174b512138" Jan 20 18:33:41 crc kubenswrapper[4773]: I0120 18:33:41.941965 4773 scope.go:117] "RemoveContainer" containerID="ad0fe037ab9b571978fa707e8f0256764da9f94c8eebd275377cc7ce8fab608d" Jan 20 18:33:41 crc kubenswrapper[4773]: I0120 18:33:41.963811 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6759422-151d-4228-b7c7-848c3008fb52-catalog-content\") pod \"a6759422-151d-4228-b7c7-848c3008fb52\" (UID: \"a6759422-151d-4228-b7c7-848c3008fb52\") " Jan 20 18:33:41 crc kubenswrapper[4773]: I0120 18:33:41.963887 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6759422-151d-4228-b7c7-848c3008fb52-utilities\") pod \"a6759422-151d-4228-b7c7-848c3008fb52\" (UID: \"a6759422-151d-4228-b7c7-848c3008fb52\") " Jan 20 18:33:41 crc kubenswrapper[4773]: I0120 18:33:41.964051 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pkldm\" (UniqueName: \"kubernetes.io/projected/a6759422-151d-4228-b7c7-848c3008fb52-kube-api-access-pkldm\") pod \"a6759422-151d-4228-b7c7-848c3008fb52\" (UID: \"a6759422-151d-4228-b7c7-848c3008fb52\") " Jan 20 18:33:41 crc kubenswrapper[4773]: I0120 18:33:41.965519 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6759422-151d-4228-b7c7-848c3008fb52-utilities" (OuterVolumeSpecName: "utilities") pod "a6759422-151d-4228-b7c7-848c3008fb52" (UID: "a6759422-151d-4228-b7c7-848c3008fb52"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:33:41 crc kubenswrapper[4773]: I0120 18:33:41.970529 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a6759422-151d-4228-b7c7-848c3008fb52-kube-api-access-pkldm" (OuterVolumeSpecName: "kube-api-access-pkldm") pod "a6759422-151d-4228-b7c7-848c3008fb52" (UID: "a6759422-151d-4228-b7c7-848c3008fb52"). InnerVolumeSpecName "kube-api-access-pkldm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:33:42 crc kubenswrapper[4773]: I0120 18:33:42.024497 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a6759422-151d-4228-b7c7-848c3008fb52-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a6759422-151d-4228-b7c7-848c3008fb52" (UID: "a6759422-151d-4228-b7c7-848c3008fb52"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:33:42 crc kubenswrapper[4773]: I0120 18:33:42.056718 4773 scope.go:117] "RemoveContainer" containerID="cd1e29ae1efbd4411fe4beb5e062e802fa9108e66c529ebbc0ee76daeb99b70b" Jan 20 18:33:42 crc kubenswrapper[4773]: E0120 18:33:42.057271 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd1e29ae1efbd4411fe4beb5e062e802fa9108e66c529ebbc0ee76daeb99b70b\": container with ID starting with cd1e29ae1efbd4411fe4beb5e062e802fa9108e66c529ebbc0ee76daeb99b70b not found: ID does not exist" containerID="cd1e29ae1efbd4411fe4beb5e062e802fa9108e66c529ebbc0ee76daeb99b70b" Jan 20 18:33:42 crc kubenswrapper[4773]: I0120 18:33:42.057316 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd1e29ae1efbd4411fe4beb5e062e802fa9108e66c529ebbc0ee76daeb99b70b"} err="failed to get container status \"cd1e29ae1efbd4411fe4beb5e062e802fa9108e66c529ebbc0ee76daeb99b70b\": rpc error: code = NotFound desc = could not find 
container \"cd1e29ae1efbd4411fe4beb5e062e802fa9108e66c529ebbc0ee76daeb99b70b\": container with ID starting with cd1e29ae1efbd4411fe4beb5e062e802fa9108e66c529ebbc0ee76daeb99b70b not found: ID does not exist" Jan 20 18:33:42 crc kubenswrapper[4773]: I0120 18:33:42.057343 4773 scope.go:117] "RemoveContainer" containerID="884fc85a8c28dd6ef2b5723787900c9ee03eec9f0e79faf98b5af3174b512138" Jan 20 18:33:42 crc kubenswrapper[4773]: E0120 18:33:42.057870 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"884fc85a8c28dd6ef2b5723787900c9ee03eec9f0e79faf98b5af3174b512138\": container with ID starting with 884fc85a8c28dd6ef2b5723787900c9ee03eec9f0e79faf98b5af3174b512138 not found: ID does not exist" containerID="884fc85a8c28dd6ef2b5723787900c9ee03eec9f0e79faf98b5af3174b512138" Jan 20 18:33:42 crc kubenswrapper[4773]: I0120 18:33:42.057951 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"884fc85a8c28dd6ef2b5723787900c9ee03eec9f0e79faf98b5af3174b512138"} err="failed to get container status \"884fc85a8c28dd6ef2b5723787900c9ee03eec9f0e79faf98b5af3174b512138\": rpc error: code = NotFound desc = could not find container \"884fc85a8c28dd6ef2b5723787900c9ee03eec9f0e79faf98b5af3174b512138\": container with ID starting with 884fc85a8c28dd6ef2b5723787900c9ee03eec9f0e79faf98b5af3174b512138 not found: ID does not exist" Jan 20 18:33:42 crc kubenswrapper[4773]: I0120 18:33:42.057979 4773 scope.go:117] "RemoveContainer" containerID="ad0fe037ab9b571978fa707e8f0256764da9f94c8eebd275377cc7ce8fab608d" Jan 20 18:33:42 crc kubenswrapper[4773]: E0120 18:33:42.058495 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad0fe037ab9b571978fa707e8f0256764da9f94c8eebd275377cc7ce8fab608d\": container with ID starting with ad0fe037ab9b571978fa707e8f0256764da9f94c8eebd275377cc7ce8fab608d not found: ID does 
not exist" containerID="ad0fe037ab9b571978fa707e8f0256764da9f94c8eebd275377cc7ce8fab608d" Jan 20 18:33:42 crc kubenswrapper[4773]: I0120 18:33:42.058519 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad0fe037ab9b571978fa707e8f0256764da9f94c8eebd275377cc7ce8fab608d"} err="failed to get container status \"ad0fe037ab9b571978fa707e8f0256764da9f94c8eebd275377cc7ce8fab608d\": rpc error: code = NotFound desc = could not find container \"ad0fe037ab9b571978fa707e8f0256764da9f94c8eebd275377cc7ce8fab608d\": container with ID starting with ad0fe037ab9b571978fa707e8f0256764da9f94c8eebd275377cc7ce8fab608d not found: ID does not exist" Jan 20 18:33:42 crc kubenswrapper[4773]: I0120 18:33:42.065795 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a6759422-151d-4228-b7c7-848c3008fb52-utilities\") on node \"crc\" DevicePath \"\"" Jan 20 18:33:42 crc kubenswrapper[4773]: I0120 18:33:42.065833 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pkldm\" (UniqueName: \"kubernetes.io/projected/a6759422-151d-4228-b7c7-848c3008fb52-kube-api-access-pkldm\") on node \"crc\" DevicePath \"\"" Jan 20 18:33:42 crc kubenswrapper[4773]: I0120 18:33:42.065848 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a6759422-151d-4228-b7c7-848c3008fb52-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 20 18:33:42 crc kubenswrapper[4773]: I0120 18:33:42.233749 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-6lqws"] Jan 20 18:33:42 crc kubenswrapper[4773]: I0120 18:33:42.239738 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-6lqws"] Jan 20 18:33:42 crc kubenswrapper[4773]: I0120 18:33:42.300377 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wmbvt" Jan 20 18:33:42 crc kubenswrapper[4773]: I0120 18:33:42.470209 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86872811-c0ef-45cc-949a-f88b07fca9b3-catalog-content\") pod \"86872811-c0ef-45cc-949a-f88b07fca9b3\" (UID: \"86872811-c0ef-45cc-949a-f88b07fca9b3\") " Jan 20 18:33:42 crc kubenswrapper[4773]: I0120 18:33:42.470383 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86872811-c0ef-45cc-949a-f88b07fca9b3-utilities\") pod \"86872811-c0ef-45cc-949a-f88b07fca9b3\" (UID: \"86872811-c0ef-45cc-949a-f88b07fca9b3\") " Jan 20 18:33:42 crc kubenswrapper[4773]: I0120 18:33:42.470578 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s9wnc\" (UniqueName: \"kubernetes.io/projected/86872811-c0ef-45cc-949a-f88b07fca9b3-kube-api-access-s9wnc\") pod \"86872811-c0ef-45cc-949a-f88b07fca9b3\" (UID: \"86872811-c0ef-45cc-949a-f88b07fca9b3\") " Jan 20 18:33:42 crc kubenswrapper[4773]: I0120 18:33:42.471244 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86872811-c0ef-45cc-949a-f88b07fca9b3-utilities" (OuterVolumeSpecName: "utilities") pod "86872811-c0ef-45cc-949a-f88b07fca9b3" (UID: "86872811-c0ef-45cc-949a-f88b07fca9b3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:33:42 crc kubenswrapper[4773]: I0120 18:33:42.473222 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86872811-c0ef-45cc-949a-f88b07fca9b3-kube-api-access-s9wnc" (OuterVolumeSpecName: "kube-api-access-s9wnc") pod "86872811-c0ef-45cc-949a-f88b07fca9b3" (UID: "86872811-c0ef-45cc-949a-f88b07fca9b3"). InnerVolumeSpecName "kube-api-access-s9wnc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:33:42 crc kubenswrapper[4773]: I0120 18:33:42.495001 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/86872811-c0ef-45cc-949a-f88b07fca9b3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "86872811-c0ef-45cc-949a-f88b07fca9b3" (UID: "86872811-c0ef-45cc-949a-f88b07fca9b3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:33:42 crc kubenswrapper[4773]: I0120 18:33:42.572404 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s9wnc\" (UniqueName: \"kubernetes.io/projected/86872811-c0ef-45cc-949a-f88b07fca9b3-kube-api-access-s9wnc\") on node \"crc\" DevicePath \"\"" Jan 20 18:33:42 crc kubenswrapper[4773]: I0120 18:33:42.572438 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/86872811-c0ef-45cc-949a-f88b07fca9b3-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 20 18:33:42 crc kubenswrapper[4773]: I0120 18:33:42.572448 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/86872811-c0ef-45cc-949a-f88b07fca9b3-utilities\") on node \"crc\" DevicePath \"\"" Jan 20 18:33:42 crc kubenswrapper[4773]: I0120 18:33:42.854650 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-xpsls" podUID="cf25ec9b-96c5-4129-958f-35acbc34a20d" containerName="oauth-openshift" containerID="cri-o://87cd7110ca4c65f7ea756d111fc7910caeaf8153a86ed7a2e3928b2f034f84bc" gracePeriod=15 Jan 20 18:33:42 crc kubenswrapper[4773]: I0120 18:33:42.917691 4773 generic.go:334] "Generic (PLEG): container finished" podID="86872811-c0ef-45cc-949a-f88b07fca9b3" containerID="04e96ab730b30d52cb9070c2d14b8e4e2ea6e55ed01b501766b14e474b97b81f" exitCode=0 Jan 20 18:33:42 crc kubenswrapper[4773]: I0120 
18:33:42.917872 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wmbvt" Jan 20 18:33:42 crc kubenswrapper[4773]: I0120 18:33:42.917873 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wmbvt" event={"ID":"86872811-c0ef-45cc-949a-f88b07fca9b3","Type":"ContainerDied","Data":"04e96ab730b30d52cb9070c2d14b8e4e2ea6e55ed01b501766b14e474b97b81f"} Jan 20 18:33:42 crc kubenswrapper[4773]: I0120 18:33:42.917941 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wmbvt" event={"ID":"86872811-c0ef-45cc-949a-f88b07fca9b3","Type":"ContainerDied","Data":"a50cda98715d09fbf90414a684934d7ed74aa3bda7711266d5d7491e5d03b79c"} Jan 20 18:33:42 crc kubenswrapper[4773]: I0120 18:33:42.917961 4773 scope.go:117] "RemoveContainer" containerID="04e96ab730b30d52cb9070c2d14b8e4e2ea6e55ed01b501766b14e474b97b81f" Jan 20 18:33:42 crc kubenswrapper[4773]: I0120 18:33:42.930784 4773 scope.go:117] "RemoveContainer" containerID="cb2e18dd484e64c198ef9b647ddbbc153979cecb8616ee0286eaae28b1264df8" Jan 20 18:33:42 crc kubenswrapper[4773]: I0120 18:33:42.948047 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wmbvt"] Jan 20 18:33:42 crc kubenswrapper[4773]: I0120 18:33:42.948447 4773 scope.go:117] "RemoveContainer" containerID="0ed7d4e03be8e4e7e30eec67a848d197be3f010293512ae29f5c3f5788e52545" Jan 20 18:33:42 crc kubenswrapper[4773]: I0120 18:33:42.951357 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-wmbvt"] Jan 20 18:33:42 crc kubenswrapper[4773]: I0120 18:33:42.960490 4773 scope.go:117] "RemoveContainer" containerID="04e96ab730b30d52cb9070c2d14b8e4e2ea6e55ed01b501766b14e474b97b81f" Jan 20 18:33:42 crc kubenswrapper[4773]: E0120 18:33:42.960884 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"04e96ab730b30d52cb9070c2d14b8e4e2ea6e55ed01b501766b14e474b97b81f\": container with ID starting with 04e96ab730b30d52cb9070c2d14b8e4e2ea6e55ed01b501766b14e474b97b81f not found: ID does not exist" containerID="04e96ab730b30d52cb9070c2d14b8e4e2ea6e55ed01b501766b14e474b97b81f" Jan 20 18:33:42 crc kubenswrapper[4773]: I0120 18:33:42.960925 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04e96ab730b30d52cb9070c2d14b8e4e2ea6e55ed01b501766b14e474b97b81f"} err="failed to get container status \"04e96ab730b30d52cb9070c2d14b8e4e2ea6e55ed01b501766b14e474b97b81f\": rpc error: code = NotFound desc = could not find container \"04e96ab730b30d52cb9070c2d14b8e4e2ea6e55ed01b501766b14e474b97b81f\": container with ID starting with 04e96ab730b30d52cb9070c2d14b8e4e2ea6e55ed01b501766b14e474b97b81f not found: ID does not exist" Jan 20 18:33:42 crc kubenswrapper[4773]: I0120 18:33:42.960965 4773 scope.go:117] "RemoveContainer" containerID="cb2e18dd484e64c198ef9b647ddbbc153979cecb8616ee0286eaae28b1264df8" Jan 20 18:33:42 crc kubenswrapper[4773]: E0120 18:33:42.962210 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb2e18dd484e64c198ef9b647ddbbc153979cecb8616ee0286eaae28b1264df8\": container with ID starting with cb2e18dd484e64c198ef9b647ddbbc153979cecb8616ee0286eaae28b1264df8 not found: ID does not exist" containerID="cb2e18dd484e64c198ef9b647ddbbc153979cecb8616ee0286eaae28b1264df8" Jan 20 18:33:42 crc kubenswrapper[4773]: I0120 18:33:42.962295 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb2e18dd484e64c198ef9b647ddbbc153979cecb8616ee0286eaae28b1264df8"} err="failed to get container status \"cb2e18dd484e64c198ef9b647ddbbc153979cecb8616ee0286eaae28b1264df8\": rpc error: code = NotFound desc = could not find container 
\"cb2e18dd484e64c198ef9b647ddbbc153979cecb8616ee0286eaae28b1264df8\": container with ID starting with cb2e18dd484e64c198ef9b647ddbbc153979cecb8616ee0286eaae28b1264df8 not found: ID does not exist" Jan 20 18:33:42 crc kubenswrapper[4773]: I0120 18:33:42.962368 4773 scope.go:117] "RemoveContainer" containerID="0ed7d4e03be8e4e7e30eec67a848d197be3f010293512ae29f5c3f5788e52545" Jan 20 18:33:42 crc kubenswrapper[4773]: E0120 18:33:42.962847 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0ed7d4e03be8e4e7e30eec67a848d197be3f010293512ae29f5c3f5788e52545\": container with ID starting with 0ed7d4e03be8e4e7e30eec67a848d197be3f010293512ae29f5c3f5788e52545 not found: ID does not exist" containerID="0ed7d4e03be8e4e7e30eec67a848d197be3f010293512ae29f5c3f5788e52545" Jan 20 18:33:42 crc kubenswrapper[4773]: I0120 18:33:42.962867 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0ed7d4e03be8e4e7e30eec67a848d197be3f010293512ae29f5c3f5788e52545"} err="failed to get container status \"0ed7d4e03be8e4e7e30eec67a848d197be3f010293512ae29f5c3f5788e52545\": rpc error: code = NotFound desc = could not find container \"0ed7d4e03be8e4e7e30eec67a848d197be3f010293512ae29f5c3f5788e52545\": container with ID starting with 0ed7d4e03be8e4e7e30eec67a848d197be3f010293512ae29f5c3f5788e52545 not found: ID does not exist" Jan 20 18:33:43 crc kubenswrapper[4773]: I0120 18:33:43.456118 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86872811-c0ef-45cc-949a-f88b07fca9b3" path="/var/lib/kubelet/pods/86872811-c0ef-45cc-949a-f88b07fca9b3/volumes" Jan 20 18:33:43 crc kubenswrapper[4773]: I0120 18:33:43.457376 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a6759422-151d-4228-b7c7-848c3008fb52" path="/var/lib/kubelet/pods/a6759422-151d-4228-b7c7-848c3008fb52/volumes" Jan 20 18:33:43 crc kubenswrapper[4773]: I0120 18:33:43.779686 
4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-xpsls" Jan 20 18:33:43 crc kubenswrapper[4773]: I0120 18:33:43.888044 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-system-ocp-branding-template\") pod \"cf25ec9b-96c5-4129-958f-35acbc34a20d\" (UID: \"cf25ec9b-96c5-4129-958f-35acbc34a20d\") " Jan 20 18:33:43 crc kubenswrapper[4773]: I0120 18:33:43.888094 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-system-cliconfig\") pod \"cf25ec9b-96c5-4129-958f-35acbc34a20d\" (UID: \"cf25ec9b-96c5-4129-958f-35acbc34a20d\") " Jan 20 18:33:43 crc kubenswrapper[4773]: I0120 18:33:43.888119 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-user-template-provider-selection\") pod \"cf25ec9b-96c5-4129-958f-35acbc34a20d\" (UID: \"cf25ec9b-96c5-4129-958f-35acbc34a20d\") " Jan 20 18:33:43 crc kubenswrapper[4773]: I0120 18:33:43.888168 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4p5ft\" (UniqueName: \"kubernetes.io/projected/cf25ec9b-96c5-4129-958f-35acbc34a20d-kube-api-access-4p5ft\") pod \"cf25ec9b-96c5-4129-958f-35acbc34a20d\" (UID: \"cf25ec9b-96c5-4129-958f-35acbc34a20d\") " Jan 20 18:33:43 crc kubenswrapper[4773]: I0120 18:33:43.888186 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cf25ec9b-96c5-4129-958f-35acbc34a20d-audit-dir\") pod 
\"cf25ec9b-96c5-4129-958f-35acbc34a20d\" (UID: \"cf25ec9b-96c5-4129-958f-35acbc34a20d\") " Jan 20 18:33:43 crc kubenswrapper[4773]: I0120 18:33:43.888203 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-system-service-ca\") pod \"cf25ec9b-96c5-4129-958f-35acbc34a20d\" (UID: \"cf25ec9b-96c5-4129-958f-35acbc34a20d\") " Jan 20 18:33:43 crc kubenswrapper[4773]: I0120 18:33:43.888303 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cf25ec9b-96c5-4129-958f-35acbc34a20d-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "cf25ec9b-96c5-4129-958f-35acbc34a20d" (UID: "cf25ec9b-96c5-4129-958f-35acbc34a20d"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 18:33:43 crc kubenswrapper[4773]: I0120 18:33:43.888236 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/cf25ec9b-96c5-4129-958f-35acbc34a20d-audit-policies\") pod \"cf25ec9b-96c5-4129-958f-35acbc34a20d\" (UID: \"cf25ec9b-96c5-4129-958f-35acbc34a20d\") " Jan 20 18:33:43 crc kubenswrapper[4773]: I0120 18:33:43.888354 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-system-session\") pod \"cf25ec9b-96c5-4129-958f-35acbc34a20d\" (UID: \"cf25ec9b-96c5-4129-958f-35acbc34a20d\") " Jan 20 18:33:43 crc kubenswrapper[4773]: I0120 18:33:43.888447 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-system-serving-cert\") pod \"cf25ec9b-96c5-4129-958f-35acbc34a20d\" (UID: 
\"cf25ec9b-96c5-4129-958f-35acbc34a20d\") " Jan 20 18:33:43 crc kubenswrapper[4773]: I0120 18:33:43.888464 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-user-template-error\") pod \"cf25ec9b-96c5-4129-958f-35acbc34a20d\" (UID: \"cf25ec9b-96c5-4129-958f-35acbc34a20d\") " Jan 20 18:33:43 crc kubenswrapper[4773]: I0120 18:33:43.888483 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-user-template-login\") pod \"cf25ec9b-96c5-4129-958f-35acbc34a20d\" (UID: \"cf25ec9b-96c5-4129-958f-35acbc34a20d\") " Jan 20 18:33:43 crc kubenswrapper[4773]: I0120 18:33:43.888502 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-system-trusted-ca-bundle\") pod \"cf25ec9b-96c5-4129-958f-35acbc34a20d\" (UID: \"cf25ec9b-96c5-4129-958f-35acbc34a20d\") " Jan 20 18:33:43 crc kubenswrapper[4773]: I0120 18:33:43.888524 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-user-idp-0-file-data\") pod \"cf25ec9b-96c5-4129-958f-35acbc34a20d\" (UID: \"cf25ec9b-96c5-4129-958f-35acbc34a20d\") " Jan 20 18:33:43 crc kubenswrapper[4773]: I0120 18:33:43.888547 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-system-router-certs\") pod \"cf25ec9b-96c5-4129-958f-35acbc34a20d\" (UID: \"cf25ec9b-96c5-4129-958f-35acbc34a20d\") " Jan 20 18:33:43 
crc kubenswrapper[4773]: I0120 18:33:43.888735 4773 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cf25ec9b-96c5-4129-958f-35acbc34a20d-audit-dir\") on node \"crc\" DevicePath \"\"" Jan 20 18:33:43 crc kubenswrapper[4773]: I0120 18:33:43.889718 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf25ec9b-96c5-4129-958f-35acbc34a20d-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "cf25ec9b-96c5-4129-958f-35acbc34a20d" (UID: "cf25ec9b-96c5-4129-958f-35acbc34a20d"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:33:43 crc kubenswrapper[4773]: I0120 18:33:43.889744 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "cf25ec9b-96c5-4129-958f-35acbc34a20d" (UID: "cf25ec9b-96c5-4129-958f-35acbc34a20d"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:33:43 crc kubenswrapper[4773]: I0120 18:33:43.890072 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "cf25ec9b-96c5-4129-958f-35acbc34a20d" (UID: "cf25ec9b-96c5-4129-958f-35acbc34a20d"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:33:43 crc kubenswrapper[4773]: I0120 18:33:43.890488 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "cf25ec9b-96c5-4129-958f-35acbc34a20d" (UID: "cf25ec9b-96c5-4129-958f-35acbc34a20d"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:33:43 crc kubenswrapper[4773]: I0120 18:33:43.892322 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "cf25ec9b-96c5-4129-958f-35acbc34a20d" (UID: "cf25ec9b-96c5-4129-958f-35acbc34a20d"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:33:43 crc kubenswrapper[4773]: I0120 18:33:43.893296 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf25ec9b-96c5-4129-958f-35acbc34a20d-kube-api-access-4p5ft" (OuterVolumeSpecName: "kube-api-access-4p5ft") pod "cf25ec9b-96c5-4129-958f-35acbc34a20d" (UID: "cf25ec9b-96c5-4129-958f-35acbc34a20d"). InnerVolumeSpecName "kube-api-access-4p5ft". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:33:43 crc kubenswrapper[4773]: I0120 18:33:43.897047 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "cf25ec9b-96c5-4129-958f-35acbc34a20d" (UID: "cf25ec9b-96c5-4129-958f-35acbc34a20d"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:33:43 crc kubenswrapper[4773]: I0120 18:33:43.897256 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "cf25ec9b-96c5-4129-958f-35acbc34a20d" (UID: "cf25ec9b-96c5-4129-958f-35acbc34a20d"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:33:43 crc kubenswrapper[4773]: I0120 18:33:43.897408 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "cf25ec9b-96c5-4129-958f-35acbc34a20d" (UID: "cf25ec9b-96c5-4129-958f-35acbc34a20d"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:33:43 crc kubenswrapper[4773]: I0120 18:33:43.897475 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "cf25ec9b-96c5-4129-958f-35acbc34a20d" (UID: "cf25ec9b-96c5-4129-958f-35acbc34a20d"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:33:43 crc kubenswrapper[4773]: I0120 18:33:43.897590 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "cf25ec9b-96c5-4129-958f-35acbc34a20d" (UID: "cf25ec9b-96c5-4129-958f-35acbc34a20d"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:33:43 crc kubenswrapper[4773]: I0120 18:33:43.897979 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "cf25ec9b-96c5-4129-958f-35acbc34a20d" (UID: "cf25ec9b-96c5-4129-958f-35acbc34a20d"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:33:43 crc kubenswrapper[4773]: I0120 18:33:43.901165 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "cf25ec9b-96c5-4129-958f-35acbc34a20d" (UID: "cf25ec9b-96c5-4129-958f-35acbc34a20d"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:33:43 crc kubenswrapper[4773]: I0120 18:33:43.927693 4773 generic.go:334] "Generic (PLEG): container finished" podID="cf25ec9b-96c5-4129-958f-35acbc34a20d" containerID="87cd7110ca4c65f7ea756d111fc7910caeaf8153a86ed7a2e3928b2f034f84bc" exitCode=0 Jan 20 18:33:43 crc kubenswrapper[4773]: I0120 18:33:43.927745 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-xpsls" Jan 20 18:33:43 crc kubenswrapper[4773]: I0120 18:33:43.927793 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-xpsls" event={"ID":"cf25ec9b-96c5-4129-958f-35acbc34a20d","Type":"ContainerDied","Data":"87cd7110ca4c65f7ea756d111fc7910caeaf8153a86ed7a2e3928b2f034f84bc"} Jan 20 18:33:43 crc kubenswrapper[4773]: I0120 18:33:43.927854 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-xpsls" event={"ID":"cf25ec9b-96c5-4129-958f-35acbc34a20d","Type":"ContainerDied","Data":"4f519175ce1269f87277348e7dbf3cb7cac77cd634d740bc906a3ed7230ae289"} Jan 20 18:33:43 crc kubenswrapper[4773]: I0120 18:33:43.927883 4773 scope.go:117] "RemoveContainer" containerID="87cd7110ca4c65f7ea756d111fc7910caeaf8153a86ed7a2e3928b2f034f84bc" Jan 20 18:33:43 crc kubenswrapper[4773]: I0120 18:33:43.950476 4773 scope.go:117] "RemoveContainer" containerID="87cd7110ca4c65f7ea756d111fc7910caeaf8153a86ed7a2e3928b2f034f84bc" Jan 20 18:33:43 crc kubenswrapper[4773]: E0120 18:33:43.954534 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87cd7110ca4c65f7ea756d111fc7910caeaf8153a86ed7a2e3928b2f034f84bc\": container with ID starting with 87cd7110ca4c65f7ea756d111fc7910caeaf8153a86ed7a2e3928b2f034f84bc not found: ID does not exist" containerID="87cd7110ca4c65f7ea756d111fc7910caeaf8153a86ed7a2e3928b2f034f84bc" Jan 20 18:33:43 crc kubenswrapper[4773]: I0120 18:33:43.954572 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87cd7110ca4c65f7ea756d111fc7910caeaf8153a86ed7a2e3928b2f034f84bc"} err="failed to get container status \"87cd7110ca4c65f7ea756d111fc7910caeaf8153a86ed7a2e3928b2f034f84bc\": rpc error: code = NotFound desc = could not find container 
\"87cd7110ca4c65f7ea756d111fc7910caeaf8153a86ed7a2e3928b2f034f84bc\": container with ID starting with 87cd7110ca4c65f7ea756d111fc7910caeaf8153a86ed7a2e3928b2f034f84bc not found: ID does not exist" Jan 20 18:33:43 crc kubenswrapper[4773]: I0120 18:33:43.958859 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-xpsls"] Jan 20 18:33:43 crc kubenswrapper[4773]: I0120 18:33:43.961860 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-xpsls"] Jan 20 18:33:43 crc kubenswrapper[4773]: I0120 18:33:43.989560 4773 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Jan 20 18:33:43 crc kubenswrapper[4773]: I0120 18:33:43.989590 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4p5ft\" (UniqueName: \"kubernetes.io/projected/cf25ec9b-96c5-4129-958f-35acbc34a20d-kube-api-access-4p5ft\") on node \"crc\" DevicePath \"\"" Jan 20 18:33:43 crc kubenswrapper[4773]: I0120 18:33:43.989604 4773 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Jan 20 18:33:43 crc kubenswrapper[4773]: I0120 18:33:43.989614 4773 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/cf25ec9b-96c5-4129-958f-35acbc34a20d-audit-policies\") on node \"crc\" DevicePath \"\"" Jan 20 18:33:43 crc kubenswrapper[4773]: I0120 18:33:43.989625 4773 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-system-session\") on node 
\"crc\" DevicePath \"\"" Jan 20 18:33:43 crc kubenswrapper[4773]: I0120 18:33:43.989635 4773 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 20 18:33:43 crc kubenswrapper[4773]: I0120 18:33:43.989644 4773 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Jan 20 18:33:43 crc kubenswrapper[4773]: I0120 18:33:43.989657 4773 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Jan 20 18:33:43 crc kubenswrapper[4773]: I0120 18:33:43.989667 4773 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 18:33:43 crc kubenswrapper[4773]: I0120 18:33:43.989676 4773 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Jan 20 18:33:43 crc kubenswrapper[4773]: I0120 18:33:43.989685 4773 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Jan 20 18:33:43 crc kubenswrapper[4773]: I0120 18:33:43.989694 4773 reconciler_common.go:293] "Volume detached for volume 
\"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Jan 20 18:33:43 crc kubenswrapper[4773]: I0120 18:33:43.989703 4773 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/cf25ec9b-96c5-4129-958f-35acbc34a20d-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Jan 20 18:33:44 crc kubenswrapper[4773]: I0120 18:33:44.254082 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kxsfk"] Jan 20 18:33:44 crc kubenswrapper[4773]: I0120 18:33:44.255000 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-kxsfk" podUID="c19dbd84-8fec-4998-b2ae-65c68dee6b17" containerName="registry-server" containerID="cri-o://2dea0ae94ea0def3f88197b36ec9f2f15750ea1a3dac4ecfaba57a973830e745" gracePeriod=2 Jan 20 18:33:44 crc kubenswrapper[4773]: I0120 18:33:44.703301 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-kxsfk" Jan 20 18:33:44 crc kubenswrapper[4773]: I0120 18:33:44.799660 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c19dbd84-8fec-4998-b2ae-65c68dee6b17-utilities\") pod \"c19dbd84-8fec-4998-b2ae-65c68dee6b17\" (UID: \"c19dbd84-8fec-4998-b2ae-65c68dee6b17\") " Jan 20 18:33:44 crc kubenswrapper[4773]: I0120 18:33:44.799755 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c19dbd84-8fec-4998-b2ae-65c68dee6b17-catalog-content\") pod \"c19dbd84-8fec-4998-b2ae-65c68dee6b17\" (UID: \"c19dbd84-8fec-4998-b2ae-65c68dee6b17\") " Jan 20 18:33:44 crc kubenswrapper[4773]: I0120 18:33:44.799785 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-txldk\" (UniqueName: \"kubernetes.io/projected/c19dbd84-8fec-4998-b2ae-65c68dee6b17-kube-api-access-txldk\") pod \"c19dbd84-8fec-4998-b2ae-65c68dee6b17\" (UID: \"c19dbd84-8fec-4998-b2ae-65c68dee6b17\") " Jan 20 18:33:44 crc kubenswrapper[4773]: I0120 18:33:44.800727 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c19dbd84-8fec-4998-b2ae-65c68dee6b17-utilities" (OuterVolumeSpecName: "utilities") pod "c19dbd84-8fec-4998-b2ae-65c68dee6b17" (UID: "c19dbd84-8fec-4998-b2ae-65c68dee6b17"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:33:44 crc kubenswrapper[4773]: I0120 18:33:44.803982 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c19dbd84-8fec-4998-b2ae-65c68dee6b17-kube-api-access-txldk" (OuterVolumeSpecName: "kube-api-access-txldk") pod "c19dbd84-8fec-4998-b2ae-65c68dee6b17" (UID: "c19dbd84-8fec-4998-b2ae-65c68dee6b17"). InnerVolumeSpecName "kube-api-access-txldk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:33:44 crc kubenswrapper[4773]: I0120 18:33:44.901286 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-txldk\" (UniqueName: \"kubernetes.io/projected/c19dbd84-8fec-4998-b2ae-65c68dee6b17-kube-api-access-txldk\") on node \"crc\" DevicePath \"\"" Jan 20 18:33:44 crc kubenswrapper[4773]: I0120 18:33:44.901315 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c19dbd84-8fec-4998-b2ae-65c68dee6b17-utilities\") on node \"crc\" DevicePath \"\"" Jan 20 18:33:44 crc kubenswrapper[4773]: I0120 18:33:44.913199 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c19dbd84-8fec-4998-b2ae-65c68dee6b17-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c19dbd84-8fec-4998-b2ae-65c68dee6b17" (UID: "c19dbd84-8fec-4998-b2ae-65c68dee6b17"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:33:44 crc kubenswrapper[4773]: I0120 18:33:44.933965 4773 generic.go:334] "Generic (PLEG): container finished" podID="c19dbd84-8fec-4998-b2ae-65c68dee6b17" containerID="2dea0ae94ea0def3f88197b36ec9f2f15750ea1a3dac4ecfaba57a973830e745" exitCode=0 Jan 20 18:33:44 crc kubenswrapper[4773]: I0120 18:33:44.934039 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kxsfk" event={"ID":"c19dbd84-8fec-4998-b2ae-65c68dee6b17","Type":"ContainerDied","Data":"2dea0ae94ea0def3f88197b36ec9f2f15750ea1a3dac4ecfaba57a973830e745"} Jan 20 18:33:44 crc kubenswrapper[4773]: I0120 18:33:44.934107 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kxsfk" event={"ID":"c19dbd84-8fec-4998-b2ae-65c68dee6b17","Type":"ContainerDied","Data":"c69c9f7c333b88336518be53c3fff5b901d87e988f1ab9a73e541d27f61cbb78"} Jan 20 18:33:44 crc kubenswrapper[4773]: I0120 18:33:44.934126 
4773 scope.go:117] "RemoveContainer" containerID="2dea0ae94ea0def3f88197b36ec9f2f15750ea1a3dac4ecfaba57a973830e745" Jan 20 18:33:44 crc kubenswrapper[4773]: I0120 18:33:44.934438 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kxsfk" Jan 20 18:33:44 crc kubenswrapper[4773]: I0120 18:33:44.950179 4773 scope.go:117] "RemoveContainer" containerID="2224e4d25524fddc1486e82299478cff3e69b71640d782d27370469785e93088" Jan 20 18:33:44 crc kubenswrapper[4773]: I0120 18:33:44.964202 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-kxsfk"] Jan 20 18:33:44 crc kubenswrapper[4773]: I0120 18:33:44.967606 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-kxsfk"] Jan 20 18:33:44 crc kubenswrapper[4773]: I0120 18:33:44.986180 4773 scope.go:117] "RemoveContainer" containerID="7b03e6ef28eada3e366c7e455756b7d0d82d0920656be25f02d2e3fd00ca9cd3" Jan 20 18:33:44 crc kubenswrapper[4773]: I0120 18:33:44.997380 4773 scope.go:117] "RemoveContainer" containerID="2dea0ae94ea0def3f88197b36ec9f2f15750ea1a3dac4ecfaba57a973830e745" Jan 20 18:33:44 crc kubenswrapper[4773]: E0120 18:33:44.998366 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2dea0ae94ea0def3f88197b36ec9f2f15750ea1a3dac4ecfaba57a973830e745\": container with ID starting with 2dea0ae94ea0def3f88197b36ec9f2f15750ea1a3dac4ecfaba57a973830e745 not found: ID does not exist" containerID="2dea0ae94ea0def3f88197b36ec9f2f15750ea1a3dac4ecfaba57a973830e745" Jan 20 18:33:44 crc kubenswrapper[4773]: I0120 18:33:44.998423 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2dea0ae94ea0def3f88197b36ec9f2f15750ea1a3dac4ecfaba57a973830e745"} err="failed to get container status \"2dea0ae94ea0def3f88197b36ec9f2f15750ea1a3dac4ecfaba57a973830e745\": rpc 
error: code = NotFound desc = could not find container \"2dea0ae94ea0def3f88197b36ec9f2f15750ea1a3dac4ecfaba57a973830e745\": container with ID starting with 2dea0ae94ea0def3f88197b36ec9f2f15750ea1a3dac4ecfaba57a973830e745 not found: ID does not exist" Jan 20 18:33:44 crc kubenswrapper[4773]: I0120 18:33:44.998469 4773 scope.go:117] "RemoveContainer" containerID="2224e4d25524fddc1486e82299478cff3e69b71640d782d27370469785e93088" Jan 20 18:33:44 crc kubenswrapper[4773]: E0120 18:33:44.998758 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2224e4d25524fddc1486e82299478cff3e69b71640d782d27370469785e93088\": container with ID starting with 2224e4d25524fddc1486e82299478cff3e69b71640d782d27370469785e93088 not found: ID does not exist" containerID="2224e4d25524fddc1486e82299478cff3e69b71640d782d27370469785e93088" Jan 20 18:33:44 crc kubenswrapper[4773]: I0120 18:33:44.998784 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2224e4d25524fddc1486e82299478cff3e69b71640d782d27370469785e93088"} err="failed to get container status \"2224e4d25524fddc1486e82299478cff3e69b71640d782d27370469785e93088\": rpc error: code = NotFound desc = could not find container \"2224e4d25524fddc1486e82299478cff3e69b71640d782d27370469785e93088\": container with ID starting with 2224e4d25524fddc1486e82299478cff3e69b71640d782d27370469785e93088 not found: ID does not exist" Jan 20 18:33:44 crc kubenswrapper[4773]: I0120 18:33:44.998799 4773 scope.go:117] "RemoveContainer" containerID="7b03e6ef28eada3e366c7e455756b7d0d82d0920656be25f02d2e3fd00ca9cd3" Jan 20 18:33:44 crc kubenswrapper[4773]: E0120 18:33:44.999037 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b03e6ef28eada3e366c7e455756b7d0d82d0920656be25f02d2e3fd00ca9cd3\": container with ID starting with 
7b03e6ef28eada3e366c7e455756b7d0d82d0920656be25f02d2e3fd00ca9cd3 not found: ID does not exist" containerID="7b03e6ef28eada3e366c7e455756b7d0d82d0920656be25f02d2e3fd00ca9cd3" Jan 20 18:33:44 crc kubenswrapper[4773]: I0120 18:33:44.999067 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b03e6ef28eada3e366c7e455756b7d0d82d0920656be25f02d2e3fd00ca9cd3"} err="failed to get container status \"7b03e6ef28eada3e366c7e455756b7d0d82d0920656be25f02d2e3fd00ca9cd3\": rpc error: code = NotFound desc = could not find container \"7b03e6ef28eada3e366c7e455756b7d0d82d0920656be25f02d2e3fd00ca9cd3\": container with ID starting with 7b03e6ef28eada3e366c7e455756b7d0d82d0920656be25f02d2e3fd00ca9cd3 not found: ID does not exist" Jan 20 18:33:45 crc kubenswrapper[4773]: I0120 18:33:45.002694 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c19dbd84-8fec-4998-b2ae-65c68dee6b17-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 20 18:33:45 crc kubenswrapper[4773]: I0120 18:33:45.456242 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c19dbd84-8fec-4998-b2ae-65c68dee6b17" path="/var/lib/kubelet/pods/c19dbd84-8fec-4998-b2ae-65c68dee6b17/volumes" Jan 20 18:33:45 crc kubenswrapper[4773]: I0120 18:33:45.457788 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf25ec9b-96c5-4129-958f-35acbc34a20d" path="/var/lib/kubelet/pods/cf25ec9b-96c5-4129-958f-35acbc34a20d/volumes" Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.610075 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-748578cd96-z5t98"] Jan 20 18:33:53 crc kubenswrapper[4773]: E0120 18:33:53.610745 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c19dbd84-8fec-4998-b2ae-65c68dee6b17" containerName="registry-server" Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.610757 4773 
state_mem.go:107] "Deleted CPUSet assignment" podUID="c19dbd84-8fec-4998-b2ae-65c68dee6b17" containerName="registry-server" Jan 20 18:33:53 crc kubenswrapper[4773]: E0120 18:33:53.610768 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf25ec9b-96c5-4129-958f-35acbc34a20d" containerName="oauth-openshift" Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.610774 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf25ec9b-96c5-4129-958f-35acbc34a20d" containerName="oauth-openshift" Jan 20 18:33:53 crc kubenswrapper[4773]: E0120 18:33:53.610788 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6759422-151d-4228-b7c7-848c3008fb52" containerName="registry-server" Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.610794 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6759422-151d-4228-b7c7-848c3008fb52" containerName="registry-server" Jan 20 18:33:53 crc kubenswrapper[4773]: E0120 18:33:53.610804 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86872811-c0ef-45cc-949a-f88b07fca9b3" containerName="extract-content" Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.610809 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="86872811-c0ef-45cc-949a-f88b07fca9b3" containerName="extract-content" Jan 20 18:33:53 crc kubenswrapper[4773]: E0120 18:33:53.610818 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c19dbd84-8fec-4998-b2ae-65c68dee6b17" containerName="extract-utilities" Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.610823 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="c19dbd84-8fec-4998-b2ae-65c68dee6b17" containerName="extract-utilities" Jan 20 18:33:53 crc kubenswrapper[4773]: E0120 18:33:53.610831 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c19dbd84-8fec-4998-b2ae-65c68dee6b17" containerName="extract-content" Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.610836 4773 
state_mem.go:107] "Deleted CPUSet assignment" podUID="c19dbd84-8fec-4998-b2ae-65c68dee6b17" containerName="extract-content" Jan 20 18:33:53 crc kubenswrapper[4773]: E0120 18:33:53.610842 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6759422-151d-4228-b7c7-848c3008fb52" containerName="extract-content" Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.610849 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6759422-151d-4228-b7c7-848c3008fb52" containerName="extract-content" Jan 20 18:33:53 crc kubenswrapper[4773]: E0120 18:33:53.610858 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86872811-c0ef-45cc-949a-f88b07fca9b3" containerName="extract-utilities" Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.610864 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="86872811-c0ef-45cc-949a-f88b07fca9b3" containerName="extract-utilities" Jan 20 18:33:53 crc kubenswrapper[4773]: E0120 18:33:53.610873 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a6759422-151d-4228-b7c7-848c3008fb52" containerName="extract-utilities" Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.610878 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="a6759422-151d-4228-b7c7-848c3008fb52" containerName="extract-utilities" Jan 20 18:33:53 crc kubenswrapper[4773]: E0120 18:33:53.610887 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86872811-c0ef-45cc-949a-f88b07fca9b3" containerName="registry-server" Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.610892 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="86872811-c0ef-45cc-949a-f88b07fca9b3" containerName="registry-server" Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.610998 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="86872811-c0ef-45cc-949a-f88b07fca9b3" containerName="registry-server" Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.611018 4773 
memory_manager.go:354] "RemoveStaleState removing state" podUID="cf25ec9b-96c5-4129-958f-35acbc34a20d" containerName="oauth-openshift" Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.611027 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="a6759422-151d-4228-b7c7-848c3008fb52" containerName="registry-server" Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.611037 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="c19dbd84-8fec-4998-b2ae-65c68dee6b17" containerName="registry-server" Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.611525 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-748578cd96-z5t98" Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.614341 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.614405 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.614355 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.614844 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.615047 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.615084 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.615232 4773 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.615438 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.615613 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc"
Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.616481 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.616731 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.619356 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.627147 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-748578cd96-z5t98"]
Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.639842 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.639869 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.648516 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.714428 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/dcdf6037-c51f-4824-8591-fd1c8d53f086-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-748578cd96-z5t98\" (UID: \"dcdf6037-c51f-4824-8591-fd1c8d53f086\") " pod="openshift-authentication/oauth-openshift-748578cd96-z5t98"
Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.714483 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/dcdf6037-c51f-4824-8591-fd1c8d53f086-v4-0-config-system-service-ca\") pod \"oauth-openshift-748578cd96-z5t98\" (UID: \"dcdf6037-c51f-4824-8591-fd1c8d53f086\") " pod="openshift-authentication/oauth-openshift-748578cd96-z5t98"
Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.714521 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/dcdf6037-c51f-4824-8591-fd1c8d53f086-audit-dir\") pod \"oauth-openshift-748578cd96-z5t98\" (UID: \"dcdf6037-c51f-4824-8591-fd1c8d53f086\") " pod="openshift-authentication/oauth-openshift-748578cd96-z5t98"
Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.714544 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/dcdf6037-c51f-4824-8591-fd1c8d53f086-audit-policies\") pod \"oauth-openshift-748578cd96-z5t98\" (UID: \"dcdf6037-c51f-4824-8591-fd1c8d53f086\") " pod="openshift-authentication/oauth-openshift-748578cd96-z5t98"
Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.714652 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhqx6\" (UniqueName: \"kubernetes.io/projected/dcdf6037-c51f-4824-8591-fd1c8d53f086-kube-api-access-fhqx6\") pod \"oauth-openshift-748578cd96-z5t98\" (UID: \"dcdf6037-c51f-4824-8591-fd1c8d53f086\") " pod="openshift-authentication/oauth-openshift-748578cd96-z5t98"
Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.714719 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/dcdf6037-c51f-4824-8591-fd1c8d53f086-v4-0-config-system-cliconfig\") pod \"oauth-openshift-748578cd96-z5t98\" (UID: \"dcdf6037-c51f-4824-8591-fd1c8d53f086\") " pod="openshift-authentication/oauth-openshift-748578cd96-z5t98"
Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.714766 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/dcdf6037-c51f-4824-8591-fd1c8d53f086-v4-0-config-user-template-error\") pod \"oauth-openshift-748578cd96-z5t98\" (UID: \"dcdf6037-c51f-4824-8591-fd1c8d53f086\") " pod="openshift-authentication/oauth-openshift-748578cd96-z5t98"
Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.714798 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/dcdf6037-c51f-4824-8591-fd1c8d53f086-v4-0-config-user-template-login\") pod \"oauth-openshift-748578cd96-z5t98\" (UID: \"dcdf6037-c51f-4824-8591-fd1c8d53f086\") " pod="openshift-authentication/oauth-openshift-748578cd96-z5t98"
Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.714825 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/dcdf6037-c51f-4824-8591-fd1c8d53f086-v4-0-config-system-serving-cert\") pod \"oauth-openshift-748578cd96-z5t98\" (UID: \"dcdf6037-c51f-4824-8591-fd1c8d53f086\") " pod="openshift-authentication/oauth-openshift-748578cd96-z5t98"
Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.714851 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/dcdf6037-c51f-4824-8591-fd1c8d53f086-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-748578cd96-z5t98\" (UID: \"dcdf6037-c51f-4824-8591-fd1c8d53f086\") " pod="openshift-authentication/oauth-openshift-748578cd96-z5t98"
Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.714886 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/dcdf6037-c51f-4824-8591-fd1c8d53f086-v4-0-config-system-router-certs\") pod \"oauth-openshift-748578cd96-z5t98\" (UID: \"dcdf6037-c51f-4824-8591-fd1c8d53f086\") " pod="openshift-authentication/oauth-openshift-748578cd96-z5t98"
Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.714971 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/dcdf6037-c51f-4824-8591-fd1c8d53f086-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-748578cd96-z5t98\" (UID: \"dcdf6037-c51f-4824-8591-fd1c8d53f086\") " pod="openshift-authentication/oauth-openshift-748578cd96-z5t98"
Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.715029 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/dcdf6037-c51f-4824-8591-fd1c8d53f086-v4-0-config-system-session\") pod \"oauth-openshift-748578cd96-z5t98\" (UID: \"dcdf6037-c51f-4824-8591-fd1c8d53f086\") " pod="openshift-authentication/oauth-openshift-748578cd96-z5t98"
Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.715066 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dcdf6037-c51f-4824-8591-fd1c8d53f086-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-748578cd96-z5t98\" (UID: \"dcdf6037-c51f-4824-8591-fd1c8d53f086\") " pod="openshift-authentication/oauth-openshift-748578cd96-z5t98"
Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.802438 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6549f94c47-dcb2l"]
Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.802674 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6549f94c47-dcb2l" podUID="12c12fbe-31ec-4b46-ace6-ac3451850070" containerName="controller-manager" containerID="cri-o://cbbb8755d258b67f8bd177e191acc5b63c630dfd818ae558026c74b2112af5e9" gracePeriod=30
Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.816160 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhqx6\" (UniqueName: \"kubernetes.io/projected/dcdf6037-c51f-4824-8591-fd1c8d53f086-kube-api-access-fhqx6\") pod \"oauth-openshift-748578cd96-z5t98\" (UID: \"dcdf6037-c51f-4824-8591-fd1c8d53f086\") " pod="openshift-authentication/oauth-openshift-748578cd96-z5t98"
Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.816221 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/dcdf6037-c51f-4824-8591-fd1c8d53f086-v4-0-config-system-cliconfig\") pod \"oauth-openshift-748578cd96-z5t98\" (UID: \"dcdf6037-c51f-4824-8591-fd1c8d53f086\") " pod="openshift-authentication/oauth-openshift-748578cd96-z5t98"
Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.816248 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/dcdf6037-c51f-4824-8591-fd1c8d53f086-v4-0-config-user-template-error\") pod \"oauth-openshift-748578cd96-z5t98\" (UID: \"dcdf6037-c51f-4824-8591-fd1c8d53f086\") " pod="openshift-authentication/oauth-openshift-748578cd96-z5t98"
Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.816274 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/dcdf6037-c51f-4824-8591-fd1c8d53f086-v4-0-config-user-template-login\") pod \"oauth-openshift-748578cd96-z5t98\" (UID: \"dcdf6037-c51f-4824-8591-fd1c8d53f086\") " pod="openshift-authentication/oauth-openshift-748578cd96-z5t98"
Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.816304 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/dcdf6037-c51f-4824-8591-fd1c8d53f086-v4-0-config-system-serving-cert\") pod \"oauth-openshift-748578cd96-z5t98\" (UID: \"dcdf6037-c51f-4824-8591-fd1c8d53f086\") " pod="openshift-authentication/oauth-openshift-748578cd96-z5t98"
Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.816331 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/dcdf6037-c51f-4824-8591-fd1c8d53f086-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-748578cd96-z5t98\" (UID: \"dcdf6037-c51f-4824-8591-fd1c8d53f086\") " pod="openshift-authentication/oauth-openshift-748578cd96-z5t98"
Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.816360 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/dcdf6037-c51f-4824-8591-fd1c8d53f086-v4-0-config-system-router-certs\") pod \"oauth-openshift-748578cd96-z5t98\" (UID: \"dcdf6037-c51f-4824-8591-fd1c8d53f086\") " pod="openshift-authentication/oauth-openshift-748578cd96-z5t98"
Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.816383 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/dcdf6037-c51f-4824-8591-fd1c8d53f086-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-748578cd96-z5t98\" (UID: \"dcdf6037-c51f-4824-8591-fd1c8d53f086\") " pod="openshift-authentication/oauth-openshift-748578cd96-z5t98"
Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.816406 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/dcdf6037-c51f-4824-8591-fd1c8d53f086-v4-0-config-system-session\") pod \"oauth-openshift-748578cd96-z5t98\" (UID: \"dcdf6037-c51f-4824-8591-fd1c8d53f086\") " pod="openshift-authentication/oauth-openshift-748578cd96-z5t98"
Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.816428 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dcdf6037-c51f-4824-8591-fd1c8d53f086-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-748578cd96-z5t98\" (UID: \"dcdf6037-c51f-4824-8591-fd1c8d53f086\") " pod="openshift-authentication/oauth-openshift-748578cd96-z5t98"
Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.816458 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/dcdf6037-c51f-4824-8591-fd1c8d53f086-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-748578cd96-z5t98\" (UID: \"dcdf6037-c51f-4824-8591-fd1c8d53f086\") " pod="openshift-authentication/oauth-openshift-748578cd96-z5t98"
Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.816483 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/dcdf6037-c51f-4824-8591-fd1c8d53f086-v4-0-config-system-service-ca\") pod \"oauth-openshift-748578cd96-z5t98\" (UID: \"dcdf6037-c51f-4824-8591-fd1c8d53f086\") " pod="openshift-authentication/oauth-openshift-748578cd96-z5t98"
Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.816516 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/dcdf6037-c51f-4824-8591-fd1c8d53f086-audit-dir\") pod \"oauth-openshift-748578cd96-z5t98\" (UID: \"dcdf6037-c51f-4824-8591-fd1c8d53f086\") " pod="openshift-authentication/oauth-openshift-748578cd96-z5t98"
Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.816539 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/dcdf6037-c51f-4824-8591-fd1c8d53f086-audit-policies\") pod \"oauth-openshift-748578cd96-z5t98\" (UID: \"dcdf6037-c51f-4824-8591-fd1c8d53f086\") " pod="openshift-authentication/oauth-openshift-748578cd96-z5t98"
Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.817306 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/dcdf6037-c51f-4824-8591-fd1c8d53f086-audit-policies\") pod \"oauth-openshift-748578cd96-z5t98\" (UID: \"dcdf6037-c51f-4824-8591-fd1c8d53f086\") " pod="openshift-authentication/oauth-openshift-748578cd96-z5t98"
Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.817532 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/dcdf6037-c51f-4824-8591-fd1c8d53f086-audit-dir\") pod \"oauth-openshift-748578cd96-z5t98\" (UID: \"dcdf6037-c51f-4824-8591-fd1c8d53f086\") " pod="openshift-authentication/oauth-openshift-748578cd96-z5t98"
Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.817715 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/dcdf6037-c51f-4824-8591-fd1c8d53f086-v4-0-config-system-cliconfig\") pod \"oauth-openshift-748578cd96-z5t98\" (UID: \"dcdf6037-c51f-4824-8591-fd1c8d53f086\") " pod="openshift-authentication/oauth-openshift-748578cd96-z5t98"
Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.818200 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/dcdf6037-c51f-4824-8591-fd1c8d53f086-v4-0-config-system-service-ca\") pod \"oauth-openshift-748578cd96-z5t98\" (UID: \"dcdf6037-c51f-4824-8591-fd1c8d53f086\") " pod="openshift-authentication/oauth-openshift-748578cd96-z5t98"
Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.818727 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dcdf6037-c51f-4824-8591-fd1c8d53f086-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-748578cd96-z5t98\" (UID: \"dcdf6037-c51f-4824-8591-fd1c8d53f086\") " pod="openshift-authentication/oauth-openshift-748578cd96-z5t98"
Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.821727 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/dcdf6037-c51f-4824-8591-fd1c8d53f086-v4-0-config-user-template-error\") pod \"oauth-openshift-748578cd96-z5t98\" (UID: \"dcdf6037-c51f-4824-8591-fd1c8d53f086\") " pod="openshift-authentication/oauth-openshift-748578cd96-z5t98"
Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.822175 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/dcdf6037-c51f-4824-8591-fd1c8d53f086-v4-0-config-system-router-certs\") pod \"oauth-openshift-748578cd96-z5t98\" (UID: \"dcdf6037-c51f-4824-8591-fd1c8d53f086\") " pod="openshift-authentication/oauth-openshift-748578cd96-z5t98"
Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.825053 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/dcdf6037-c51f-4824-8591-fd1c8d53f086-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-748578cd96-z5t98\" (UID: \"dcdf6037-c51f-4824-8591-fd1c8d53f086\") " pod="openshift-authentication/oauth-openshift-748578cd96-z5t98"
Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.827695 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/dcdf6037-c51f-4824-8591-fd1c8d53f086-v4-0-config-system-session\") pod \"oauth-openshift-748578cd96-z5t98\" (UID: \"dcdf6037-c51f-4824-8591-fd1c8d53f086\") " pod="openshift-authentication/oauth-openshift-748578cd96-z5t98"
Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.828120 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/dcdf6037-c51f-4824-8591-fd1c8d53f086-v4-0-config-user-template-login\") pod \"oauth-openshift-748578cd96-z5t98\" (UID: \"dcdf6037-c51f-4824-8591-fd1c8d53f086\") " pod="openshift-authentication/oauth-openshift-748578cd96-z5t98"
Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.835363 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/dcdf6037-c51f-4824-8591-fd1c8d53f086-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-748578cd96-z5t98\" (UID: \"dcdf6037-c51f-4824-8591-fd1c8d53f086\") " pod="openshift-authentication/oauth-openshift-748578cd96-z5t98"
Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.835576 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/dcdf6037-c51f-4824-8591-fd1c8d53f086-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-748578cd96-z5t98\" (UID: \"dcdf6037-c51f-4824-8591-fd1c8d53f086\") " pod="openshift-authentication/oauth-openshift-748578cd96-z5t98"
Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.841403 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/dcdf6037-c51f-4824-8591-fd1c8d53f086-v4-0-config-system-serving-cert\") pod \"oauth-openshift-748578cd96-z5t98\" (UID: \"dcdf6037-c51f-4824-8591-fd1c8d53f086\") " pod="openshift-authentication/oauth-openshift-748578cd96-z5t98"
Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.851258 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhqx6\" (UniqueName: \"kubernetes.io/projected/dcdf6037-c51f-4824-8591-fd1c8d53f086-kube-api-access-fhqx6\") pod \"oauth-openshift-748578cd96-z5t98\" (UID: \"dcdf6037-c51f-4824-8591-fd1c8d53f086\") " pod="openshift-authentication/oauth-openshift-748578cd96-z5t98"
Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.906792 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-9cd584848-gxtc2"]
Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.907014 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-9cd584848-gxtc2" podUID="d65ab1d5-14b1-4f38-a627-ca6f00bb0b44" containerName="route-controller-manager" containerID="cri-o://6f53468db68f0afdfbd68160f14dc3314340b8339f575bbfbcd97e069ad06660" gracePeriod=30
Jan 20 18:33:53 crc kubenswrapper[4773]: I0120 18:33:53.930370 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-748578cd96-z5t98"
Jan 20 18:33:54 crc kubenswrapper[4773]: I0120 18:33:54.426786 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-748578cd96-z5t98"]
Jan 20 18:33:54 crc kubenswrapper[4773]: I0120 18:33:54.790068 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-9cd584848-gxtc2"
Jan 20 18:33:54 crc kubenswrapper[4773]: I0120 18:33:54.820147 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6549f94c47-dcb2l"
Jan 20 18:33:54 crc kubenswrapper[4773]: I0120 18:33:54.930747 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h9wrq\" (UniqueName: \"kubernetes.io/projected/12c12fbe-31ec-4b46-ace6-ac3451850070-kube-api-access-h9wrq\") pod \"12c12fbe-31ec-4b46-ace6-ac3451850070\" (UID: \"12c12fbe-31ec-4b46-ace6-ac3451850070\") "
Jan 20 18:33:54 crc kubenswrapper[4773]: I0120 18:33:54.930809 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12c12fbe-31ec-4b46-ace6-ac3451850070-config\") pod \"12c12fbe-31ec-4b46-ace6-ac3451850070\" (UID: \"12c12fbe-31ec-4b46-ace6-ac3451850070\") "
Jan 20 18:33:54 crc kubenswrapper[4773]: I0120 18:33:54.930841 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ckld2\" (UniqueName: \"kubernetes.io/projected/d65ab1d5-14b1-4f38-a627-ca6f00bb0b44-kube-api-access-ckld2\") pod \"d65ab1d5-14b1-4f38-a627-ca6f00bb0b44\" (UID: \"d65ab1d5-14b1-4f38-a627-ca6f00bb0b44\") "
Jan 20 18:33:54 crc kubenswrapper[4773]: I0120 18:33:54.930862 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/12c12fbe-31ec-4b46-ace6-ac3451850070-proxy-ca-bundles\") pod \"12c12fbe-31ec-4b46-ace6-ac3451850070\" (UID: \"12c12fbe-31ec-4b46-ace6-ac3451850070\") "
Jan 20 18:33:54 crc kubenswrapper[4773]: I0120 18:33:54.930896 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/12c12fbe-31ec-4b46-ace6-ac3451850070-serving-cert\") pod \"12c12fbe-31ec-4b46-ace6-ac3451850070\" (UID: \"12c12fbe-31ec-4b46-ace6-ac3451850070\") "
Jan 20 18:33:54 crc kubenswrapper[4773]: I0120 18:33:54.930949 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/12c12fbe-31ec-4b46-ace6-ac3451850070-client-ca\") pod \"12c12fbe-31ec-4b46-ace6-ac3451850070\" (UID: \"12c12fbe-31ec-4b46-ace6-ac3451850070\") "
Jan 20 18:33:54 crc kubenswrapper[4773]: I0120 18:33:54.931013 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d65ab1d5-14b1-4f38-a627-ca6f00bb0b44-config\") pod \"d65ab1d5-14b1-4f38-a627-ca6f00bb0b44\" (UID: \"d65ab1d5-14b1-4f38-a627-ca6f00bb0b44\") "
Jan 20 18:33:54 crc kubenswrapper[4773]: I0120 18:33:54.931044 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d65ab1d5-14b1-4f38-a627-ca6f00bb0b44-client-ca\") pod \"d65ab1d5-14b1-4f38-a627-ca6f00bb0b44\" (UID: \"d65ab1d5-14b1-4f38-a627-ca6f00bb0b44\") "
Jan 20 18:33:54 crc kubenswrapper[4773]: I0120 18:33:54.931078 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d65ab1d5-14b1-4f38-a627-ca6f00bb0b44-serving-cert\") pod \"d65ab1d5-14b1-4f38-a627-ca6f00bb0b44\" (UID: \"d65ab1d5-14b1-4f38-a627-ca6f00bb0b44\") "
Jan 20 18:33:54 crc kubenswrapper[4773]: I0120 18:33:54.933096 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12c12fbe-31ec-4b46-ace6-ac3451850070-client-ca" (OuterVolumeSpecName: "client-ca") pod "12c12fbe-31ec-4b46-ace6-ac3451850070" (UID: "12c12fbe-31ec-4b46-ace6-ac3451850070"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 20 18:33:54 crc kubenswrapper[4773]: I0120 18:33:54.933207 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12c12fbe-31ec-4b46-ace6-ac3451850070-config" (OuterVolumeSpecName: "config") pod "12c12fbe-31ec-4b46-ace6-ac3451850070" (UID: "12c12fbe-31ec-4b46-ace6-ac3451850070"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 20 18:33:54 crc kubenswrapper[4773]: I0120 18:33:54.933467 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d65ab1d5-14b1-4f38-a627-ca6f00bb0b44-client-ca" (OuterVolumeSpecName: "client-ca") pod "d65ab1d5-14b1-4f38-a627-ca6f00bb0b44" (UID: "d65ab1d5-14b1-4f38-a627-ca6f00bb0b44"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 20 18:33:54 crc kubenswrapper[4773]: I0120 18:33:54.933513 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d65ab1d5-14b1-4f38-a627-ca6f00bb0b44-config" (OuterVolumeSpecName: "config") pod "d65ab1d5-14b1-4f38-a627-ca6f00bb0b44" (UID: "d65ab1d5-14b1-4f38-a627-ca6f00bb0b44"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 20 18:33:54 crc kubenswrapper[4773]: I0120 18:33:54.933601 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12c12fbe-31ec-4b46-ace6-ac3451850070-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "12c12fbe-31ec-4b46-ace6-ac3451850070" (UID: "12c12fbe-31ec-4b46-ace6-ac3451850070"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 20 18:33:54 crc kubenswrapper[4773]: I0120 18:33:54.936896 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d65ab1d5-14b1-4f38-a627-ca6f00bb0b44-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d65ab1d5-14b1-4f38-a627-ca6f00bb0b44" (UID: "d65ab1d5-14b1-4f38-a627-ca6f00bb0b44"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 20 18:33:54 crc kubenswrapper[4773]: I0120 18:33:54.937039 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12c12fbe-31ec-4b46-ace6-ac3451850070-kube-api-access-h9wrq" (OuterVolumeSpecName: "kube-api-access-h9wrq") pod "12c12fbe-31ec-4b46-ace6-ac3451850070" (UID: "12c12fbe-31ec-4b46-ace6-ac3451850070"). InnerVolumeSpecName "kube-api-access-h9wrq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 20 18:33:54 crc kubenswrapper[4773]: I0120 18:33:54.939616 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12c12fbe-31ec-4b46-ace6-ac3451850070-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "12c12fbe-31ec-4b46-ace6-ac3451850070" (UID: "12c12fbe-31ec-4b46-ace6-ac3451850070"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 20 18:33:54 crc kubenswrapper[4773]: I0120 18:33:54.944062 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d65ab1d5-14b1-4f38-a627-ca6f00bb0b44-kube-api-access-ckld2" (OuterVolumeSpecName: "kube-api-access-ckld2") pod "d65ab1d5-14b1-4f38-a627-ca6f00bb0b44" (UID: "d65ab1d5-14b1-4f38-a627-ca6f00bb0b44"). InnerVolumeSpecName "kube-api-access-ckld2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 20 18:33:54 crc kubenswrapper[4773]: I0120 18:33:54.986040 4773 generic.go:334] "Generic (PLEG): container finished" podID="12c12fbe-31ec-4b46-ace6-ac3451850070" containerID="cbbb8755d258b67f8bd177e191acc5b63c630dfd818ae558026c74b2112af5e9" exitCode=0
Jan 20 18:33:54 crc kubenswrapper[4773]: I0120 18:33:54.986423 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6549f94c47-dcb2l" event={"ID":"12c12fbe-31ec-4b46-ace6-ac3451850070","Type":"ContainerDied","Data":"cbbb8755d258b67f8bd177e191acc5b63c630dfd818ae558026c74b2112af5e9"}
Jan 20 18:33:54 crc kubenswrapper[4773]: I0120 18:33:54.986449 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6549f94c47-dcb2l" event={"ID":"12c12fbe-31ec-4b46-ace6-ac3451850070","Type":"ContainerDied","Data":"0d0c05a7620d0b2d084778129f9283d19b008663b280d13e07e0f66abacb3b84"}
Jan 20 18:33:54 crc kubenswrapper[4773]: I0120 18:33:54.986467 4773 scope.go:117] "RemoveContainer" containerID="cbbb8755d258b67f8bd177e191acc5b63c630dfd818ae558026c74b2112af5e9"
Jan 20 18:33:54 crc kubenswrapper[4773]: I0120 18:33:54.986560 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6549f94c47-dcb2l"
Jan 20 18:33:54 crc kubenswrapper[4773]: I0120 18:33:54.989851 4773 generic.go:334] "Generic (PLEG): container finished" podID="d65ab1d5-14b1-4f38-a627-ca6f00bb0b44" containerID="6f53468db68f0afdfbd68160f14dc3314340b8339f575bbfbcd97e069ad06660" exitCode=0
Jan 20 18:33:54 crc kubenswrapper[4773]: I0120 18:33:54.989961 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-9cd584848-gxtc2"
Jan 20 18:33:54 crc kubenswrapper[4773]: I0120 18:33:54.989921 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-9cd584848-gxtc2" event={"ID":"d65ab1d5-14b1-4f38-a627-ca6f00bb0b44","Type":"ContainerDied","Data":"6f53468db68f0afdfbd68160f14dc3314340b8339f575bbfbcd97e069ad06660"}
Jan 20 18:33:54 crc kubenswrapper[4773]: I0120 18:33:54.990091 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-9cd584848-gxtc2" event={"ID":"d65ab1d5-14b1-4f38-a627-ca6f00bb0b44","Type":"ContainerDied","Data":"c1527878b5d63bc877a41560d037696931df58e57b3c0379b8a0a0fa203a1634"}
Jan 20 18:33:54 crc kubenswrapper[4773]: I0120 18:33:54.991321 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-748578cd96-z5t98" event={"ID":"dcdf6037-c51f-4824-8591-fd1c8d53f086","Type":"ContainerStarted","Data":"1d7df637dc9d820da1a692567e2778f8e8028898d08aa8c07ada286db74871dc"}
Jan 20 18:33:54 crc kubenswrapper[4773]: I0120 18:33:54.991349 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-748578cd96-z5t98" event={"ID":"dcdf6037-c51f-4824-8591-fd1c8d53f086","Type":"ContainerStarted","Data":"99eb80dcf4e4a84dbd0b1fb412df39cdd2792089b7a0eddb39eb6f92fe6882e7"}
Jan 20 18:33:54 crc kubenswrapper[4773]: I0120 18:33:54.992007 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-748578cd96-z5t98"
Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.005665 4773 scope.go:117] "RemoveContainer" containerID="cbbb8755d258b67f8bd177e191acc5b63c630dfd818ae558026c74b2112af5e9"
Jan 20 18:33:55 crc kubenswrapper[4773]: E0120 18:33:55.006969 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cbbb8755d258b67f8bd177e191acc5b63c630dfd818ae558026c74b2112af5e9\": container with ID starting with cbbb8755d258b67f8bd177e191acc5b63c630dfd818ae558026c74b2112af5e9 not found: ID does not exist" containerID="cbbb8755d258b67f8bd177e191acc5b63c630dfd818ae558026c74b2112af5e9"
Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.007036 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cbbb8755d258b67f8bd177e191acc5b63c630dfd818ae558026c74b2112af5e9"} err="failed to get container status \"cbbb8755d258b67f8bd177e191acc5b63c630dfd818ae558026c74b2112af5e9\": rpc error: code = NotFound desc = could not find container \"cbbb8755d258b67f8bd177e191acc5b63c630dfd818ae558026c74b2112af5e9\": container with ID starting with cbbb8755d258b67f8bd177e191acc5b63c630dfd818ae558026c74b2112af5e9 not found: ID does not exist"
Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.007066 4773 scope.go:117] "RemoveContainer" containerID="6f53468db68f0afdfbd68160f14dc3314340b8339f575bbfbcd97e069ad06660"
Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.014104 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-748578cd96-z5t98" podStartSLOduration=38.014088259 podStartE2EDuration="38.014088259s" podCreationTimestamp="2026-01-20 18:33:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:33:55.01179781 +0000 UTC m=+227.933610844" watchObservedRunningTime="2026-01-20 18:33:55.014088259 +0000 UTC m=+227.935901283"
Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.026855 4773 scope.go:117] "RemoveContainer" containerID="6f53468db68f0afdfbd68160f14dc3314340b8339f575bbfbcd97e069ad06660"
Jan 20 18:33:55 crc kubenswrapper[4773]: E0120 18:33:55.027293 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f53468db68f0afdfbd68160f14dc3314340b8339f575bbfbcd97e069ad06660\": container with ID starting with 6f53468db68f0afdfbd68160f14dc3314340b8339f575bbfbcd97e069ad06660 not found: ID does not exist" containerID="6f53468db68f0afdfbd68160f14dc3314340b8339f575bbfbcd97e069ad06660"
Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.027331 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f53468db68f0afdfbd68160f14dc3314340b8339f575bbfbcd97e069ad06660"} err="failed to get container status \"6f53468db68f0afdfbd68160f14dc3314340b8339f575bbfbcd97e069ad06660\": rpc error: code = NotFound desc = could not find container \"6f53468db68f0afdfbd68160f14dc3314340b8339f575bbfbcd97e069ad06660\": container with ID starting with 6f53468db68f0afdfbd68160f14dc3314340b8339f575bbfbcd97e069ad06660 not found: ID does not exist"
Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.033081 4773 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d65ab1d5-14b1-4f38-a627-ca6f00bb0b44-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.033115 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h9wrq\" (UniqueName: \"kubernetes.io/projected/12c12fbe-31ec-4b46-ace6-ac3451850070-kube-api-access-h9wrq\") on node \"crc\" DevicePath \"\""
Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.033127 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/12c12fbe-31ec-4b46-ace6-ac3451850070-config\") on node \"crc\" DevicePath \"\""
Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.033136 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ckld2\" (UniqueName: \"kubernetes.io/projected/d65ab1d5-14b1-4f38-a627-ca6f00bb0b44-kube-api-access-ckld2\") on node \"crc\" DevicePath \"\""
Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.033145 4773 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/12c12fbe-31ec-4b46-ace6-ac3451850070-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.033155 4773 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/12c12fbe-31ec-4b46-ace6-ac3451850070-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.033164 4773 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/12c12fbe-31ec-4b46-ace6-ac3451850070-client-ca\") on node \"crc\" DevicePath \"\""
Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.033174 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d65ab1d5-14b1-4f38-a627-ca6f00bb0b44-config\") on node \"crc\" DevicePath \"\""
Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.033183 4773 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d65ab1d5-14b1-4f38-a627-ca6f00bb0b44-client-ca\") on node \"crc\" DevicePath \"\""
Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.036449 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-9cd584848-gxtc2"]
Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.040584 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-9cd584848-gxtc2"]
Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.043488 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6549f94c47-dcb2l"]
Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.045701 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6549f94c47-dcb2l"]
Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.090807 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-748578cd96-z5t98"
Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.454264 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12c12fbe-31ec-4b46-ace6-ac3451850070" path="/var/lib/kubelet/pods/12c12fbe-31ec-4b46-ace6-ac3451850070/volumes"
Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.454895 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d65ab1d5-14b1-4f38-a627-ca6f00bb0b44" path="/var/lib/kubelet/pods/d65ab1d5-14b1-4f38-a627-ca6f00bb0b44/volumes"
Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.614531 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-f7578c85f-8k8g4"]
Jan 20 18:33:55 crc kubenswrapper[4773]: E0120 18:33:55.614725 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d65ab1d5-14b1-4f38-a627-ca6f00bb0b44" containerName="route-controller-manager"
Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.614736 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="d65ab1d5-14b1-4f38-a627-ca6f00bb0b44" containerName="route-controller-manager"
Jan 20 18:33:55 crc kubenswrapper[4773]: E0120 18:33:55.614754 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="12c12fbe-31ec-4b46-ace6-ac3451850070" containerName="controller-manager"
Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.614759 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="12c12fbe-31ec-4b46-ace6-ac3451850070" containerName="controller-manager"
Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.614842 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="12c12fbe-31ec-4b46-ace6-ac3451850070" containerName="controller-manager"
Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.614854 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="d65ab1d5-14b1-4f38-a627-ca6f00bb0b44" containerName="route-controller-manager" Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.615213 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-f7578c85f-8k8g4" Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.622415 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.622469 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.622478 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.623531 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.623646 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.624192 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.627361 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.633830 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7cd86d444d-78rr9"] Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.635013 4773 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7cd86d444d-78rr9" Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.640449 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.640818 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.640822 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.640521 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.641107 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.641248 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.641773 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-f7578c85f-8k8g4"] Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.647987 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7cd86d444d-78rr9"] Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.743162 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/76f300cc-6496-42e7-84ea-d542b110a9a7-client-ca\") pod \"controller-manager-f7578c85f-8k8g4\" (UID: 
\"76f300cc-6496-42e7-84ea-d542b110a9a7\") " pod="openshift-controller-manager/controller-manager-f7578c85f-8k8g4" Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.743226 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/76f300cc-6496-42e7-84ea-d542b110a9a7-serving-cert\") pod \"controller-manager-f7578c85f-8k8g4\" (UID: \"76f300cc-6496-42e7-84ea-d542b110a9a7\") " pod="openshift-controller-manager/controller-manager-f7578c85f-8k8g4" Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.743256 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0d1c7708-bc85-43d7-ab64-9b2b99a43557-serving-cert\") pod \"route-controller-manager-7cd86d444d-78rr9\" (UID: \"0d1c7708-bc85-43d7-ab64-9b2b99a43557\") " pod="openshift-route-controller-manager/route-controller-manager-7cd86d444d-78rr9" Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.743282 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76f300cc-6496-42e7-84ea-d542b110a9a7-config\") pod \"controller-manager-f7578c85f-8k8g4\" (UID: \"76f300cc-6496-42e7-84ea-d542b110a9a7\") " pod="openshift-controller-manager/controller-manager-f7578c85f-8k8g4" Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.743572 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0d1c7708-bc85-43d7-ab64-9b2b99a43557-client-ca\") pod \"route-controller-manager-7cd86d444d-78rr9\" (UID: \"0d1c7708-bc85-43d7-ab64-9b2b99a43557\") " pod="openshift-route-controller-manager/route-controller-manager-7cd86d444d-78rr9" Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.743592 4773 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/76f300cc-6496-42e7-84ea-d542b110a9a7-proxy-ca-bundles\") pod \"controller-manager-f7578c85f-8k8g4\" (UID: \"76f300cc-6496-42e7-84ea-d542b110a9a7\") " pod="openshift-controller-manager/controller-manager-f7578c85f-8k8g4" Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.743632 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d1c7708-bc85-43d7-ab64-9b2b99a43557-config\") pod \"route-controller-manager-7cd86d444d-78rr9\" (UID: \"0d1c7708-bc85-43d7-ab64-9b2b99a43557\") " pod="openshift-route-controller-manager/route-controller-manager-7cd86d444d-78rr9" Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.743681 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjt9d\" (UniqueName: \"kubernetes.io/projected/76f300cc-6496-42e7-84ea-d542b110a9a7-kube-api-access-tjt9d\") pod \"controller-manager-f7578c85f-8k8g4\" (UID: \"76f300cc-6496-42e7-84ea-d542b110a9a7\") " pod="openshift-controller-manager/controller-manager-f7578c85f-8k8g4" Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.743748 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98sgx\" (UniqueName: \"kubernetes.io/projected/0d1c7708-bc85-43d7-ab64-9b2b99a43557-kube-api-access-98sgx\") pod \"route-controller-manager-7cd86d444d-78rr9\" (UID: \"0d1c7708-bc85-43d7-ab64-9b2b99a43557\") " pod="openshift-route-controller-manager/route-controller-manager-7cd86d444d-78rr9" Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.844912 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d1c7708-bc85-43d7-ab64-9b2b99a43557-config\") pod 
\"route-controller-manager-7cd86d444d-78rr9\" (UID: \"0d1c7708-bc85-43d7-ab64-9b2b99a43557\") " pod="openshift-route-controller-manager/route-controller-manager-7cd86d444d-78rr9" Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.844972 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tjt9d\" (UniqueName: \"kubernetes.io/projected/76f300cc-6496-42e7-84ea-d542b110a9a7-kube-api-access-tjt9d\") pod \"controller-manager-f7578c85f-8k8g4\" (UID: \"76f300cc-6496-42e7-84ea-d542b110a9a7\") " pod="openshift-controller-manager/controller-manager-f7578c85f-8k8g4" Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.845001 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98sgx\" (UniqueName: \"kubernetes.io/projected/0d1c7708-bc85-43d7-ab64-9b2b99a43557-kube-api-access-98sgx\") pod \"route-controller-manager-7cd86d444d-78rr9\" (UID: \"0d1c7708-bc85-43d7-ab64-9b2b99a43557\") " pod="openshift-route-controller-manager/route-controller-manager-7cd86d444d-78rr9" Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.845024 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/76f300cc-6496-42e7-84ea-d542b110a9a7-client-ca\") pod \"controller-manager-f7578c85f-8k8g4\" (UID: \"76f300cc-6496-42e7-84ea-d542b110a9a7\") " pod="openshift-controller-manager/controller-manager-f7578c85f-8k8g4" Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.845051 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/76f300cc-6496-42e7-84ea-d542b110a9a7-serving-cert\") pod \"controller-manager-f7578c85f-8k8g4\" (UID: \"76f300cc-6496-42e7-84ea-d542b110a9a7\") " pod="openshift-controller-manager/controller-manager-f7578c85f-8k8g4" Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.845071 4773 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0d1c7708-bc85-43d7-ab64-9b2b99a43557-serving-cert\") pod \"route-controller-manager-7cd86d444d-78rr9\" (UID: \"0d1c7708-bc85-43d7-ab64-9b2b99a43557\") " pod="openshift-route-controller-manager/route-controller-manager-7cd86d444d-78rr9" Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.845094 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76f300cc-6496-42e7-84ea-d542b110a9a7-config\") pod \"controller-manager-f7578c85f-8k8g4\" (UID: \"76f300cc-6496-42e7-84ea-d542b110a9a7\") " pod="openshift-controller-manager/controller-manager-f7578c85f-8k8g4" Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.845115 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0d1c7708-bc85-43d7-ab64-9b2b99a43557-client-ca\") pod \"route-controller-manager-7cd86d444d-78rr9\" (UID: \"0d1c7708-bc85-43d7-ab64-9b2b99a43557\") " pod="openshift-route-controller-manager/route-controller-manager-7cd86d444d-78rr9" Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.845133 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/76f300cc-6496-42e7-84ea-d542b110a9a7-proxy-ca-bundles\") pod \"controller-manager-f7578c85f-8k8g4\" (UID: \"76f300cc-6496-42e7-84ea-d542b110a9a7\") " pod="openshift-controller-manager/controller-manager-f7578c85f-8k8g4" Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.846168 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/76f300cc-6496-42e7-84ea-d542b110a9a7-client-ca\") pod \"controller-manager-f7578c85f-8k8g4\" (UID: \"76f300cc-6496-42e7-84ea-d542b110a9a7\") " pod="openshift-controller-manager/controller-manager-f7578c85f-8k8g4" 
Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.846217 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/0d1c7708-bc85-43d7-ab64-9b2b99a43557-client-ca\") pod \"route-controller-manager-7cd86d444d-78rr9\" (UID: \"0d1c7708-bc85-43d7-ab64-9b2b99a43557\") " pod="openshift-route-controller-manager/route-controller-manager-7cd86d444d-78rr9" Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.846228 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/76f300cc-6496-42e7-84ea-d542b110a9a7-proxy-ca-bundles\") pod \"controller-manager-f7578c85f-8k8g4\" (UID: \"76f300cc-6496-42e7-84ea-d542b110a9a7\") " pod="openshift-controller-manager/controller-manager-f7578c85f-8k8g4" Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.846362 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0d1c7708-bc85-43d7-ab64-9b2b99a43557-config\") pod \"route-controller-manager-7cd86d444d-78rr9\" (UID: \"0d1c7708-bc85-43d7-ab64-9b2b99a43557\") " pod="openshift-route-controller-manager/route-controller-manager-7cd86d444d-78rr9" Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.846565 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/76f300cc-6496-42e7-84ea-d542b110a9a7-config\") pod \"controller-manager-f7578c85f-8k8g4\" (UID: \"76f300cc-6496-42e7-84ea-d542b110a9a7\") " pod="openshift-controller-manager/controller-manager-f7578c85f-8k8g4" Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.849597 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0d1c7708-bc85-43d7-ab64-9b2b99a43557-serving-cert\") pod \"route-controller-manager-7cd86d444d-78rr9\" (UID: \"0d1c7708-bc85-43d7-ab64-9b2b99a43557\") " 
pod="openshift-route-controller-manager/route-controller-manager-7cd86d444d-78rr9" Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.852522 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/76f300cc-6496-42e7-84ea-d542b110a9a7-serving-cert\") pod \"controller-manager-f7578c85f-8k8g4\" (UID: \"76f300cc-6496-42e7-84ea-d542b110a9a7\") " pod="openshift-controller-manager/controller-manager-f7578c85f-8k8g4" Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.865263 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjt9d\" (UniqueName: \"kubernetes.io/projected/76f300cc-6496-42e7-84ea-d542b110a9a7-kube-api-access-tjt9d\") pod \"controller-manager-f7578c85f-8k8g4\" (UID: \"76f300cc-6496-42e7-84ea-d542b110a9a7\") " pod="openshift-controller-manager/controller-manager-f7578c85f-8k8g4" Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.875725 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98sgx\" (UniqueName: \"kubernetes.io/projected/0d1c7708-bc85-43d7-ab64-9b2b99a43557-kube-api-access-98sgx\") pod \"route-controller-manager-7cd86d444d-78rr9\" (UID: \"0d1c7708-bc85-43d7-ab64-9b2b99a43557\") " pod="openshift-route-controller-manager/route-controller-manager-7cd86d444d-78rr9" Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.939604 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-f7578c85f-8k8g4" Jan 20 18:33:55 crc kubenswrapper[4773]: I0120 18:33:55.949744 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7cd86d444d-78rr9" Jan 20 18:33:56 crc kubenswrapper[4773]: I0120 18:33:56.376621 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7cd86d444d-78rr9"] Jan 20 18:33:56 crc kubenswrapper[4773]: I0120 18:33:56.381292 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-f7578c85f-8k8g4"] Jan 20 18:33:56 crc kubenswrapper[4773]: W0120 18:33:56.387909 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76f300cc_6496_42e7_84ea_d542b110a9a7.slice/crio-e3b1612f40311f6577400f6f1470d7622847a572a44e52d4c622eee0290a10f2 WatchSource:0}: Error finding container e3b1612f40311f6577400f6f1470d7622847a572a44e52d4c622eee0290a10f2: Status 404 returned error can't find the container with id e3b1612f40311f6577400f6f1470d7622847a572a44e52d4c622eee0290a10f2 Jan 20 18:33:56 crc kubenswrapper[4773]: W0120 18:33:56.395002 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0d1c7708_bc85_43d7_ab64_9b2b99a43557.slice/crio-612d9feb2e32bb1e1fe239e9aef6e75b4b35595c64123eff56239d97e8beb615 WatchSource:0}: Error finding container 612d9feb2e32bb1e1fe239e9aef6e75b4b35595c64123eff56239d97e8beb615: Status 404 returned error can't find the container with id 612d9feb2e32bb1e1fe239e9aef6e75b4b35595c64123eff56239d97e8beb615 Jan 20 18:33:57 crc kubenswrapper[4773]: I0120 18:33:57.007316 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-f7578c85f-8k8g4" event={"ID":"76f300cc-6496-42e7-84ea-d542b110a9a7","Type":"ContainerStarted","Data":"fc12f458c70ba48630ffe807737912e0cb47e903d0ec8f933085e5c2e0706a3b"} Jan 20 18:33:57 crc kubenswrapper[4773]: I0120 18:33:57.007756 4773 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-f7578c85f-8k8g4" Jan 20 18:33:57 crc kubenswrapper[4773]: I0120 18:33:57.007794 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-f7578c85f-8k8g4" event={"ID":"76f300cc-6496-42e7-84ea-d542b110a9a7","Type":"ContainerStarted","Data":"e3b1612f40311f6577400f6f1470d7622847a572a44e52d4c622eee0290a10f2"} Jan 20 18:33:57 crc kubenswrapper[4773]: I0120 18:33:57.009161 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7cd86d444d-78rr9" event={"ID":"0d1c7708-bc85-43d7-ab64-9b2b99a43557","Type":"ContainerStarted","Data":"82f8263c71807bdbfb6056daae4d133d9aca910f95392ba71950410bf787b172"} Jan 20 18:33:57 crc kubenswrapper[4773]: I0120 18:33:57.009205 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7cd86d444d-78rr9" event={"ID":"0d1c7708-bc85-43d7-ab64-9b2b99a43557","Type":"ContainerStarted","Data":"612d9feb2e32bb1e1fe239e9aef6e75b4b35595c64123eff56239d97e8beb615"} Jan 20 18:33:57 crc kubenswrapper[4773]: I0120 18:33:57.009455 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7cd86d444d-78rr9" Jan 20 18:33:57 crc kubenswrapper[4773]: I0120 18:33:57.015031 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-f7578c85f-8k8g4" Jan 20 18:33:57 crc kubenswrapper[4773]: I0120 18:33:57.016597 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7cd86d444d-78rr9" Jan 20 18:33:57 crc kubenswrapper[4773]: I0120 18:33:57.052768 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-controller-manager/controller-manager-f7578c85f-8k8g4" podStartSLOduration=4.052743328 podStartE2EDuration="4.052743328s" podCreationTimestamp="2026-01-20 18:33:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:33:57.032548431 +0000 UTC m=+229.954361465" watchObservedRunningTime="2026-01-20 18:33:57.052743328 +0000 UTC m=+229.974556352" Jan 20 18:33:57 crc kubenswrapper[4773]: I0120 18:33:57.089609 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7cd86d444d-78rr9" podStartSLOduration=4.0895873 podStartE2EDuration="4.0895873s" podCreationTimestamp="2026-01-20 18:33:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:33:57.054706858 +0000 UTC m=+229.976519882" watchObservedRunningTime="2026-01-20 18:33:57.0895873 +0000 UTC m=+230.011400324" Jan 20 18:34:08 crc kubenswrapper[4773]: I0120 18:34:08.026271 4773 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Jan 20 18:34:08 crc kubenswrapper[4773]: I0120 18:34:08.027349 4773 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 20 18:34:08 crc kubenswrapper[4773]: I0120 18:34:08.027477 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 20 18:34:08 crc kubenswrapper[4773]: I0120 18:34:08.027666 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://13ef504628d3252d01c26801cf91a4fabdfbd5d8a78e60e9666beafa709816a7" gracePeriod=15 Jan 20 18:34:08 crc kubenswrapper[4773]: I0120 18:34:08.027768 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://05f5db3e67f63a4ed226186eec7eebf95687a59c3d742a7d29b2be3f99543a0b" gracePeriod=15 Jan 20 18:34:08 crc kubenswrapper[4773]: I0120 18:34:08.027782 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://af183a195b31fc8b191dc75db28863447424fb2763e1f51a5f37ce78c4b046e1" gracePeriod=15 Jan 20 18:34:08 crc kubenswrapper[4773]: I0120 18:34:08.027864 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://06ac78bba5a5a697d2246518bb04b7503b37c2fafd2369e5826dd02b467715d0" gracePeriod=15 Jan 20 18:34:08 crc kubenswrapper[4773]: I0120 18:34:08.027843 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://c1a1628ccb969f4cdde3babedb802c79d7bba385260dcefc8140ed674fb423da" gracePeriod=15 Jan 20 18:34:08 crc 
kubenswrapper[4773]: I0120 18:34:08.028232 4773 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 20 18:34:08 crc kubenswrapper[4773]: E0120 18:34:08.028379 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Jan 20 18:34:08 crc kubenswrapper[4773]: I0120 18:34:08.028398 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Jan 20 18:34:08 crc kubenswrapper[4773]: E0120 18:34:08.028409 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 20 18:34:08 crc kubenswrapper[4773]: I0120 18:34:08.028417 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 20 18:34:08 crc kubenswrapper[4773]: E0120 18:34:08.028428 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Jan 20 18:34:08 crc kubenswrapper[4773]: I0120 18:34:08.028435 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Jan 20 18:34:08 crc kubenswrapper[4773]: E0120 18:34:08.028444 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 20 18:34:08 crc kubenswrapper[4773]: I0120 18:34:08.028451 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 20 18:34:08 crc kubenswrapper[4773]: E0120 18:34:08.028465 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-cert-syncer" Jan 20 18:34:08 crc kubenswrapper[4773]: I0120 18:34:08.028473 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Jan 20 18:34:08 crc kubenswrapper[4773]: E0120 18:34:08.028484 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Jan 20 18:34:08 crc kubenswrapper[4773]: I0120 18:34:08.028494 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Jan 20 18:34:08 crc kubenswrapper[4773]: E0120 18:34:08.028503 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Jan 20 18:34:08 crc kubenswrapper[4773]: I0120 18:34:08.028510 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Jan 20 18:34:08 crc kubenswrapper[4773]: I0120 18:34:08.028617 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Jan 20 18:34:08 crc kubenswrapper[4773]: I0120 18:34:08.028630 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Jan 20 18:34:08 crc kubenswrapper[4773]: I0120 18:34:08.028637 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 20 18:34:08 crc kubenswrapper[4773]: I0120 18:34:08.028646 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Jan 20 18:34:08 crc kubenswrapper[4773]: I0120 18:34:08.028657 4773 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Jan 20 18:34:08 crc kubenswrapper[4773]: I0120 18:34:08.028845 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 20 18:34:08 crc kubenswrapper[4773]: I0120 18:34:08.099534 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 20 18:34:08 crc kubenswrapper[4773]: I0120 18:34:08.099602 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 20 18:34:08 crc kubenswrapper[4773]: I0120 18:34:08.099625 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 18:34:08 crc kubenswrapper[4773]: I0120 18:34:08.099647 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 20 18:34:08 crc 
kubenswrapper[4773]: I0120 18:34:08.099666 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 20 18:34:08 crc kubenswrapper[4773]: I0120 18:34:08.099687 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 18:34:08 crc kubenswrapper[4773]: I0120 18:34:08.099711 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 18:34:08 crc kubenswrapper[4773]: I0120 18:34:08.099729 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 20 18:34:08 crc kubenswrapper[4773]: I0120 18:34:08.200439 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 20 
18:34:08 crc kubenswrapper[4773]: I0120 18:34:08.200485 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 18:34:08 crc kubenswrapper[4773]: I0120 18:34:08.200518 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 20 18:34:08 crc kubenswrapper[4773]: I0120 18:34:08.200550 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 20 18:34:08 crc kubenswrapper[4773]: I0120 18:34:08.200556 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 20 18:34:08 crc kubenswrapper[4773]: I0120 18:34:08.200580 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 18:34:08 crc kubenswrapper[4773]: I0120 18:34:08.200587 4773 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 18:34:08 crc kubenswrapper[4773]: I0120 18:34:08.200613 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 18:34:08 crc kubenswrapper[4773]: I0120 18:34:08.200647 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 20 18:34:08 crc kubenswrapper[4773]: I0120 18:34:08.200675 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 18:34:08 crc kubenswrapper[4773]: I0120 18:34:08.200665 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 20 18:34:08 crc kubenswrapper[4773]: I0120 18:34:08.200709 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 20 18:34:08 crc kubenswrapper[4773]: I0120 18:34:08.200694 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 20 18:34:08 crc kubenswrapper[4773]: I0120 18:34:08.200759 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 20 18:34:08 crc kubenswrapper[4773]: I0120 18:34:08.200730 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 18:34:08 crc kubenswrapper[4773]: I0120 18:34:08.200821 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 20 18:34:09 crc kubenswrapper[4773]: I0120 18:34:09.009174 4773 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints 
namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Jan 20 18:34:09 crc kubenswrapper[4773]: I0120 18:34:09.009639 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Jan 20 18:34:09 crc kubenswrapper[4773]: I0120 18:34:09.084476 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Jan 20 18:34:09 crc kubenswrapper[4773]: I0120 18:34:09.086338 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 20 18:34:09 crc kubenswrapper[4773]: I0120 18:34:09.087177 4773 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c1a1628ccb969f4cdde3babedb802c79d7bba385260dcefc8140ed674fb423da" exitCode=0 Jan 20 18:34:09 crc kubenswrapper[4773]: I0120 18:34:09.087233 4773 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="06ac78bba5a5a697d2246518bb04b7503b37c2fafd2369e5826dd02b467715d0" exitCode=0 Jan 20 18:34:09 crc kubenswrapper[4773]: I0120 18:34:09.087249 4773 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="af183a195b31fc8b191dc75db28863447424fb2763e1f51a5f37ce78c4b046e1" exitCode=0 Jan 20 18:34:09 crc kubenswrapper[4773]: I0120 18:34:09.087263 4773 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" 
containerID="05f5db3e67f63a4ed226186eec7eebf95687a59c3d742a7d29b2be3f99543a0b" exitCode=2 Jan 20 18:34:09 crc kubenswrapper[4773]: I0120 18:34:09.087300 4773 scope.go:117] "RemoveContainer" containerID="fab6fab567cc35a379572c4ea5fb0f9472f3634fcede213202390e12c5cf6f2f" Jan 20 18:34:09 crc kubenswrapper[4773]: I0120 18:34:09.089996 4773 generic.go:334] "Generic (PLEG): container finished" podID="29920243-d87d-49b3-9215-680935300c6e" containerID="87629fae04c4c3f2e756aceedc76b8a1d45cb15dd8a81bfaef042af220c86cad" exitCode=0 Jan 20 18:34:09 crc kubenswrapper[4773]: I0120 18:34:09.090075 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"29920243-d87d-49b3-9215-680935300c6e","Type":"ContainerDied","Data":"87629fae04c4c3f2e756aceedc76b8a1d45cb15dd8a81bfaef042af220c86cad"} Jan 20 18:34:09 crc kubenswrapper[4773]: I0120 18:34:09.091451 4773 status_manager.go:851] "Failed to get status for pod" podUID="29920243-d87d-49b3-9215-680935300c6e" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.39:6443: connect: connection refused" Jan 20 18:34:09 crc kubenswrapper[4773]: I0120 18:34:09.091889 4773 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.39:6443: connect: connection refused" Jan 20 18:34:10 crc kubenswrapper[4773]: I0120 18:34:10.127343 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 20 18:34:10 crc kubenswrapper[4773]: I0120 18:34:10.459412 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 20 18:34:10 crc kubenswrapper[4773]: I0120 18:34:10.460861 4773 status_manager.go:851] "Failed to get status for pod" podUID="29920243-d87d-49b3-9215-680935300c6e" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.39:6443: connect: connection refused" Jan 20 18:34:10 crc kubenswrapper[4773]: I0120 18:34:10.545527 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/29920243-d87d-49b3-9215-680935300c6e-kubelet-dir\") pod \"29920243-d87d-49b3-9215-680935300c6e\" (UID: \"29920243-d87d-49b3-9215-680935300c6e\") " Jan 20 18:34:10 crc kubenswrapper[4773]: I0120 18:34:10.545609 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/29920243-d87d-49b3-9215-680935300c6e-kube-api-access\") pod \"29920243-d87d-49b3-9215-680935300c6e\" (UID: \"29920243-d87d-49b3-9215-680935300c6e\") " Jan 20 18:34:10 crc kubenswrapper[4773]: I0120 18:34:10.545655 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/29920243-d87d-49b3-9215-680935300c6e-var-lock\") pod \"29920243-d87d-49b3-9215-680935300c6e\" (UID: \"29920243-d87d-49b3-9215-680935300c6e\") " Jan 20 18:34:10 crc kubenswrapper[4773]: I0120 18:34:10.545649 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/29920243-d87d-49b3-9215-680935300c6e-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "29920243-d87d-49b3-9215-680935300c6e" (UID: "29920243-d87d-49b3-9215-680935300c6e"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 18:34:10 crc kubenswrapper[4773]: I0120 18:34:10.545947 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/29920243-d87d-49b3-9215-680935300c6e-var-lock" (OuterVolumeSpecName: "var-lock") pod "29920243-d87d-49b3-9215-680935300c6e" (UID: "29920243-d87d-49b3-9215-680935300c6e"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 18:34:10 crc kubenswrapper[4773]: I0120 18:34:10.546069 4773 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/29920243-d87d-49b3-9215-680935300c6e-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 20 18:34:10 crc kubenswrapper[4773]: I0120 18:34:10.546124 4773 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/29920243-d87d-49b3-9215-680935300c6e-var-lock\") on node \"crc\" DevicePath \"\"" Jan 20 18:34:10 crc kubenswrapper[4773]: I0120 18:34:10.554520 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29920243-d87d-49b3-9215-680935300c6e-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "29920243-d87d-49b3-9215-680935300c6e" (UID: "29920243-d87d-49b3-9215-680935300c6e"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:34:10 crc kubenswrapper[4773]: I0120 18:34:10.647252 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/29920243-d87d-49b3-9215-680935300c6e-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 20 18:34:10 crc kubenswrapper[4773]: I0120 18:34:10.884268 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 20 18:34:10 crc kubenswrapper[4773]: I0120 18:34:10.884970 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 18:34:10 crc kubenswrapper[4773]: I0120 18:34:10.885896 4773 status_manager.go:851] "Failed to get status for pod" podUID="29920243-d87d-49b3-9215-680935300c6e" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.39:6443: connect: connection refused" Jan 20 18:34:10 crc kubenswrapper[4773]: I0120 18:34:10.886132 4773 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.39:6443: connect: connection refused" Jan 20 18:34:10 crc kubenswrapper[4773]: I0120 18:34:10.949556 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Jan 20 18:34:10 crc kubenswrapper[4773]: I0120 18:34:10.949590 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Jan 20 18:34:10 crc kubenswrapper[4773]: I0120 18:34:10.949645 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Jan 20 18:34:10 crc kubenswrapper[4773]: I0120 18:34:10.949675 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 18:34:10 crc kubenswrapper[4773]: I0120 18:34:10.949726 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 18:34:10 crc kubenswrapper[4773]: I0120 18:34:10.949820 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 18:34:10 crc kubenswrapper[4773]: I0120 18:34:10.949993 4773 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Jan 20 18:34:10 crc kubenswrapper[4773]: I0120 18:34:10.950008 4773 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Jan 20 18:34:10 crc kubenswrapper[4773]: I0120 18:34:10.950017 4773 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Jan 20 18:34:11 crc kubenswrapper[4773]: I0120 18:34:11.136688 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"29920243-d87d-49b3-9215-680935300c6e","Type":"ContainerDied","Data":"425d04b8ab5f331609cc0a966bf16d14189893447909935dd72ec164f756a917"} Jan 20 18:34:11 crc kubenswrapper[4773]: I0120 18:34:11.136718 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 20 18:34:11 crc kubenswrapper[4773]: I0120 18:34:11.136737 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="425d04b8ab5f331609cc0a966bf16d14189893447909935dd72ec164f756a917" Jan 20 18:34:11 crc kubenswrapper[4773]: I0120 18:34:11.139294 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 20 18:34:11 crc kubenswrapper[4773]: I0120 18:34:11.139853 4773 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="13ef504628d3252d01c26801cf91a4fabdfbd5d8a78e60e9666beafa709816a7" exitCode=0 Jan 20 18:34:11 crc kubenswrapper[4773]: I0120 18:34:11.139901 4773 scope.go:117] "RemoveContainer" containerID="c1a1628ccb969f4cdde3babedb802c79d7bba385260dcefc8140ed674fb423da" Jan 20 18:34:11 crc kubenswrapper[4773]: I0120 18:34:11.139978 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 18:34:11 crc kubenswrapper[4773]: I0120 18:34:11.152609 4773 status_manager.go:851] "Failed to get status for pod" podUID="29920243-d87d-49b3-9215-680935300c6e" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.39:6443: connect: connection refused" Jan 20 18:34:11 crc kubenswrapper[4773]: I0120 18:34:11.152996 4773 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.39:6443: connect: connection refused" Jan 20 18:34:11 crc kubenswrapper[4773]: I0120 18:34:11.154059 4773 scope.go:117] "RemoveContainer" containerID="06ac78bba5a5a697d2246518bb04b7503b37c2fafd2369e5826dd02b467715d0" Jan 20 18:34:11 crc kubenswrapper[4773]: I0120 18:34:11.160127 4773 status_manager.go:851] "Failed to get status for pod" podUID="29920243-d87d-49b3-9215-680935300c6e" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.39:6443: connect: connection refused" Jan 20 18:34:11 crc kubenswrapper[4773]: I0120 18:34:11.161983 4773 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.39:6443: connect: connection refused" Jan 20 18:34:11 crc kubenswrapper[4773]: I0120 18:34:11.168665 4773 scope.go:117] "RemoveContainer" containerID="af183a195b31fc8b191dc75db28863447424fb2763e1f51a5f37ce78c4b046e1" Jan 20 18:34:11 crc 
kubenswrapper[4773]: I0120 18:34:11.184247 4773 scope.go:117] "RemoveContainer" containerID="05f5db3e67f63a4ed226186eec7eebf95687a59c3d742a7d29b2be3f99543a0b" Jan 20 18:34:11 crc kubenswrapper[4773]: I0120 18:34:11.201254 4773 scope.go:117] "RemoveContainer" containerID="13ef504628d3252d01c26801cf91a4fabdfbd5d8a78e60e9666beafa709816a7" Jan 20 18:34:11 crc kubenswrapper[4773]: I0120 18:34:11.217497 4773 scope.go:117] "RemoveContainer" containerID="598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554" Jan 20 18:34:11 crc kubenswrapper[4773]: I0120 18:34:11.237327 4773 scope.go:117] "RemoveContainer" containerID="c1a1628ccb969f4cdde3babedb802c79d7bba385260dcefc8140ed674fb423da" Jan 20 18:34:11 crc kubenswrapper[4773]: E0120 18:34:11.237849 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1a1628ccb969f4cdde3babedb802c79d7bba385260dcefc8140ed674fb423da\": container with ID starting with c1a1628ccb969f4cdde3babedb802c79d7bba385260dcefc8140ed674fb423da not found: ID does not exist" containerID="c1a1628ccb969f4cdde3babedb802c79d7bba385260dcefc8140ed674fb423da" Jan 20 18:34:11 crc kubenswrapper[4773]: I0120 18:34:11.237956 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1a1628ccb969f4cdde3babedb802c79d7bba385260dcefc8140ed674fb423da"} err="failed to get container status \"c1a1628ccb969f4cdde3babedb802c79d7bba385260dcefc8140ed674fb423da\": rpc error: code = NotFound desc = could not find container \"c1a1628ccb969f4cdde3babedb802c79d7bba385260dcefc8140ed674fb423da\": container with ID starting with c1a1628ccb969f4cdde3babedb802c79d7bba385260dcefc8140ed674fb423da not found: ID does not exist" Jan 20 18:34:11 crc kubenswrapper[4773]: I0120 18:34:11.237996 4773 scope.go:117] "RemoveContainer" containerID="06ac78bba5a5a697d2246518bb04b7503b37c2fafd2369e5826dd02b467715d0" Jan 20 18:34:11 crc kubenswrapper[4773]: E0120 18:34:11.238437 
4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06ac78bba5a5a697d2246518bb04b7503b37c2fafd2369e5826dd02b467715d0\": container with ID starting with 06ac78bba5a5a697d2246518bb04b7503b37c2fafd2369e5826dd02b467715d0 not found: ID does not exist" containerID="06ac78bba5a5a697d2246518bb04b7503b37c2fafd2369e5826dd02b467715d0" Jan 20 18:34:11 crc kubenswrapper[4773]: I0120 18:34:11.238468 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06ac78bba5a5a697d2246518bb04b7503b37c2fafd2369e5826dd02b467715d0"} err="failed to get container status \"06ac78bba5a5a697d2246518bb04b7503b37c2fafd2369e5826dd02b467715d0\": rpc error: code = NotFound desc = could not find container \"06ac78bba5a5a697d2246518bb04b7503b37c2fafd2369e5826dd02b467715d0\": container with ID starting with 06ac78bba5a5a697d2246518bb04b7503b37c2fafd2369e5826dd02b467715d0 not found: ID does not exist" Jan 20 18:34:11 crc kubenswrapper[4773]: I0120 18:34:11.238487 4773 scope.go:117] "RemoveContainer" containerID="af183a195b31fc8b191dc75db28863447424fb2763e1f51a5f37ce78c4b046e1" Jan 20 18:34:11 crc kubenswrapper[4773]: E0120 18:34:11.238791 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af183a195b31fc8b191dc75db28863447424fb2763e1f51a5f37ce78c4b046e1\": container with ID starting with af183a195b31fc8b191dc75db28863447424fb2763e1f51a5f37ce78c4b046e1 not found: ID does not exist" containerID="af183a195b31fc8b191dc75db28863447424fb2763e1f51a5f37ce78c4b046e1" Jan 20 18:34:11 crc kubenswrapper[4773]: I0120 18:34:11.238859 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af183a195b31fc8b191dc75db28863447424fb2763e1f51a5f37ce78c4b046e1"} err="failed to get container status \"af183a195b31fc8b191dc75db28863447424fb2763e1f51a5f37ce78c4b046e1\": rpc error: code = 
NotFound desc = could not find container \"af183a195b31fc8b191dc75db28863447424fb2763e1f51a5f37ce78c4b046e1\": container with ID starting with af183a195b31fc8b191dc75db28863447424fb2763e1f51a5f37ce78c4b046e1 not found: ID does not exist" Jan 20 18:34:11 crc kubenswrapper[4773]: I0120 18:34:11.238897 4773 scope.go:117] "RemoveContainer" containerID="05f5db3e67f63a4ed226186eec7eebf95687a59c3d742a7d29b2be3f99543a0b" Jan 20 18:34:11 crc kubenswrapper[4773]: E0120 18:34:11.239222 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05f5db3e67f63a4ed226186eec7eebf95687a59c3d742a7d29b2be3f99543a0b\": container with ID starting with 05f5db3e67f63a4ed226186eec7eebf95687a59c3d742a7d29b2be3f99543a0b not found: ID does not exist" containerID="05f5db3e67f63a4ed226186eec7eebf95687a59c3d742a7d29b2be3f99543a0b" Jan 20 18:34:11 crc kubenswrapper[4773]: I0120 18:34:11.239270 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05f5db3e67f63a4ed226186eec7eebf95687a59c3d742a7d29b2be3f99543a0b"} err="failed to get container status \"05f5db3e67f63a4ed226186eec7eebf95687a59c3d742a7d29b2be3f99543a0b\": rpc error: code = NotFound desc = could not find container \"05f5db3e67f63a4ed226186eec7eebf95687a59c3d742a7d29b2be3f99543a0b\": container with ID starting with 05f5db3e67f63a4ed226186eec7eebf95687a59c3d742a7d29b2be3f99543a0b not found: ID does not exist" Jan 20 18:34:11 crc kubenswrapper[4773]: I0120 18:34:11.239303 4773 scope.go:117] "RemoveContainer" containerID="13ef504628d3252d01c26801cf91a4fabdfbd5d8a78e60e9666beafa709816a7" Jan 20 18:34:11 crc kubenswrapper[4773]: E0120 18:34:11.239578 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13ef504628d3252d01c26801cf91a4fabdfbd5d8a78e60e9666beafa709816a7\": container with ID starting with 
13ef504628d3252d01c26801cf91a4fabdfbd5d8a78e60e9666beafa709816a7 not found: ID does not exist" containerID="13ef504628d3252d01c26801cf91a4fabdfbd5d8a78e60e9666beafa709816a7" Jan 20 18:34:11 crc kubenswrapper[4773]: I0120 18:34:11.239626 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13ef504628d3252d01c26801cf91a4fabdfbd5d8a78e60e9666beafa709816a7"} err="failed to get container status \"13ef504628d3252d01c26801cf91a4fabdfbd5d8a78e60e9666beafa709816a7\": rpc error: code = NotFound desc = could not find container \"13ef504628d3252d01c26801cf91a4fabdfbd5d8a78e60e9666beafa709816a7\": container with ID starting with 13ef504628d3252d01c26801cf91a4fabdfbd5d8a78e60e9666beafa709816a7 not found: ID does not exist" Jan 20 18:34:11 crc kubenswrapper[4773]: I0120 18:34:11.239657 4773 scope.go:117] "RemoveContainer" containerID="598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554" Jan 20 18:34:11 crc kubenswrapper[4773]: E0120 18:34:11.240291 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\": container with ID starting with 598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554 not found: ID does not exist" containerID="598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554" Jan 20 18:34:11 crc kubenswrapper[4773]: I0120 18:34:11.240376 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554"} err="failed to get container status \"598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\": rpc error: code = NotFound desc = could not find container \"598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554\": container with ID starting with 598d4fbb56c343dda90581bcc02c3e727ee9876662924585010f5fa93ee1d554 not found: ID does not 
exist" Jan 20 18:34:11 crc kubenswrapper[4773]: I0120 18:34:11.460359 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Jan 20 18:34:11 crc kubenswrapper[4773]: E0120 18:34:11.507886 4773 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.39:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" volumeName="registry-storage" Jan 20 18:34:13 crc kubenswrapper[4773]: E0120 18:34:13.063659 4773 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.39:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 20 18:34:13 crc kubenswrapper[4773]: I0120 18:34:13.064407 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 20 18:34:13 crc kubenswrapper[4773]: W0120 18:34:13.089251 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-29d282612df47d223719541924bebd181432855fdfa2ae4c95a906916f8fdaaa WatchSource:0}: Error finding container 29d282612df47d223719541924bebd181432855fdfa2ae4c95a906916f8fdaaa: Status 404 returned error can't find the container with id 29d282612df47d223719541924bebd181432855fdfa2ae4c95a906916f8fdaaa Jan 20 18:34:13 crc kubenswrapper[4773]: E0120 18:34:13.092397 4773 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.39:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.188c842629a427ed openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-20 18:34:13.092009965 +0000 UTC m=+246.013822989,LastTimestamp:2026-01-20 18:34:13.092009965 +0000 UTC m=+246.013822989,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 20 18:34:13 crc kubenswrapper[4773]: I0120 18:34:13.152827 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"29d282612df47d223719541924bebd181432855fdfa2ae4c95a906916f8fdaaa"} Jan 20 18:34:14 crc kubenswrapper[4773]: I0120 18:34:14.159382 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"ead94a7d10d1f90fc4310de40534fc1abd743b52e61d834ccd031ce2d9f74b37"} Jan 20 18:34:14 crc kubenswrapper[4773]: E0120 18:34:14.160511 4773 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.39:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 20 18:34:14 crc kubenswrapper[4773]: I0120 18:34:14.160540 4773 status_manager.go:851] "Failed to get status for pod" podUID="29920243-d87d-49b3-9215-680935300c6e" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.39:6443: connect: connection refused" Jan 20 18:34:15 crc kubenswrapper[4773]: E0120 18:34:15.163993 4773 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.102.83.39:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 20 18:34:16 crc kubenswrapper[4773]: E0120 18:34:16.921679 4773 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.39:6443: connect: connection refused" Jan 20 18:34:16 crc kubenswrapper[4773]: E0120 18:34:16.922844 4773 controller.go:195] "Failed to 
update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.39:6443: connect: connection refused" Jan 20 18:34:16 crc kubenswrapper[4773]: E0120 18:34:16.923441 4773 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.39:6443: connect: connection refused" Jan 20 18:34:16 crc kubenswrapper[4773]: E0120 18:34:16.923866 4773 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.39:6443: connect: connection refused" Jan 20 18:34:16 crc kubenswrapper[4773]: E0120 18:34:16.924179 4773 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.39:6443: connect: connection refused" Jan 20 18:34:16 crc kubenswrapper[4773]: I0120 18:34:16.924216 4773 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Jan 20 18:34:16 crc kubenswrapper[4773]: E0120 18:34:16.924456 4773 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.39:6443: connect: connection refused" interval="200ms" Jan 20 18:34:17 crc kubenswrapper[4773]: E0120 18:34:17.125910 4773 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.39:6443: connect: connection refused" interval="400ms" Jan 20 18:34:17 crc kubenswrapper[4773]: I0120 18:34:17.448893 
4773 status_manager.go:851] "Failed to get status for pod" podUID="29920243-d87d-49b3-9215-680935300c6e" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.39:6443: connect: connection refused" Jan 20 18:34:17 crc kubenswrapper[4773]: E0120 18:34:17.528474 4773 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.39:6443: connect: connection refused" interval="800ms" Jan 20 18:34:18 crc kubenswrapper[4773]: E0120 18:34:18.329684 4773 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.39:6443: connect: connection refused" interval="1.6s" Jan 20 18:34:19 crc kubenswrapper[4773]: E0120 18:34:19.931141 4773 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.39:6443: connect: connection refused" interval="3.2s" Jan 20 18:34:20 crc kubenswrapper[4773]: E0120 18:34:20.917239 4773 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.39:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.188c842629a427ed openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-20 18:34:13.092009965 +0000 UTC m=+246.013822989,LastTimestamp:2026-01-20 18:34:13.092009965 +0000 UTC m=+246.013822989,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 20 18:34:21 crc kubenswrapper[4773]: I0120 18:34:21.446301 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 18:34:21 crc kubenswrapper[4773]: I0120 18:34:21.447176 4773 status_manager.go:851] "Failed to get status for pod" podUID="29920243-d87d-49b3-9215-680935300c6e" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.39:6443: connect: connection refused" Jan 20 18:34:21 crc kubenswrapper[4773]: I0120 18:34:21.467336 4773 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c9b329af-a6e2-4ba8-b70d-f1ad0cd67671" Jan 20 18:34:21 crc kubenswrapper[4773]: I0120 18:34:21.467407 4773 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c9b329af-a6e2-4ba8-b70d-f1ad0cd67671" Jan 20 18:34:21 crc kubenswrapper[4773]: E0120 18:34:21.468317 4773 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial 
tcp 38.102.83.39:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 18:34:21 crc kubenswrapper[4773]: I0120 18:34:21.468856 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 18:34:22 crc kubenswrapper[4773]: I0120 18:34:22.300294 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Jan 20 18:34:22 crc kubenswrapper[4773]: I0120 18:34:22.300573 4773 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="c5be06527f116faac5a7ce7b2df804239b382a394e11c56eee69fa24739faab3" exitCode=1 Jan 20 18:34:22 crc kubenswrapper[4773]: I0120 18:34:22.300626 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"c5be06527f116faac5a7ce7b2df804239b382a394e11c56eee69fa24739faab3"} Jan 20 18:34:22 crc kubenswrapper[4773]: I0120 18:34:22.301108 4773 scope.go:117] "RemoveContainer" containerID="c5be06527f116faac5a7ce7b2df804239b382a394e11c56eee69fa24739faab3" Jan 20 18:34:22 crc kubenswrapper[4773]: I0120 18:34:22.301893 4773 status_manager.go:851] "Failed to get status for pod" podUID="29920243-d87d-49b3-9215-680935300c6e" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.39:6443: connect: connection refused" Jan 20 18:34:22 crc kubenswrapper[4773]: I0120 18:34:22.302121 4773 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.39:6443: connect: connection refused" Jan 20 18:34:22 crc kubenswrapper[4773]: I0120 18:34:22.304493 4773 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="1f6e095fd190521be17fa722c72a1c13cc689ee9387657247cc7961515edf235" exitCode=0 Jan 20 18:34:22 crc kubenswrapper[4773]: I0120 18:34:22.304526 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"1f6e095fd190521be17fa722c72a1c13cc689ee9387657247cc7961515edf235"} Jan 20 18:34:22 crc kubenswrapper[4773]: I0120 18:34:22.304547 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"7f1b5519ab31867be76cc5e32ab6f786b949224ff11821a9e64b396b0ced0244"} Jan 20 18:34:22 crc kubenswrapper[4773]: I0120 18:34:22.304758 4773 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c9b329af-a6e2-4ba8-b70d-f1ad0cd67671" Jan 20 18:34:22 crc kubenswrapper[4773]: I0120 18:34:22.304780 4773 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c9b329af-a6e2-4ba8-b70d-f1ad0cd67671" Jan 20 18:34:22 crc kubenswrapper[4773]: E0120 18:34:22.305307 4773 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.39:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 18:34:22 crc kubenswrapper[4773]: I0120 18:34:22.305455 4773 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.39:6443: connect: connection refused" Jan 20 18:34:22 crc kubenswrapper[4773]: I0120 18:34:22.306040 4773 status_manager.go:851] "Failed to get status for pod" podUID="29920243-d87d-49b3-9215-680935300c6e" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.39:6443: connect: connection refused" Jan 20 18:34:23 crc kubenswrapper[4773]: I0120 18:34:23.316836 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"7bae06e1d36b5c1a48d7b03dcb92788dafc5bbc46f28fde6fe206552481ca182"} Jan 20 18:34:23 crc kubenswrapper[4773]: I0120 18:34:23.317312 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"175c4314888c1cfd7476913515672d64fea343cf7e42b8c235738032ed3bb660"} Jan 20 18:34:23 crc kubenswrapper[4773]: I0120 18:34:23.317342 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"0ec4d23a83fd229f96b0110e5b6f9775e656e764687107efc5fb3a550b539527"} Jan 20 18:34:23 crc kubenswrapper[4773]: I0120 18:34:23.317358 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"948d8449b8652d87d8f3bd7c7e0012c9110f918045322ea2dfdeef12d9349dbf"} Jan 20 18:34:23 crc kubenswrapper[4773]: I0120 18:34:23.324771 4773 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Jan 20 18:34:23 crc kubenswrapper[4773]: I0120 18:34:23.324837 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"fcf5c8acfe82bdbf72ee4e18bd54560aec372bb9e0cd0499181010c00caac3f3"} Jan 20 18:34:24 crc kubenswrapper[4773]: I0120 18:34:24.333957 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"da815e130dc9b7ec0d00e5842d09e74606a90648632c926e4885b156e56f5cae"} Jan 20 18:34:24 crc kubenswrapper[4773]: I0120 18:34:24.334208 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 18:34:24 crc kubenswrapper[4773]: I0120 18:34:24.334410 4773 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c9b329af-a6e2-4ba8-b70d-f1ad0cd67671" Jan 20 18:34:24 crc kubenswrapper[4773]: I0120 18:34:24.334448 4773 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c9b329af-a6e2-4ba8-b70d-f1ad0cd67671" Jan 20 18:34:26 crc kubenswrapper[4773]: I0120 18:34:26.469183 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 18:34:26 crc kubenswrapper[4773]: I0120 18:34:26.469492 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 18:34:26 crc kubenswrapper[4773]: I0120 18:34:26.477066 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 18:34:26 crc 
kubenswrapper[4773]: I0120 18:34:26.756261 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 20 18:34:27 crc kubenswrapper[4773]: I0120 18:34:27.432287 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 20 18:34:27 crc kubenswrapper[4773]: I0120 18:34:27.439417 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 20 18:34:29 crc kubenswrapper[4773]: I0120 18:34:29.345072 4773 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 18:34:29 crc kubenswrapper[4773]: I0120 18:34:29.361145 4773 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c9b329af-a6e2-4ba8-b70d-f1ad0cd67671" Jan 20 18:34:29 crc kubenswrapper[4773]: I0120 18:34:29.361189 4773 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c9b329af-a6e2-4ba8-b70d-f1ad0cd67671" Jan 20 18:34:29 crc kubenswrapper[4773]: I0120 18:34:29.365468 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 18:34:29 crc kubenswrapper[4773]: I0120 18:34:29.368772 4773 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="bb237516-46e4-442a-9643-74c0fbd4b05a" Jan 20 18:34:30 crc kubenswrapper[4773]: I0120 18:34:30.365080 4773 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c9b329af-a6e2-4ba8-b70d-f1ad0cd67671" Jan 20 18:34:30 crc kubenswrapper[4773]: I0120 18:34:30.365110 4773 mirror_client.go:130] "Deleting a mirror pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="c9b329af-a6e2-4ba8-b70d-f1ad0cd67671" Jan 20 18:34:36 crc kubenswrapper[4773]: I0120 18:34:36.760010 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 20 18:34:37 crc kubenswrapper[4773]: I0120 18:34:37.455207 4773 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="bb237516-46e4-442a-9643-74c0fbd4b05a" Jan 20 18:34:38 crc kubenswrapper[4773]: I0120 18:34:38.899258 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Jan 20 18:34:38 crc kubenswrapper[4773]: I0120 18:34:38.970168 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Jan 20 18:34:39 crc kubenswrapper[4773]: I0120 18:34:39.482640 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Jan 20 18:34:39 crc kubenswrapper[4773]: I0120 18:34:39.565695 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Jan 20 18:34:39 crc kubenswrapper[4773]: I0120 18:34:39.625979 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Jan 20 18:34:40 crc kubenswrapper[4773]: I0120 18:34:40.125055 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Jan 20 18:34:40 crc kubenswrapper[4773]: I0120 18:34:40.171244 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Jan 20 18:34:40 crc kubenswrapper[4773]: I0120 18:34:40.196243 4773 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"v4-0-config-system-router-certs" Jan 20 18:34:40 crc kubenswrapper[4773]: I0120 18:34:40.341007 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Jan 20 18:34:40 crc kubenswrapper[4773]: I0120 18:34:40.477792 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Jan 20 18:34:40 crc kubenswrapper[4773]: I0120 18:34:40.491011 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Jan 20 18:34:40 crc kubenswrapper[4773]: I0120 18:34:40.646344 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Jan 20 18:34:40 crc kubenswrapper[4773]: I0120 18:34:40.679574 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Jan 20 18:34:40 crc kubenswrapper[4773]: I0120 18:34:40.807704 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Jan 20 18:34:40 crc kubenswrapper[4773]: I0120 18:34:40.882419 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Jan 20 18:34:41 crc kubenswrapper[4773]: I0120 18:34:41.052225 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Jan 20 18:34:41 crc kubenswrapper[4773]: I0120 18:34:41.531799 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Jan 20 18:34:41 crc kubenswrapper[4773]: I0120 18:34:41.653852 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Jan 20 18:34:41 crc kubenswrapper[4773]: I0120 18:34:41.711994 4773 reflector.go:368] Caches populated for *v1.RuntimeClass from 
k8s.io/client-go/informers/factory.go:160 Jan 20 18:34:41 crc kubenswrapper[4773]: I0120 18:34:41.719642 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Jan 20 18:34:41 crc kubenswrapper[4773]: I0120 18:34:41.725217 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Jan 20 18:34:41 crc kubenswrapper[4773]: I0120 18:34:41.790601 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Jan 20 18:34:41 crc kubenswrapper[4773]: I0120 18:34:41.829126 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Jan 20 18:34:41 crc kubenswrapper[4773]: I0120 18:34:41.853083 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Jan 20 18:34:41 crc kubenswrapper[4773]: I0120 18:34:41.940440 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Jan 20 18:34:42 crc kubenswrapper[4773]: I0120 18:34:42.031618 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Jan 20 18:34:42 crc kubenswrapper[4773]: I0120 18:34:42.038390 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 20 18:34:42 crc kubenswrapper[4773]: I0120 18:34:42.168846 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Jan 20 18:34:42 crc kubenswrapper[4773]: I0120 18:34:42.286280 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Jan 20 18:34:42 crc kubenswrapper[4773]: I0120 18:34:42.310895 4773 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-console-operator"/"serving-cert" Jan 20 18:34:42 crc kubenswrapper[4773]: I0120 18:34:42.356597 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Jan 20 18:34:42 crc kubenswrapper[4773]: I0120 18:34:42.373334 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 20 18:34:42 crc kubenswrapper[4773]: I0120 18:34:42.429569 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Jan 20 18:34:42 crc kubenswrapper[4773]: I0120 18:34:42.553007 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Jan 20 18:34:42 crc kubenswrapper[4773]: I0120 18:34:42.567861 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Jan 20 18:34:42 crc kubenswrapper[4773]: I0120 18:34:42.603740 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Jan 20 18:34:42 crc kubenswrapper[4773]: I0120 18:34:42.741995 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Jan 20 18:34:42 crc kubenswrapper[4773]: I0120 18:34:42.801879 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Jan 20 18:34:42 crc kubenswrapper[4773]: I0120 18:34:42.975387 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Jan 20 18:34:42 crc kubenswrapper[4773]: I0120 18:34:42.994112 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Jan 20 18:34:43 crc kubenswrapper[4773]: I0120 18:34:43.069209 4773 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Jan 20 18:34:43 crc kubenswrapper[4773]: I0120 18:34:43.094193 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Jan 20 18:34:43 crc kubenswrapper[4773]: I0120 18:34:43.315814 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Jan 20 18:34:43 crc kubenswrapper[4773]: I0120 18:34:43.317750 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Jan 20 18:34:43 crc kubenswrapper[4773]: I0120 18:34:43.350475 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Jan 20 18:34:43 crc kubenswrapper[4773]: I0120 18:34:43.386166 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Jan 20 18:34:43 crc kubenswrapper[4773]: I0120 18:34:43.521742 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Jan 20 18:34:43 crc kubenswrapper[4773]: I0120 18:34:43.545628 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Jan 20 18:34:43 crc kubenswrapper[4773]: I0120 18:34:43.552879 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 20 18:34:43 crc kubenswrapper[4773]: I0120 18:34:43.561760 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Jan 20 18:34:43 crc kubenswrapper[4773]: I0120 18:34:43.617123 4773 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Jan 20 18:34:43 crc kubenswrapper[4773]: I0120 18:34:43.708031 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Jan 20 18:34:43 crc kubenswrapper[4773]: I0120 18:34:43.718040 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Jan 20 18:34:43 crc kubenswrapper[4773]: I0120 18:34:43.788920 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Jan 20 18:34:43 crc kubenswrapper[4773]: I0120 18:34:43.836008 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Jan 20 18:34:43 crc kubenswrapper[4773]: I0120 18:34:43.844861 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Jan 20 18:34:43 crc kubenswrapper[4773]: I0120 18:34:43.875622 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Jan 20 18:34:43 crc kubenswrapper[4773]: I0120 18:34:43.993713 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Jan 20 18:34:44 crc kubenswrapper[4773]: I0120 18:34:44.094886 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Jan 20 18:34:44 crc kubenswrapper[4773]: I0120 18:34:44.252718 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Jan 20 18:34:44 crc kubenswrapper[4773]: I0120 18:34:44.369596 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Jan 20 18:34:44 
crc kubenswrapper[4773]: I0120 18:34:44.372386 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Jan 20 18:34:44 crc kubenswrapper[4773]: I0120 18:34:44.416545 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Jan 20 18:34:44 crc kubenswrapper[4773]: I0120 18:34:44.486055 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Jan 20 18:34:44 crc kubenswrapper[4773]: I0120 18:34:44.505657 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Jan 20 18:34:44 crc kubenswrapper[4773]: I0120 18:34:44.505716 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Jan 20 18:34:44 crc kubenswrapper[4773]: I0120 18:34:44.550873 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Jan 20 18:34:44 crc kubenswrapper[4773]: I0120 18:34:44.569975 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Jan 20 18:34:44 crc kubenswrapper[4773]: I0120 18:34:44.666804 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Jan 20 18:34:44 crc kubenswrapper[4773]: I0120 18:34:44.730815 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Jan 20 18:34:44 crc kubenswrapper[4773]: I0120 18:34:44.736535 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 20 18:34:44 crc kubenswrapper[4773]: I0120 18:34:44.738497 4773 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Jan 20 18:34:44 crc kubenswrapper[4773]: I0120 18:34:44.795786 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Jan 20 18:34:44 crc kubenswrapper[4773]: I0120 18:34:44.830923 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Jan 20 18:34:44 crc kubenswrapper[4773]: I0120 18:34:44.864027 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Jan 20 18:34:44 crc kubenswrapper[4773]: I0120 18:34:44.907829 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Jan 20 18:34:44 crc kubenswrapper[4773]: I0120 18:34:44.910530 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Jan 20 18:34:44 crc kubenswrapper[4773]: I0120 18:34:44.912592 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Jan 20 18:34:44 crc kubenswrapper[4773]: I0120 18:34:44.981876 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Jan 20 18:34:44 crc kubenswrapper[4773]: I0120 18:34:44.990468 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Jan 20 18:34:44 crc kubenswrapper[4773]: I0120 18:34:44.992212 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Jan 20 18:34:45 crc kubenswrapper[4773]: I0120 18:34:45.018290 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Jan 20 18:34:45 crc 
kubenswrapper[4773]: I0120 18:34:45.036227 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Jan 20 18:34:45 crc kubenswrapper[4773]: I0120 18:34:45.056406 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Jan 20 18:34:45 crc kubenswrapper[4773]: I0120 18:34:45.279181 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Jan 20 18:34:45 crc kubenswrapper[4773]: I0120 18:34:45.281243 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 20 18:34:45 crc kubenswrapper[4773]: I0120 18:34:45.316858 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Jan 20 18:34:45 crc kubenswrapper[4773]: I0120 18:34:45.345065 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Jan 20 18:34:45 crc kubenswrapper[4773]: I0120 18:34:45.347382 4773 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Jan 20 18:34:45 crc kubenswrapper[4773]: I0120 18:34:45.347916 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Jan 20 18:34:45 crc kubenswrapper[4773]: I0120 18:34:45.370271 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Jan 20 18:34:45 crc kubenswrapper[4773]: I0120 18:34:45.520540 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Jan 20 18:34:45 crc kubenswrapper[4773]: I0120 18:34:45.592693 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Jan 20 
18:34:45 crc kubenswrapper[4773]: I0120 18:34:45.600846 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Jan 20 18:34:45 crc kubenswrapper[4773]: I0120 18:34:45.684649 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Jan 20 18:34:45 crc kubenswrapper[4773]: I0120 18:34:45.726792 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Jan 20 18:34:45 crc kubenswrapper[4773]: I0120 18:34:45.767527 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Jan 20 18:34:45 crc kubenswrapper[4773]: I0120 18:34:45.772832 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Jan 20 18:34:45 crc kubenswrapper[4773]: I0120 18:34:45.942063 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Jan 20 18:34:45 crc kubenswrapper[4773]: I0120 18:34:45.945406 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Jan 20 18:34:46 crc kubenswrapper[4773]: I0120 18:34:46.012828 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Jan 20 18:34:46 crc kubenswrapper[4773]: I0120 18:34:46.037548 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Jan 20 18:34:46 crc kubenswrapper[4773]: I0120 18:34:46.041238 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Jan 20 18:34:46 crc kubenswrapper[4773]: I0120 18:34:46.071764 4773 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Jan 20 18:34:46 crc kubenswrapper[4773]: I0120 18:34:46.095811 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Jan 20 18:34:46 crc kubenswrapper[4773]: I0120 18:34:46.182775 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Jan 20 18:34:46 crc kubenswrapper[4773]: I0120 18:34:46.273864 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Jan 20 18:34:46 crc kubenswrapper[4773]: I0120 18:34:46.489185 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Jan 20 18:34:46 crc kubenswrapper[4773]: I0120 18:34:46.507677 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Jan 20 18:34:46 crc kubenswrapper[4773]: I0120 18:34:46.599754 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Jan 20 18:34:46 crc kubenswrapper[4773]: I0120 18:34:46.669950 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Jan 20 18:34:46 crc kubenswrapper[4773]: I0120 18:34:46.837429 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Jan 20 18:34:46 crc kubenswrapper[4773]: I0120 18:34:46.837893 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Jan 20 18:34:46 crc kubenswrapper[4773]: I0120 18:34:46.897666 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Jan 20 18:34:46 crc kubenswrapper[4773]: 
I0120 18:34:46.938771 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Jan 20 18:34:47 crc kubenswrapper[4773]: I0120 18:34:47.167728 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Jan 20 18:34:47 crc kubenswrapper[4773]: I0120 18:34:47.187535 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 20 18:34:47 crc kubenswrapper[4773]: I0120 18:34:47.197840 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Jan 20 18:34:47 crc kubenswrapper[4773]: I0120 18:34:47.198554 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Jan 20 18:34:47 crc kubenswrapper[4773]: I0120 18:34:47.227981 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Jan 20 18:34:47 crc kubenswrapper[4773]: I0120 18:34:47.242994 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Jan 20 18:34:47 crc kubenswrapper[4773]: I0120 18:34:47.264682 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Jan 20 18:34:47 crc kubenswrapper[4773]: I0120 18:34:47.289652 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Jan 20 18:34:47 crc kubenswrapper[4773]: I0120 18:34:47.346785 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Jan 20 18:34:47 crc kubenswrapper[4773]: I0120 18:34:47.421534 4773 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ingress-canary"/"openshift-service-ca.crt" Jan 20 18:34:47 crc kubenswrapper[4773]: I0120 18:34:47.449494 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Jan 20 18:34:47 crc kubenswrapper[4773]: I0120 18:34:47.467973 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 20 18:34:47 crc kubenswrapper[4773]: I0120 18:34:47.483922 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 20 18:34:47 crc kubenswrapper[4773]: I0120 18:34:47.502624 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Jan 20 18:34:47 crc kubenswrapper[4773]: I0120 18:34:47.513087 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Jan 20 18:34:47 crc kubenswrapper[4773]: I0120 18:34:47.533773 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Jan 20 18:34:47 crc kubenswrapper[4773]: I0120 18:34:47.551050 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Jan 20 18:34:47 crc kubenswrapper[4773]: I0120 18:34:47.625890 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Jan 20 18:34:47 crc kubenswrapper[4773]: I0120 18:34:47.645436 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Jan 20 18:34:47 crc kubenswrapper[4773]: I0120 18:34:47.652206 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Jan 20 18:34:47 crc kubenswrapper[4773]: I0120 18:34:47.669274 4773 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Jan 20 18:34:47 crc kubenswrapper[4773]: I0120 18:34:47.712854 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Jan 20 18:34:47 crc kubenswrapper[4773]: I0120 18:34:47.734023 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Jan 20 18:34:47 crc kubenswrapper[4773]: I0120 18:34:47.788652 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Jan 20 18:34:47 crc kubenswrapper[4773]: I0120 18:34:47.789229 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Jan 20 18:34:47 crc kubenswrapper[4773]: I0120 18:34:47.789872 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Jan 20 18:34:47 crc kubenswrapper[4773]: I0120 18:34:47.803291 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Jan 20 18:34:47 crc kubenswrapper[4773]: I0120 18:34:47.811547 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Jan 20 18:34:47 crc kubenswrapper[4773]: I0120 18:34:47.847300 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Jan 20 18:34:47 crc kubenswrapper[4773]: I0120 18:34:47.851341 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Jan 20 18:34:48 crc kubenswrapper[4773]: I0120 18:34:48.077839 4773 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 20 18:34:48 crc kubenswrapper[4773]: I0120 18:34:48.115458 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Jan 20 18:34:48 crc kubenswrapper[4773]: I0120 18:34:48.145141 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Jan 20 18:34:48 crc kubenswrapper[4773]: I0120 18:34:48.204497 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Jan 20 18:34:48 crc kubenswrapper[4773]: I0120 18:34:48.218952 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Jan 20 18:34:48 crc kubenswrapper[4773]: I0120 18:34:48.284579 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Jan 20 18:34:48 crc kubenswrapper[4773]: I0120 18:34:48.322241 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Jan 20 18:34:48 crc kubenswrapper[4773]: I0120 18:34:48.414628 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Jan 20 18:34:48 crc kubenswrapper[4773]: I0120 18:34:48.632640 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Jan 20 18:34:48 crc kubenswrapper[4773]: I0120 18:34:48.676702 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Jan 20 18:34:48 crc kubenswrapper[4773]: I0120 18:34:48.725756 4773 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Jan 20 18:34:48 crc kubenswrapper[4773]: I0120 18:34:48.844089 4773 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Jan 20 18:34:48 crc kubenswrapper[4773]: I0120 18:34:48.909524 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Jan 20 18:34:49 crc kubenswrapper[4773]: I0120 18:34:48.919515 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Jan 20 18:34:49 crc kubenswrapper[4773]: I0120 18:34:49.130428 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Jan 20 18:34:49 crc kubenswrapper[4773]: I0120 18:34:49.130443 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Jan 20 18:34:49 crc kubenswrapper[4773]: I0120 18:34:49.170452 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Jan 20 18:34:49 crc kubenswrapper[4773]: I0120 18:34:49.171715 4773 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Jan 20 18:34:49 crc kubenswrapper[4773]: I0120 18:34:49.262377 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 20 18:34:49 crc kubenswrapper[4773]: I0120 18:34:49.326973 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Jan 20 18:34:49 crc kubenswrapper[4773]: I0120 18:34:49.468702 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Jan 20 18:34:49 crc kubenswrapper[4773]: I0120 18:34:49.486895 4773 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Jan 20 18:34:49 crc kubenswrapper[4773]: I0120 18:34:49.533403 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Jan 20 18:34:49 crc kubenswrapper[4773]: I0120 18:34:49.598919 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Jan 20 18:34:49 crc kubenswrapper[4773]: I0120 18:34:49.669430 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Jan 20 18:34:49 crc kubenswrapper[4773]: I0120 18:34:49.774878 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Jan 20 18:34:50 crc kubenswrapper[4773]: I0120 18:34:50.016844 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Jan 20 18:34:50 crc kubenswrapper[4773]: I0120 18:34:50.207015 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Jan 20 18:34:50 crc kubenswrapper[4773]: I0120 18:34:50.242355 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Jan 20 18:34:50 crc kubenswrapper[4773]: I0120 18:34:50.328658 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Jan 20 18:34:50 crc kubenswrapper[4773]: I0120 18:34:50.353595 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Jan 20 18:34:50 crc kubenswrapper[4773]: I0120 18:34:50.414831 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Jan 20 18:34:50 crc kubenswrapper[4773]: I0120 18:34:50.464711 4773 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Jan 20 18:34:50 crc kubenswrapper[4773]: I0120 18:34:50.594498 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Jan 20 18:34:50 crc kubenswrapper[4773]: I0120 18:34:50.624043 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Jan 20 18:34:50 crc kubenswrapper[4773]: I0120 18:34:50.648199 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Jan 20 18:34:50 crc kubenswrapper[4773]: I0120 18:34:50.731751 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Jan 20 18:34:50 crc kubenswrapper[4773]: I0120 18:34:50.866121 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Jan 20 18:34:50 crc kubenswrapper[4773]: I0120 18:34:50.892418 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Jan 20 18:34:50 crc kubenswrapper[4773]: I0120 18:34:50.947315 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Jan 20 18:34:50 crc kubenswrapper[4773]: I0120 18:34:50.997425 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Jan 20 18:34:51 crc kubenswrapper[4773]: I0120 18:34:51.079336 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Jan 20 18:34:51 crc kubenswrapper[4773]: I0120 18:34:51.151644 4773 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Jan 20 18:34:51 crc kubenswrapper[4773]: I0120 18:34:51.163115 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Jan 20 18:34:51 crc kubenswrapper[4773]: I0120 18:34:51.195148 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Jan 20 18:34:51 crc kubenswrapper[4773]: I0120 18:34:51.210093 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Jan 20 18:34:51 crc kubenswrapper[4773]: I0120 18:34:51.305107 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Jan 20 18:34:51 crc kubenswrapper[4773]: I0120 18:34:51.368375 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Jan 20 18:34:51 crc kubenswrapper[4773]: I0120 18:34:51.373707 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Jan 20 18:34:51 crc kubenswrapper[4773]: I0120 18:34:51.425812 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Jan 20 18:34:51 crc kubenswrapper[4773]: I0120 18:34:51.481077 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Jan 20 18:34:51 crc kubenswrapper[4773]: I0120 18:34:51.542489 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Jan 20 18:34:51 crc kubenswrapper[4773]: I0120 18:34:51.578062 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Jan 20 18:34:51 crc kubenswrapper[4773]: I0120 
18:34:51.585913 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Jan 20 18:34:51 crc kubenswrapper[4773]: I0120 18:34:51.592566 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Jan 20 18:34:51 crc kubenswrapper[4773]: I0120 18:34:51.731104 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Jan 20 18:34:51 crc kubenswrapper[4773]: I0120 18:34:51.746226 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Jan 20 18:34:51 crc kubenswrapper[4773]: I0120 18:34:51.747691 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Jan 20 18:34:51 crc kubenswrapper[4773]: I0120 18:34:51.808079 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Jan 20 18:34:51 crc kubenswrapper[4773]: I0120 18:34:51.868324 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Jan 20 18:34:51 crc kubenswrapper[4773]: I0120 18:34:51.878372 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Jan 20 18:34:51 crc kubenswrapper[4773]: I0120 18:34:51.880410 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Jan 20 18:34:51 crc kubenswrapper[4773]: I0120 18:34:51.901891 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Jan 20 18:34:52 crc kubenswrapper[4773]: I0120 18:34:52.004111 4773 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-service-ca"/"signing-key" Jan 20 18:34:52 crc kubenswrapper[4773]: I0120 18:34:52.056596 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Jan 20 18:34:52 crc kubenswrapper[4773]: I0120 18:34:52.080395 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Jan 20 18:34:52 crc kubenswrapper[4773]: I0120 18:34:52.121202 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Jan 20 18:34:52 crc kubenswrapper[4773]: I0120 18:34:52.137216 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Jan 20 18:34:52 crc kubenswrapper[4773]: I0120 18:34:52.266688 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Jan 20 18:34:52 crc kubenswrapper[4773]: I0120 18:34:52.335371 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Jan 20 18:34:52 crc kubenswrapper[4773]: I0120 18:34:52.369286 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Jan 20 18:34:52 crc kubenswrapper[4773]: I0120 18:34:52.514401 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Jan 20 18:34:52 crc kubenswrapper[4773]: I0120 18:34:52.674895 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Jan 20 18:34:52 crc kubenswrapper[4773]: I0120 18:34:52.675206 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Jan 20 18:34:52 crc kubenswrapper[4773]: I0120 18:34:52.703109 4773 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Jan 20 18:34:52 crc kubenswrapper[4773]: I0120 18:34:52.711817 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Jan 20 18:34:52 crc kubenswrapper[4773]: I0120 18:34:52.779278 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Jan 20 18:34:52 crc kubenswrapper[4773]: I0120 18:34:52.785004 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Jan 20 18:34:52 crc kubenswrapper[4773]: I0120 18:34:52.841231 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Jan 20 18:34:52 crc kubenswrapper[4773]: I0120 18:34:52.849785 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Jan 20 18:34:52 crc kubenswrapper[4773]: I0120 18:34:52.912329 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 20 18:34:52 crc kubenswrapper[4773]: I0120 18:34:52.987471 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Jan 20 18:34:53 crc kubenswrapper[4773]: I0120 18:34:53.041074 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 20 18:34:53 crc kubenswrapper[4773]: I0120 18:34:53.234770 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Jan 20 18:34:53 crc kubenswrapper[4773]: I0120 18:34:53.313271 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Jan 20 18:34:53 crc kubenswrapper[4773]: I0120 
18:34:53.350835 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Jan 20 18:34:53 crc kubenswrapper[4773]: I0120 18:34:53.410904 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Jan 20 18:34:53 crc kubenswrapper[4773]: I0120 18:34:53.460043 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Jan 20 18:34:53 crc kubenswrapper[4773]: I0120 18:34:53.487300 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Jan 20 18:34:53 crc kubenswrapper[4773]: I0120 18:34:53.564025 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Jan 20 18:34:53 crc kubenswrapper[4773]: I0120 18:34:53.572572 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Jan 20 18:34:53 crc kubenswrapper[4773]: I0120 18:34:53.606419 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 20 18:34:53 crc kubenswrapper[4773]: I0120 18:34:53.669183 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Jan 20 18:34:53 crc kubenswrapper[4773]: I0120 18:34:53.704317 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Jan 20 18:34:53 crc kubenswrapper[4773]: I0120 18:34:53.909438 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Jan 20 18:34:53 crc kubenswrapper[4773]: I0120 18:34:53.959718 4773 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ovn-kubernetes"/"env-overrides" Jan 20 18:34:53 crc kubenswrapper[4773]: I0120 18:34:53.963466 4773 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Jan 20 18:34:53 crc kubenswrapper[4773]: I0120 18:34:53.967787 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 20 18:34:53 crc kubenswrapper[4773]: I0120 18:34:53.967838 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 20 18:34:53 crc kubenswrapper[4773]: I0120 18:34:53.973042 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 20 18:34:53 crc kubenswrapper[4773]: I0120 18:34:53.981400 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Jan 20 18:34:53 crc kubenswrapper[4773]: I0120 18:34:53.986153 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=24.986136936 podStartE2EDuration="24.986136936s" podCreationTimestamp="2026-01-20 18:34:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:34:53.985504521 +0000 UTC m=+286.907317555" watchObservedRunningTime="2026-01-20 18:34:53.986136936 +0000 UTC m=+286.907949960" Jan 20 18:34:54 crc kubenswrapper[4773]: I0120 18:34:54.564497 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Jan 20 18:34:54 crc kubenswrapper[4773]: I0120 18:34:54.826673 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Jan 20 18:34:55 crc kubenswrapper[4773]: I0120 18:34:55.019175 4773 reflector.go:368] Caches populated for *v1.CSIDriver 
from k8s.io/client-go/informers/factory.go:160 Jan 20 18:34:55 crc kubenswrapper[4773]: I0120 18:34:55.155804 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Jan 20 18:34:55 crc kubenswrapper[4773]: I0120 18:34:55.697431 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Jan 20 18:34:56 crc kubenswrapper[4773]: I0120 18:34:56.196085 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Jan 20 18:34:56 crc kubenswrapper[4773]: I0120 18:34:56.315713 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Jan 20 18:34:56 crc kubenswrapper[4773]: I0120 18:34:56.340078 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Jan 20 18:34:56 crc kubenswrapper[4773]: I0120 18:34:56.612248 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Jan 20 18:35:03 crc kubenswrapper[4773]: I0120 18:35:03.195315 4773 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Jan 20 18:35:03 crc kubenswrapper[4773]: I0120 18:35:03.195946 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://ead94a7d10d1f90fc4310de40534fc1abd743b52e61d834ccd031ce2d9f74b37" gracePeriod=5 Jan 20 18:35:07 crc kubenswrapper[4773]: I0120 18:35:07.261241 4773 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials Jan 20 18:35:08 crc kubenswrapper[4773]: I0120 18:35:08.623457 4773 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Jan 20 18:35:08 crc kubenswrapper[4773]: I0120 18:35:08.623801 4773 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="ead94a7d10d1f90fc4310de40534fc1abd743b52e61d834ccd031ce2d9f74b37" exitCode=137 Jan 20 18:35:08 crc kubenswrapper[4773]: I0120 18:35:08.765307 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Jan 20 18:35:08 crc kubenswrapper[4773]: I0120 18:35:08.765567 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 20 18:35:08 crc kubenswrapper[4773]: I0120 18:35:08.925771 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 20 18:35:08 crc kubenswrapper[4773]: I0120 18:35:08.926115 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 20 18:35:08 crc kubenswrapper[4773]: I0120 18:35:08.926164 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 18:35:08 crc kubenswrapper[4773]: I0120 18:35:08.926306 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 20 18:35:08 crc kubenswrapper[4773]: I0120 18:35:08.926407 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 20 18:35:08 crc kubenswrapper[4773]: I0120 18:35:08.926333 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 18:35:08 crc kubenswrapper[4773]: I0120 18:35:08.926458 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 18:35:08 crc kubenswrapper[4773]: I0120 18:35:08.926508 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 20 18:35:08 crc kubenswrapper[4773]: I0120 18:35:08.926715 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 18:35:08 crc kubenswrapper[4773]: I0120 18:35:08.926786 4773 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Jan 20 18:35:08 crc kubenswrapper[4773]: I0120 18:35:08.926880 4773 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Jan 20 18:35:08 crc kubenswrapper[4773]: I0120 18:35:08.927008 4773 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Jan 20 18:35:08 crc kubenswrapper[4773]: I0120 18:35:08.934561 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 18:35:09 crc kubenswrapper[4773]: I0120 18:35:09.027742 4773 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Jan 20 18:35:09 crc kubenswrapper[4773]: I0120 18:35:09.027798 4773 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Jan 20 18:35:09 crc kubenswrapper[4773]: I0120 18:35:09.452646 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Jan 20 18:35:09 crc kubenswrapper[4773]: I0120 18:35:09.636000 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Jan 20 18:35:09 crc kubenswrapper[4773]: I0120 18:35:09.636636 4773 scope.go:117] "RemoveContainer" containerID="ead94a7d10d1f90fc4310de40534fc1abd743b52e61d834ccd031ce2d9f74b37" Jan 20 18:35:09 crc kubenswrapper[4773]: I0120 18:35:09.636751 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 20 18:35:58 crc kubenswrapper[4773]: I0120 18:35:58.170091 4773 patch_prober.go:28] interesting pod/machine-config-daemon-sq4x7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 18:35:58 crc kubenswrapper[4773]: I0120 18:35:58.170720 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 18:36:24 crc kubenswrapper[4773]: I0120 18:36:24.666527 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-p2lbk"] Jan 20 18:36:24 crc kubenswrapper[4773]: E0120 18:36:24.667620 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29920243-d87d-49b3-9215-680935300c6e" containerName="installer" Jan 20 18:36:24 crc kubenswrapper[4773]: I0120 18:36:24.667643 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="29920243-d87d-49b3-9215-680935300c6e" containerName="installer" Jan 20 18:36:24 crc kubenswrapper[4773]: E0120 18:36:24.667674 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Jan 20 18:36:24 crc kubenswrapper[4773]: I0120 18:36:24.667685 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Jan 20 18:36:24 crc kubenswrapper[4773]: I0120 18:36:24.667854 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Jan 20 
18:36:24 crc kubenswrapper[4773]: I0120 18:36:24.667869 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="29920243-d87d-49b3-9215-680935300c6e" containerName="installer" Jan 20 18:36:24 crc kubenswrapper[4773]: I0120 18:36:24.668457 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-p2lbk" Jan 20 18:36:24 crc kubenswrapper[4773]: I0120 18:36:24.700355 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-p2lbk"] Jan 20 18:36:24 crc kubenswrapper[4773]: I0120 18:36:24.767101 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f61a15ab-aa00-47cf-8385-57f3e148832e-bound-sa-token\") pod \"image-registry-66df7c8f76-p2lbk\" (UID: \"f61a15ab-aa00-47cf-8385-57f3e148832e\") " pod="openshift-image-registry/image-registry-66df7c8f76-p2lbk" Jan 20 18:36:24 crc kubenswrapper[4773]: I0120 18:36:24.767170 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f61a15ab-aa00-47cf-8385-57f3e148832e-registry-certificates\") pod \"image-registry-66df7c8f76-p2lbk\" (UID: \"f61a15ab-aa00-47cf-8385-57f3e148832e\") " pod="openshift-image-registry/image-registry-66df7c8f76-p2lbk" Jan 20 18:36:24 crc kubenswrapper[4773]: I0120 18:36:24.767223 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f61a15ab-aa00-47cf-8385-57f3e148832e-trusted-ca\") pod \"image-registry-66df7c8f76-p2lbk\" (UID: \"f61a15ab-aa00-47cf-8385-57f3e148832e\") " pod="openshift-image-registry/image-registry-66df7c8f76-p2lbk" Jan 20 18:36:24 crc kubenswrapper[4773]: I0120 18:36:24.767250 4773 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f61a15ab-aa00-47cf-8385-57f3e148832e-registry-tls\") pod \"image-registry-66df7c8f76-p2lbk\" (UID: \"f61a15ab-aa00-47cf-8385-57f3e148832e\") " pod="openshift-image-registry/image-registry-66df7c8f76-p2lbk" Jan 20 18:36:24 crc kubenswrapper[4773]: I0120 18:36:24.767419 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-p2lbk\" (UID: \"f61a15ab-aa00-47cf-8385-57f3e148832e\") " pod="openshift-image-registry/image-registry-66df7c8f76-p2lbk" Jan 20 18:36:24 crc kubenswrapper[4773]: I0120 18:36:24.767466 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f61a15ab-aa00-47cf-8385-57f3e148832e-ca-trust-extracted\") pod \"image-registry-66df7c8f76-p2lbk\" (UID: \"f61a15ab-aa00-47cf-8385-57f3e148832e\") " pod="openshift-image-registry/image-registry-66df7c8f76-p2lbk" Jan 20 18:36:24 crc kubenswrapper[4773]: I0120 18:36:24.767574 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nk5lf\" (UniqueName: \"kubernetes.io/projected/f61a15ab-aa00-47cf-8385-57f3e148832e-kube-api-access-nk5lf\") pod \"image-registry-66df7c8f76-p2lbk\" (UID: \"f61a15ab-aa00-47cf-8385-57f3e148832e\") " pod="openshift-image-registry/image-registry-66df7c8f76-p2lbk" Jan 20 18:36:24 crc kubenswrapper[4773]: I0120 18:36:24.767694 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f61a15ab-aa00-47cf-8385-57f3e148832e-installation-pull-secrets\") pod \"image-registry-66df7c8f76-p2lbk\" 
(UID: \"f61a15ab-aa00-47cf-8385-57f3e148832e\") " pod="openshift-image-registry/image-registry-66df7c8f76-p2lbk" Jan 20 18:36:24 crc kubenswrapper[4773]: I0120 18:36:24.786449 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-p2lbk\" (UID: \"f61a15ab-aa00-47cf-8385-57f3e148832e\") " pod="openshift-image-registry/image-registry-66df7c8f76-p2lbk" Jan 20 18:36:24 crc kubenswrapper[4773]: I0120 18:36:24.868851 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nk5lf\" (UniqueName: \"kubernetes.io/projected/f61a15ab-aa00-47cf-8385-57f3e148832e-kube-api-access-nk5lf\") pod \"image-registry-66df7c8f76-p2lbk\" (UID: \"f61a15ab-aa00-47cf-8385-57f3e148832e\") " pod="openshift-image-registry/image-registry-66df7c8f76-p2lbk" Jan 20 18:36:24 crc kubenswrapper[4773]: I0120 18:36:24.868913 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f61a15ab-aa00-47cf-8385-57f3e148832e-installation-pull-secrets\") pod \"image-registry-66df7c8f76-p2lbk\" (UID: \"f61a15ab-aa00-47cf-8385-57f3e148832e\") " pod="openshift-image-registry/image-registry-66df7c8f76-p2lbk" Jan 20 18:36:24 crc kubenswrapper[4773]: I0120 18:36:24.868948 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f61a15ab-aa00-47cf-8385-57f3e148832e-bound-sa-token\") pod \"image-registry-66df7c8f76-p2lbk\" (UID: \"f61a15ab-aa00-47cf-8385-57f3e148832e\") " pod="openshift-image-registry/image-registry-66df7c8f76-p2lbk" Jan 20 18:36:24 crc kubenswrapper[4773]: I0120 18:36:24.868977 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" 
(UniqueName: \"kubernetes.io/configmap/f61a15ab-aa00-47cf-8385-57f3e148832e-registry-certificates\") pod \"image-registry-66df7c8f76-p2lbk\" (UID: \"f61a15ab-aa00-47cf-8385-57f3e148832e\") " pod="openshift-image-registry/image-registry-66df7c8f76-p2lbk" Jan 20 18:36:24 crc kubenswrapper[4773]: I0120 18:36:24.868993 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f61a15ab-aa00-47cf-8385-57f3e148832e-trusted-ca\") pod \"image-registry-66df7c8f76-p2lbk\" (UID: \"f61a15ab-aa00-47cf-8385-57f3e148832e\") " pod="openshift-image-registry/image-registry-66df7c8f76-p2lbk" Jan 20 18:36:24 crc kubenswrapper[4773]: I0120 18:36:24.869011 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f61a15ab-aa00-47cf-8385-57f3e148832e-registry-tls\") pod \"image-registry-66df7c8f76-p2lbk\" (UID: \"f61a15ab-aa00-47cf-8385-57f3e148832e\") " pod="openshift-image-registry/image-registry-66df7c8f76-p2lbk" Jan 20 18:36:24 crc kubenswrapper[4773]: I0120 18:36:24.869034 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f61a15ab-aa00-47cf-8385-57f3e148832e-ca-trust-extracted\") pod \"image-registry-66df7c8f76-p2lbk\" (UID: \"f61a15ab-aa00-47cf-8385-57f3e148832e\") " pod="openshift-image-registry/image-registry-66df7c8f76-p2lbk" Jan 20 18:36:24 crc kubenswrapper[4773]: I0120 18:36:24.869460 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f61a15ab-aa00-47cf-8385-57f3e148832e-ca-trust-extracted\") pod \"image-registry-66df7c8f76-p2lbk\" (UID: \"f61a15ab-aa00-47cf-8385-57f3e148832e\") " pod="openshift-image-registry/image-registry-66df7c8f76-p2lbk" Jan 20 18:36:24 crc kubenswrapper[4773]: I0120 18:36:24.870375 4773 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f61a15ab-aa00-47cf-8385-57f3e148832e-trusted-ca\") pod \"image-registry-66df7c8f76-p2lbk\" (UID: \"f61a15ab-aa00-47cf-8385-57f3e148832e\") " pod="openshift-image-registry/image-registry-66df7c8f76-p2lbk" Jan 20 18:36:24 crc kubenswrapper[4773]: I0120 18:36:24.870980 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f61a15ab-aa00-47cf-8385-57f3e148832e-registry-certificates\") pod \"image-registry-66df7c8f76-p2lbk\" (UID: \"f61a15ab-aa00-47cf-8385-57f3e148832e\") " pod="openshift-image-registry/image-registry-66df7c8f76-p2lbk" Jan 20 18:36:24 crc kubenswrapper[4773]: I0120 18:36:24.874069 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f61a15ab-aa00-47cf-8385-57f3e148832e-installation-pull-secrets\") pod \"image-registry-66df7c8f76-p2lbk\" (UID: \"f61a15ab-aa00-47cf-8385-57f3e148832e\") " pod="openshift-image-registry/image-registry-66df7c8f76-p2lbk" Jan 20 18:36:24 crc kubenswrapper[4773]: I0120 18:36:24.874119 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f61a15ab-aa00-47cf-8385-57f3e148832e-registry-tls\") pod \"image-registry-66df7c8f76-p2lbk\" (UID: \"f61a15ab-aa00-47cf-8385-57f3e148832e\") " pod="openshift-image-registry/image-registry-66df7c8f76-p2lbk" Jan 20 18:36:24 crc kubenswrapper[4773]: I0120 18:36:24.886594 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nk5lf\" (UniqueName: \"kubernetes.io/projected/f61a15ab-aa00-47cf-8385-57f3e148832e-kube-api-access-nk5lf\") pod \"image-registry-66df7c8f76-p2lbk\" (UID: \"f61a15ab-aa00-47cf-8385-57f3e148832e\") " pod="openshift-image-registry/image-registry-66df7c8f76-p2lbk" Jan 20 18:36:24 crc kubenswrapper[4773]: I0120 
18:36:24.889285 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f61a15ab-aa00-47cf-8385-57f3e148832e-bound-sa-token\") pod \"image-registry-66df7c8f76-p2lbk\" (UID: \"f61a15ab-aa00-47cf-8385-57f3e148832e\") " pod="openshift-image-registry/image-registry-66df7c8f76-p2lbk" Jan 20 18:36:24 crc kubenswrapper[4773]: I0120 18:36:24.988854 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-p2lbk" Jan 20 18:36:25 crc kubenswrapper[4773]: I0120 18:36:25.169050 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-p2lbk"] Jan 20 18:36:25 crc kubenswrapper[4773]: W0120 18:36:25.174193 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf61a15ab_aa00_47cf_8385_57f3e148832e.slice/crio-bca4bf81d9803133a59d7a85b29d3f70a372ebc6dc49d0d9ee417d28afcf6594 WatchSource:0}: Error finding container bca4bf81d9803133a59d7a85b29d3f70a372ebc6dc49d0d9ee417d28afcf6594: Status 404 returned error can't find the container with id bca4bf81d9803133a59d7a85b29d3f70a372ebc6dc49d0d9ee417d28afcf6594 Jan 20 18:36:26 crc kubenswrapper[4773]: I0120 18:36:26.041488 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-p2lbk" event={"ID":"f61a15ab-aa00-47cf-8385-57f3e148832e","Type":"ContainerStarted","Data":"3bc63f71dd80f941c8bf8b10d174ceb11ea94c90531a13b46795b9c38db75e0d"} Jan 20 18:36:26 crc kubenswrapper[4773]: I0120 18:36:26.041904 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-p2lbk" event={"ID":"f61a15ab-aa00-47cf-8385-57f3e148832e","Type":"ContainerStarted","Data":"bca4bf81d9803133a59d7a85b29d3f70a372ebc6dc49d0d9ee417d28afcf6594"} Jan 20 18:36:26 crc kubenswrapper[4773]: I0120 
18:36:26.042149 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-p2lbk" Jan 20 18:36:26 crc kubenswrapper[4773]: I0120 18:36:26.063292 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-p2lbk" podStartSLOduration=2.06327423 podStartE2EDuration="2.06327423s" podCreationTimestamp="2026-01-20 18:36:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:36:26.063106557 +0000 UTC m=+378.984919571" watchObservedRunningTime="2026-01-20 18:36:26.06327423 +0000 UTC m=+378.985087254" Jan 20 18:36:28 crc kubenswrapper[4773]: I0120 18:36:28.170596 4773 patch_prober.go:28] interesting pod/machine-config-daemon-sq4x7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 18:36:28 crc kubenswrapper[4773]: I0120 18:36:28.171215 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 18:36:38 crc kubenswrapper[4773]: I0120 18:36:38.965575 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-v75d6"] Jan 20 18:36:38 crc kubenswrapper[4773]: I0120 18:36:38.966750 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-v75d6" podUID="074f367d-7a48-4046-a679-9a2d38111b8a" containerName="registry-server" 
containerID="cri-o://20afc5cca64346d7cedcc97043ec4130c07cbb4a3ad2fa66f0da171dcc8edd41" gracePeriod=30 Jan 20 18:36:38 crc kubenswrapper[4773]: I0120 18:36:38.980035 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2qdpl"] Jan 20 18:36:38 crc kubenswrapper[4773]: I0120 18:36:38.980513 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-2qdpl" podUID="8923f3c0-0b58-4097-aa87-9df34cf90e41" containerName="registry-server" containerID="cri-o://042139096fac743c4bffa6cf49536d998523d940f380114bd1fdad3392d0743d" gracePeriod=30 Jan 20 18:36:38 crc kubenswrapper[4773]: I0120 18:36:38.987459 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-ff9dd"] Jan 20 18:36:38 crc kubenswrapper[4773]: I0120 18:36:38.995550 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-ff9dd" podUID="1181b3a9-8cf9-46ad-9b41-62d32ffe7a85" containerName="marketplace-operator" containerID="cri-o://cb0111d26068d7530fc1ab48948aef101d8e352e9c86e33d1e7d039ec97d0ab3" gracePeriod=30 Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:38.999206 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-w4bcd"] Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:38.999524 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-w4bcd" podUID="c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2" containerName="registry-server" containerID="cri-o://f73bf4550870764bd71e43342e3a9b1626bd2febed0bfc8cfab7c1eb1fcc0b07" gracePeriod=30 Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.025003 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-kcc74"] Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 
18:36:39.025993 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-kcc74" Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.034170 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fm4ln"] Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.034370 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-fm4ln" podUID="e5fd624a-2fa6-4887-83e0-779057846c71" containerName="registry-server" containerID="cri-o://405b8c5d9a4e0715743edb3ba54972440345c5dabbfc9ae56cb5cc61330344b1" gracePeriod=30 Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.045326 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-kcc74"] Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.133973 4773 generic.go:334] "Generic (PLEG): container finished" podID="074f367d-7a48-4046-a679-9a2d38111b8a" containerID="20afc5cca64346d7cedcc97043ec4130c07cbb4a3ad2fa66f0da171dcc8edd41" exitCode=0 Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.134083 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v75d6" event={"ID":"074f367d-7a48-4046-a679-9a2d38111b8a","Type":"ContainerDied","Data":"20afc5cca64346d7cedcc97043ec4130c07cbb4a3ad2fa66f0da171dcc8edd41"} Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.137103 4773 generic.go:334] "Generic (PLEG): container finished" podID="c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2" containerID="f73bf4550870764bd71e43342e3a9b1626bd2febed0bfc8cfab7c1eb1fcc0b07" exitCode=0 Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.137214 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w4bcd" 
event={"ID":"c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2","Type":"ContainerDied","Data":"f73bf4550870764bd71e43342e3a9b1626bd2febed0bfc8cfab7c1eb1fcc0b07"} Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.139731 4773 generic.go:334] "Generic (PLEG): container finished" podID="8923f3c0-0b58-4097-aa87-9df34cf90e41" containerID="042139096fac743c4bffa6cf49536d998523d940f380114bd1fdad3392d0743d" exitCode=0 Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.139815 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2qdpl" event={"ID":"8923f3c0-0b58-4097-aa87-9df34cf90e41","Type":"ContainerDied","Data":"042139096fac743c4bffa6cf49536d998523d940f380114bd1fdad3392d0743d"} Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.141075 4773 generic.go:334] "Generic (PLEG): container finished" podID="1181b3a9-8cf9-46ad-9b41-62d32ffe7a85" containerID="cb0111d26068d7530fc1ab48948aef101d8e352e9c86e33d1e7d039ec97d0ab3" exitCode=0 Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.141108 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-ff9dd" event={"ID":"1181b3a9-8cf9-46ad-9b41-62d32ffe7a85","Type":"ContainerDied","Data":"cb0111d26068d7530fc1ab48948aef101d8e352e9c86e33d1e7d039ec97d0ab3"} Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.191028 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pr6pn\" (UniqueName: \"kubernetes.io/projected/785e6f78-9a81-429e-8cad-f60275661e58-kube-api-access-pr6pn\") pod \"marketplace-operator-79b997595-kcc74\" (UID: \"785e6f78-9a81-429e-8cad-f60275661e58\") " pod="openshift-marketplace/marketplace-operator-79b997595-kcc74" Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.191106 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/785e6f78-9a81-429e-8cad-f60275661e58-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-kcc74\" (UID: \"785e6f78-9a81-429e-8cad-f60275661e58\") " pod="openshift-marketplace/marketplace-operator-79b997595-kcc74" Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.191138 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/785e6f78-9a81-429e-8cad-f60275661e58-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-kcc74\" (UID: \"785e6f78-9a81-429e-8cad-f60275661e58\") " pod="openshift-marketplace/marketplace-operator-79b997595-kcc74" Jan 20 18:36:39 crc kubenswrapper[4773]: E0120 18:36:39.261391 4773 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f73bf4550870764bd71e43342e3a9b1626bd2febed0bfc8cfab7c1eb1fcc0b07 is running failed: container process not found" containerID="f73bf4550870764bd71e43342e3a9b1626bd2febed0bfc8cfab7c1eb1fcc0b07" cmd=["grpc_health_probe","-addr=:50051"] Jan 20 18:36:39 crc kubenswrapper[4773]: E0120 18:36:39.261694 4773 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f73bf4550870764bd71e43342e3a9b1626bd2febed0bfc8cfab7c1eb1fcc0b07 is running failed: container process not found" containerID="f73bf4550870764bd71e43342e3a9b1626bd2febed0bfc8cfab7c1eb1fcc0b07" cmd=["grpc_health_probe","-addr=:50051"] Jan 20 18:36:39 crc kubenswrapper[4773]: E0120 18:36:39.262079 4773 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f73bf4550870764bd71e43342e3a9b1626bd2febed0bfc8cfab7c1eb1fcc0b07 is running failed: container process not found" 
containerID="f73bf4550870764bd71e43342e3a9b1626bd2febed0bfc8cfab7c1eb1fcc0b07" cmd=["grpc_health_probe","-addr=:50051"] Jan 20 18:36:39 crc kubenswrapper[4773]: E0120 18:36:39.262116 4773 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of f73bf4550870764bd71e43342e3a9b1626bd2febed0bfc8cfab7c1eb1fcc0b07 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-marketplace-w4bcd" podUID="c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2" containerName="registry-server" Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.292832 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pr6pn\" (UniqueName: \"kubernetes.io/projected/785e6f78-9a81-429e-8cad-f60275661e58-kube-api-access-pr6pn\") pod \"marketplace-operator-79b997595-kcc74\" (UID: \"785e6f78-9a81-429e-8cad-f60275661e58\") " pod="openshift-marketplace/marketplace-operator-79b997595-kcc74" Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.293206 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/785e6f78-9a81-429e-8cad-f60275661e58-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-kcc74\" (UID: \"785e6f78-9a81-429e-8cad-f60275661e58\") " pod="openshift-marketplace/marketplace-operator-79b997595-kcc74" Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.293235 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/785e6f78-9a81-429e-8cad-f60275661e58-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-kcc74\" (UID: \"785e6f78-9a81-429e-8cad-f60275661e58\") " pod="openshift-marketplace/marketplace-operator-79b997595-kcc74" Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.296140 4773 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/785e6f78-9a81-429e-8cad-f60275661e58-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-kcc74\" (UID: \"785e6f78-9a81-429e-8cad-f60275661e58\") " pod="openshift-marketplace/marketplace-operator-79b997595-kcc74" Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.299320 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/785e6f78-9a81-429e-8cad-f60275661e58-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-kcc74\" (UID: \"785e6f78-9a81-429e-8cad-f60275661e58\") " pod="openshift-marketplace/marketplace-operator-79b997595-kcc74" Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.314830 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pr6pn\" (UniqueName: \"kubernetes.io/projected/785e6f78-9a81-429e-8cad-f60275661e58-kube-api-access-pr6pn\") pod \"marketplace-operator-79b997595-kcc74\" (UID: \"785e6f78-9a81-429e-8cad-f60275661e58\") " pod="openshift-marketplace/marketplace-operator-79b997595-kcc74" Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.347830 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-kcc74" Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.397440 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-v75d6" Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.487988 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fm4ln" Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.500313 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-ff9dd" Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.503194 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w4bcd" Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.548346 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2qdpl" Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.595289 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vzmpk\" (UniqueName: \"kubernetes.io/projected/e5fd624a-2fa6-4887-83e0-779057846c71-kube-api-access-vzmpk\") pod \"e5fd624a-2fa6-4887-83e0-779057846c71\" (UID: \"e5fd624a-2fa6-4887-83e0-779057846c71\") " Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.595501 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fbj8l\" (UniqueName: \"kubernetes.io/projected/074f367d-7a48-4046-a679-9a2d38111b8a-kube-api-access-fbj8l\") pod \"074f367d-7a48-4046-a679-9a2d38111b8a\" (UID: \"074f367d-7a48-4046-a679-9a2d38111b8a\") " Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.595550 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5fd624a-2fa6-4887-83e0-779057846c71-catalog-content\") pod \"e5fd624a-2fa6-4887-83e0-779057846c71\" (UID: \"e5fd624a-2fa6-4887-83e0-779057846c71\") " Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.595575 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/074f367d-7a48-4046-a679-9a2d38111b8a-utilities\") pod \"074f367d-7a48-4046-a679-9a2d38111b8a\" (UID: \"074f367d-7a48-4046-a679-9a2d38111b8a\") " Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.595641 4773 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/074f367d-7a48-4046-a679-9a2d38111b8a-catalog-content\") pod \"074f367d-7a48-4046-a679-9a2d38111b8a\" (UID: \"074f367d-7a48-4046-a679-9a2d38111b8a\") " Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.595661 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5fd624a-2fa6-4887-83e0-779057846c71-utilities\") pod \"e5fd624a-2fa6-4887-83e0-779057846c71\" (UID: \"e5fd624a-2fa6-4887-83e0-779057846c71\") " Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.596911 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5fd624a-2fa6-4887-83e0-779057846c71-utilities" (OuterVolumeSpecName: "utilities") pod "e5fd624a-2fa6-4887-83e0-779057846c71" (UID: "e5fd624a-2fa6-4887-83e0-779057846c71"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.597893 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/074f367d-7a48-4046-a679-9a2d38111b8a-utilities" (OuterVolumeSpecName: "utilities") pod "074f367d-7a48-4046-a679-9a2d38111b8a" (UID: "074f367d-7a48-4046-a679-9a2d38111b8a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.598698 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/074f367d-7a48-4046-a679-9a2d38111b8a-kube-api-access-fbj8l" (OuterVolumeSpecName: "kube-api-access-fbj8l") pod "074f367d-7a48-4046-a679-9a2d38111b8a" (UID: "074f367d-7a48-4046-a679-9a2d38111b8a"). InnerVolumeSpecName "kube-api-access-fbj8l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.600710 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5fd624a-2fa6-4887-83e0-779057846c71-kube-api-access-vzmpk" (OuterVolumeSpecName: "kube-api-access-vzmpk") pod "e5fd624a-2fa6-4887-83e0-779057846c71" (UID: "e5fd624a-2fa6-4887-83e0-779057846c71"). InnerVolumeSpecName "kube-api-access-vzmpk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.647435 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/074f367d-7a48-4046-a679-9a2d38111b8a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "074f367d-7a48-4046-a679-9a2d38111b8a" (UID: "074f367d-7a48-4046-a679-9a2d38111b8a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.696841 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dppcc\" (UniqueName: \"kubernetes.io/projected/c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2-kube-api-access-dppcc\") pod \"c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2\" (UID: \"c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2\") " Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.696892 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1181b3a9-8cf9-46ad-9b41-62d32ffe7a85-marketplace-operator-metrics\") pod \"1181b3a9-8cf9-46ad-9b41-62d32ffe7a85\" (UID: \"1181b3a9-8cf9-46ad-9b41-62d32ffe7a85\") " Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.697310 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2-catalog-content\") pod 
\"c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2\" (UID: \"c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2\") " Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.697370 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfrvq\" (UniqueName: \"kubernetes.io/projected/1181b3a9-8cf9-46ad-9b41-62d32ffe7a85-kube-api-access-kfrvq\") pod \"1181b3a9-8cf9-46ad-9b41-62d32ffe7a85\" (UID: \"1181b3a9-8cf9-46ad-9b41-62d32ffe7a85\") " Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.697404 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8923f3c0-0b58-4097-aa87-9df34cf90e41-utilities\") pod \"8923f3c0-0b58-4097-aa87-9df34cf90e41\" (UID: \"8923f3c0-0b58-4097-aa87-9df34cf90e41\") " Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.697483 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2-utilities\") pod \"c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2\" (UID: \"c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2\") " Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.697512 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8923f3c0-0b58-4097-aa87-9df34cf90e41-catalog-content\") pod \"8923f3c0-0b58-4097-aa87-9df34cf90e41\" (UID: \"8923f3c0-0b58-4097-aa87-9df34cf90e41\") " Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.698399 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2-utilities" (OuterVolumeSpecName: "utilities") pod "c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2" (UID: "c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.699335 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8923f3c0-0b58-4097-aa87-9df34cf90e41-utilities" (OuterVolumeSpecName: "utilities") pod "8923f3c0-0b58-4097-aa87-9df34cf90e41" (UID: "8923f3c0-0b58-4097-aa87-9df34cf90e41"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.699738 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2-kube-api-access-dppcc" (OuterVolumeSpecName: "kube-api-access-dppcc") pod "c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2" (UID: "c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2"). InnerVolumeSpecName "kube-api-access-dppcc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.699980 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1181b3a9-8cf9-46ad-9b41-62d32ffe7a85-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "1181b3a9-8cf9-46ad-9b41-62d32ffe7a85" (UID: "1181b3a9-8cf9-46ad-9b41-62d32ffe7a85"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.701368 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8923f3c0-0b58-4097-aa87-9df34cf90e41-kube-api-access-hngfz" (OuterVolumeSpecName: "kube-api-access-hngfz") pod "8923f3c0-0b58-4097-aa87-9df34cf90e41" (UID: "8923f3c0-0b58-4097-aa87-9df34cf90e41"). InnerVolumeSpecName "kube-api-access-hngfz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.703257 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1181b3a9-8cf9-46ad-9b41-62d32ffe7a85-kube-api-access-kfrvq" (OuterVolumeSpecName: "kube-api-access-kfrvq") pod "1181b3a9-8cf9-46ad-9b41-62d32ffe7a85" (UID: "1181b3a9-8cf9-46ad-9b41-62d32ffe7a85"). InnerVolumeSpecName "kube-api-access-kfrvq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.703383 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hngfz\" (UniqueName: \"kubernetes.io/projected/8923f3c0-0b58-4097-aa87-9df34cf90e41-kube-api-access-hngfz\") pod \"8923f3c0-0b58-4097-aa87-9df34cf90e41\" (UID: \"8923f3c0-0b58-4097-aa87-9df34cf90e41\") " Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.703429 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1181b3a9-8cf9-46ad-9b41-62d32ffe7a85-marketplace-trusted-ca\") pod \"1181b3a9-8cf9-46ad-9b41-62d32ffe7a85\" (UID: \"1181b3a9-8cf9-46ad-9b41-62d32ffe7a85\") " Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.704005 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fbj8l\" (UniqueName: \"kubernetes.io/projected/074f367d-7a48-4046-a679-9a2d38111b8a-kube-api-access-fbj8l\") on node \"crc\" DevicePath \"\"" Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.704031 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/074f367d-7a48-4046-a679-9a2d38111b8a-utilities\") on node \"crc\" DevicePath \"\"" Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.704040 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2-utilities\") on node \"crc\" DevicePath \"\"" Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.704048 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hngfz\" (UniqueName: \"kubernetes.io/projected/8923f3c0-0b58-4097-aa87-9df34cf90e41-kube-api-access-hngfz\") on node \"crc\" DevicePath \"\"" Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.704057 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/074f367d-7a48-4046-a679-9a2d38111b8a-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.704066 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e5fd624a-2fa6-4887-83e0-779057846c71-utilities\") on node \"crc\" DevicePath \"\"" Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.704075 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vzmpk\" (UniqueName: \"kubernetes.io/projected/e5fd624a-2fa6-4887-83e0-779057846c71-kube-api-access-vzmpk\") on node \"crc\" DevicePath \"\"" Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.704085 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dppcc\" (UniqueName: \"kubernetes.io/projected/c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2-kube-api-access-dppcc\") on node \"crc\" DevicePath \"\"" Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.704094 4773 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1181b3a9-8cf9-46ad-9b41-62d32ffe7a85-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.704104 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfrvq\" (UniqueName: 
\"kubernetes.io/projected/1181b3a9-8cf9-46ad-9b41-62d32ffe7a85-kube-api-access-kfrvq\") on node \"crc\" DevicePath \"\"" Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.704112 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8923f3c0-0b58-4097-aa87-9df34cf90e41-utilities\") on node \"crc\" DevicePath \"\"" Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.704003 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1181b3a9-8cf9-46ad-9b41-62d32ffe7a85-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "1181b3a9-8cf9-46ad-9b41-62d32ffe7a85" (UID: "1181b3a9-8cf9-46ad-9b41-62d32ffe7a85"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.719554 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2" (UID: "c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.720353 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e5fd624a-2fa6-4887-83e0-779057846c71-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e5fd624a-2fa6-4887-83e0-779057846c71" (UID: "e5fd624a-2fa6-4887-83e0-779057846c71"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.749741 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8923f3c0-0b58-4097-aa87-9df34cf90e41-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8923f3c0-0b58-4097-aa87-9df34cf90e41" (UID: "8923f3c0-0b58-4097-aa87-9df34cf90e41"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.807431 4773 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1181b3a9-8cf9-46ad-9b41-62d32ffe7a85-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.807469 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.807479 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e5fd624a-2fa6-4887-83e0-779057846c71-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.807487 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8923f3c0-0b58-4097-aa87-9df34cf90e41-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 20 18:36:39 crc kubenswrapper[4773]: I0120 18:36:39.814498 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-kcc74"] Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.147264 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-kcc74" 
event={"ID":"785e6f78-9a81-429e-8cad-f60275661e58","Type":"ContainerStarted","Data":"de423276dc21445682b64e1f6484fa7b1373ba1de2188b140b0b53670942a901"} Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.147645 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-kcc74" event={"ID":"785e6f78-9a81-429e-8cad-f60275661e58","Type":"ContainerStarted","Data":"7d6db5ee2e2391fe0aa6f228d1a5ee2222dbdff05012c5d45d6ddb93ffe1ce17"} Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.147663 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-kcc74" Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.148960 4773 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-kcc74 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.66:8080/healthz\": dial tcp 10.217.0.66:8080: connect: connection refused" start-of-body= Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.149004 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-kcc74" podUID="785e6f78-9a81-429e-8cad-f60275661e58" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.66:8080/healthz\": dial tcp 10.217.0.66:8080: connect: connection refused" Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.149591 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-ff9dd" Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.150965 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-ff9dd" event={"ID":"1181b3a9-8cf9-46ad-9b41-62d32ffe7a85","Type":"ContainerDied","Data":"f4d91eb42c30324decc0123b0752b77625e1bfc343e356223cf0e111b47451d8"} Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.151002 4773 scope.go:117] "RemoveContainer" containerID="cb0111d26068d7530fc1ab48948aef101d8e352e9c86e33d1e7d039ec97d0ab3" Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.153501 4773 generic.go:334] "Generic (PLEG): container finished" podID="e5fd624a-2fa6-4887-83e0-779057846c71" containerID="405b8c5d9a4e0715743edb3ba54972440345c5dabbfc9ae56cb5cc61330344b1" exitCode=0 Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.153553 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fm4ln" event={"ID":"e5fd624a-2fa6-4887-83e0-779057846c71","Type":"ContainerDied","Data":"405b8c5d9a4e0715743edb3ba54972440345c5dabbfc9ae56cb5cc61330344b1"} Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.153617 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fm4ln" event={"ID":"e5fd624a-2fa6-4887-83e0-779057846c71","Type":"ContainerDied","Data":"d8d891af7a0ae24346edc8a46d303844186b0bedf95160cce185884dab78b333"} Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.153572 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fm4ln" Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.157014 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-v75d6" Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.157013 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v75d6" event={"ID":"074f367d-7a48-4046-a679-9a2d38111b8a","Type":"ContainerDied","Data":"f57839f90df36cf23471ecda170b0c2440316e257ee6cca520a35c728d5b16de"} Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.159984 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-w4bcd" event={"ID":"c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2","Type":"ContainerDied","Data":"259bcd091e35535624126a0c051a63c6b1167732a3459de7fa71f75dfa10b627"} Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.159999 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-w4bcd" Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.164173 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2qdpl" event={"ID":"8923f3c0-0b58-4097-aa87-9df34cf90e41","Type":"ContainerDied","Data":"47261c669f243247e3360eb031003a9925a21eac9889414fbe72f5ed85389a71"} Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.164221 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2qdpl" Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.168723 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-kcc74" podStartSLOduration=2.168700018 podStartE2EDuration="2.168700018s" podCreationTimestamp="2026-01-20 18:36:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:36:40.164378626 +0000 UTC m=+393.086191670" watchObservedRunningTime="2026-01-20 18:36:40.168700018 +0000 UTC m=+393.090513042" Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.177520 4773 scope.go:117] "RemoveContainer" containerID="405b8c5d9a4e0715743edb3ba54972440345c5dabbfc9ae56cb5cc61330344b1" Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.198822 4773 scope.go:117] "RemoveContainer" containerID="3e00fbd0f11ff917989561c509f8b12319467494c08116a3e2715bb0829e11db" Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.202674 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2qdpl"] Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.214873 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-2qdpl"] Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.228255 4773 scope.go:117] "RemoveContainer" containerID="bfd0647c31aa207aa6af51a11c6586e941d07ba3e5306f8224749c043019abee" Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.230369 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-fm4ln"] Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.244569 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-fm4ln"] Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.247886 4773 scope.go:117] "RemoveContainer" 
containerID="405b8c5d9a4e0715743edb3ba54972440345c5dabbfc9ae56cb5cc61330344b1" Jan 20 18:36:40 crc kubenswrapper[4773]: E0120 18:36:40.249092 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"405b8c5d9a4e0715743edb3ba54972440345c5dabbfc9ae56cb5cc61330344b1\": container with ID starting with 405b8c5d9a4e0715743edb3ba54972440345c5dabbfc9ae56cb5cc61330344b1 not found: ID does not exist" containerID="405b8c5d9a4e0715743edb3ba54972440345c5dabbfc9ae56cb5cc61330344b1" Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.249136 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"405b8c5d9a4e0715743edb3ba54972440345c5dabbfc9ae56cb5cc61330344b1"} err="failed to get container status \"405b8c5d9a4e0715743edb3ba54972440345c5dabbfc9ae56cb5cc61330344b1\": rpc error: code = NotFound desc = could not find container \"405b8c5d9a4e0715743edb3ba54972440345c5dabbfc9ae56cb5cc61330344b1\": container with ID starting with 405b8c5d9a4e0715743edb3ba54972440345c5dabbfc9ae56cb5cc61330344b1 not found: ID does not exist" Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.249163 4773 scope.go:117] "RemoveContainer" containerID="3e00fbd0f11ff917989561c509f8b12319467494c08116a3e2715bb0829e11db" Jan 20 18:36:40 crc kubenswrapper[4773]: E0120 18:36:40.249459 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3e00fbd0f11ff917989561c509f8b12319467494c08116a3e2715bb0829e11db\": container with ID starting with 3e00fbd0f11ff917989561c509f8b12319467494c08116a3e2715bb0829e11db not found: ID does not exist" containerID="3e00fbd0f11ff917989561c509f8b12319467494c08116a3e2715bb0829e11db" Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.249475 4773 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"3e00fbd0f11ff917989561c509f8b12319467494c08116a3e2715bb0829e11db"} err="failed to get container status \"3e00fbd0f11ff917989561c509f8b12319467494c08116a3e2715bb0829e11db\": rpc error: code = NotFound desc = could not find container \"3e00fbd0f11ff917989561c509f8b12319467494c08116a3e2715bb0829e11db\": container with ID starting with 3e00fbd0f11ff917989561c509f8b12319467494c08116a3e2715bb0829e11db not found: ID does not exist" Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.249488 4773 scope.go:117] "RemoveContainer" containerID="bfd0647c31aa207aa6af51a11c6586e941d07ba3e5306f8224749c043019abee" Jan 20 18:36:40 crc kubenswrapper[4773]: E0120 18:36:40.250330 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bfd0647c31aa207aa6af51a11c6586e941d07ba3e5306f8224749c043019abee\": container with ID starting with bfd0647c31aa207aa6af51a11c6586e941d07ba3e5306f8224749c043019abee not found: ID does not exist" containerID="bfd0647c31aa207aa6af51a11c6586e941d07ba3e5306f8224749c043019abee" Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.250363 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bfd0647c31aa207aa6af51a11c6586e941d07ba3e5306f8224749c043019abee"} err="failed to get container status \"bfd0647c31aa207aa6af51a11c6586e941d07ba3e5306f8224749c043019abee\": rpc error: code = NotFound desc = could not find container \"bfd0647c31aa207aa6af51a11c6586e941d07ba3e5306f8224749c043019abee\": container with ID starting with bfd0647c31aa207aa6af51a11c6586e941d07ba3e5306f8224749c043019abee not found: ID does not exist" Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.250386 4773 scope.go:117] "RemoveContainer" containerID="20afc5cca64346d7cedcc97043ec4130c07cbb4a3ad2fa66f0da171dcc8edd41" Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.254491 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-v75d6"] Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.260452 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-v75d6"] Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.266830 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-ff9dd"] Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.272283 4773 scope.go:117] "RemoveContainer" containerID="639cb92015efe00db0ab47ee3303403c16b478767a9030340d303516dfaf2e8d" Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.272865 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-ff9dd"] Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.282419 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-w4bcd"] Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.285073 4773 scope.go:117] "RemoveContainer" containerID="1f220958033235a6f9fe2c2b2ebf17e5764f53dc8958a6ac265dd5f47e11eb7e" Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.285463 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-w4bcd"] Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.304913 4773 scope.go:117] "RemoveContainer" containerID="f73bf4550870764bd71e43342e3a9b1626bd2febed0bfc8cfab7c1eb1fcc0b07" Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.321192 4773 scope.go:117] "RemoveContainer" containerID="93be4aeabcc19519cd8451ce33f2117a534a92e9ae0b9b81378e69932d400b91" Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.339681 4773 scope.go:117] "RemoveContainer" containerID="ca09fdf16fb11b2b53675f7f94bd271507c5f87bdae39648e8c76e9ebf18f6ca" Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.365163 4773 scope.go:117] "RemoveContainer" 
containerID="042139096fac743c4bffa6cf49536d998523d940f380114bd1fdad3392d0743d" Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.380429 4773 scope.go:117] "RemoveContainer" containerID="1f287c2574683f5354d74f9901af35491b27963de1526d87ad8eff7eb251368c" Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.395347 4773 scope.go:117] "RemoveContainer" containerID="2fb65d95b1dd9e1def202549dcf0c536be64e92ad04a8874773fb7a70a7be1b9" Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.783476 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-lnll4"] Jan 20 18:36:40 crc kubenswrapper[4773]: E0120 18:36:40.784106 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5fd624a-2fa6-4887-83e0-779057846c71" containerName="extract-utilities" Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.784130 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5fd624a-2fa6-4887-83e0-779057846c71" containerName="extract-utilities" Jan 20 18:36:40 crc kubenswrapper[4773]: E0120 18:36:40.784143 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1181b3a9-8cf9-46ad-9b41-62d32ffe7a85" containerName="marketplace-operator" Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.784151 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="1181b3a9-8cf9-46ad-9b41-62d32ffe7a85" containerName="marketplace-operator" Jan 20 18:36:40 crc kubenswrapper[4773]: E0120 18:36:40.784161 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2" containerName="extract-utilities" Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.784169 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2" containerName="extract-utilities" Jan 20 18:36:40 crc kubenswrapper[4773]: E0120 18:36:40.784180 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="074f367d-7a48-4046-a679-9a2d38111b8a" 
containerName="extract-utilities" Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.784188 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="074f367d-7a48-4046-a679-9a2d38111b8a" containerName="extract-utilities" Jan 20 18:36:40 crc kubenswrapper[4773]: E0120 18:36:40.784200 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5fd624a-2fa6-4887-83e0-779057846c71" containerName="extract-content" Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.784208 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5fd624a-2fa6-4887-83e0-779057846c71" containerName="extract-content" Jan 20 18:36:40 crc kubenswrapper[4773]: E0120 18:36:40.784218 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="074f367d-7a48-4046-a679-9a2d38111b8a" containerName="extract-content" Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.784226 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="074f367d-7a48-4046-a679-9a2d38111b8a" containerName="extract-content" Jan 20 18:36:40 crc kubenswrapper[4773]: E0120 18:36:40.784240 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8923f3c0-0b58-4097-aa87-9df34cf90e41" containerName="extract-content" Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.784248 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="8923f3c0-0b58-4097-aa87-9df34cf90e41" containerName="extract-content" Jan 20 18:36:40 crc kubenswrapper[4773]: E0120 18:36:40.784256 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2" containerName="extract-content" Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.784263 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2" containerName="extract-content" Jan 20 18:36:40 crc kubenswrapper[4773]: E0120 18:36:40.784277 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8923f3c0-0b58-4097-aa87-9df34cf90e41" 
containerName="extract-utilities" Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.784285 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="8923f3c0-0b58-4097-aa87-9df34cf90e41" containerName="extract-utilities" Jan 20 18:36:40 crc kubenswrapper[4773]: E0120 18:36:40.784294 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8923f3c0-0b58-4097-aa87-9df34cf90e41" containerName="registry-server" Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.784301 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="8923f3c0-0b58-4097-aa87-9df34cf90e41" containerName="registry-server" Jan 20 18:36:40 crc kubenswrapper[4773]: E0120 18:36:40.784311 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="074f367d-7a48-4046-a679-9a2d38111b8a" containerName="registry-server" Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.784318 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="074f367d-7a48-4046-a679-9a2d38111b8a" containerName="registry-server" Jan 20 18:36:40 crc kubenswrapper[4773]: E0120 18:36:40.784327 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5fd624a-2fa6-4887-83e0-779057846c71" containerName="registry-server" Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.784335 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5fd624a-2fa6-4887-83e0-779057846c71" containerName="registry-server" Jan 20 18:36:40 crc kubenswrapper[4773]: E0120 18:36:40.784348 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2" containerName="registry-server" Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.784355 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2" containerName="registry-server" Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.784483 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="1181b3a9-8cf9-46ad-9b41-62d32ffe7a85" 
containerName="marketplace-operator" Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.784495 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="074f367d-7a48-4046-a679-9a2d38111b8a" containerName="registry-server" Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.784505 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2" containerName="registry-server" Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.784513 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="8923f3c0-0b58-4097-aa87-9df34cf90e41" containerName="registry-server" Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.784519 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5fd624a-2fa6-4887-83e0-779057846c71" containerName="registry-server" Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.785809 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lnll4" Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.788090 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.792082 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lnll4"] Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.834227 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-js59s\" (UniqueName: \"kubernetes.io/projected/5da64480-a8e7-4ab9-b438-dfe067f94091-kube-api-access-js59s\") pod \"certified-operators-lnll4\" (UID: \"5da64480-a8e7-4ab9-b438-dfe067f94091\") " pod="openshift-marketplace/certified-operators-lnll4" Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.834303 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5da64480-a8e7-4ab9-b438-dfe067f94091-catalog-content\") pod \"certified-operators-lnll4\" (UID: \"5da64480-a8e7-4ab9-b438-dfe067f94091\") " pod="openshift-marketplace/certified-operators-lnll4" Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.834411 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5da64480-a8e7-4ab9-b438-dfe067f94091-utilities\") pod \"certified-operators-lnll4\" (UID: \"5da64480-a8e7-4ab9-b438-dfe067f94091\") " pod="openshift-marketplace/certified-operators-lnll4" Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.936084 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5da64480-a8e7-4ab9-b438-dfe067f94091-utilities\") pod \"certified-operators-lnll4\" (UID: \"5da64480-a8e7-4ab9-b438-dfe067f94091\") " pod="openshift-marketplace/certified-operators-lnll4" Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.936176 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-js59s\" (UniqueName: \"kubernetes.io/projected/5da64480-a8e7-4ab9-b438-dfe067f94091-kube-api-access-js59s\") pod \"certified-operators-lnll4\" (UID: \"5da64480-a8e7-4ab9-b438-dfe067f94091\") " pod="openshift-marketplace/certified-operators-lnll4" Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.936214 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5da64480-a8e7-4ab9-b438-dfe067f94091-catalog-content\") pod \"certified-operators-lnll4\" (UID: \"5da64480-a8e7-4ab9-b438-dfe067f94091\") " pod="openshift-marketplace/certified-operators-lnll4" Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.937032 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5da64480-a8e7-4ab9-b438-dfe067f94091-catalog-content\") pod \"certified-operators-lnll4\" (UID: \"5da64480-a8e7-4ab9-b438-dfe067f94091\") " pod="openshift-marketplace/certified-operators-lnll4" Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.937050 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5da64480-a8e7-4ab9-b438-dfe067f94091-utilities\") pod \"certified-operators-lnll4\" (UID: \"5da64480-a8e7-4ab9-b438-dfe067f94091\") " pod="openshift-marketplace/certified-operators-lnll4" Jan 20 18:36:40 crc kubenswrapper[4773]: I0120 18:36:40.959238 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-js59s\" (UniqueName: \"kubernetes.io/projected/5da64480-a8e7-4ab9-b438-dfe067f94091-kube-api-access-js59s\") pod \"certified-operators-lnll4\" (UID: \"5da64480-a8e7-4ab9-b438-dfe067f94091\") " pod="openshift-marketplace/certified-operators-lnll4" Jan 20 18:36:41 crc kubenswrapper[4773]: I0120 18:36:41.133354 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-lnll4" Jan 20 18:36:41 crc kubenswrapper[4773]: I0120 18:36:41.181010 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-kcc74" Jan 20 18:36:41 crc kubenswrapper[4773]: I0120 18:36:41.345197 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lnll4"] Jan 20 18:36:41 crc kubenswrapper[4773]: W0120 18:36:41.368349 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5da64480_a8e7_4ab9_b438_dfe067f94091.slice/crio-d4c0b8b1d14beca59598a88d65621e676143c836faa0137fb77d64ac6458284c WatchSource:0}: Error finding container d4c0b8b1d14beca59598a88d65621e676143c836faa0137fb77d64ac6458284c: Status 404 returned error can't find the container with id d4c0b8b1d14beca59598a88d65621e676143c836faa0137fb77d64ac6458284c Jan 20 18:36:41 crc kubenswrapper[4773]: I0120 18:36:41.454578 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="074f367d-7a48-4046-a679-9a2d38111b8a" path="/var/lib/kubelet/pods/074f367d-7a48-4046-a679-9a2d38111b8a/volumes" Jan 20 18:36:41 crc kubenswrapper[4773]: I0120 18:36:41.455296 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1181b3a9-8cf9-46ad-9b41-62d32ffe7a85" path="/var/lib/kubelet/pods/1181b3a9-8cf9-46ad-9b41-62d32ffe7a85/volumes" Jan 20 18:36:41 crc kubenswrapper[4773]: I0120 18:36:41.455732 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8923f3c0-0b58-4097-aa87-9df34cf90e41" path="/var/lib/kubelet/pods/8923f3c0-0b58-4097-aa87-9df34cf90e41/volumes" Jan 20 18:36:41 crc kubenswrapper[4773]: I0120 18:36:41.456776 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2" path="/var/lib/kubelet/pods/c9e2e310-70dc-4eb9-94f3-2e4466e2b7d2/volumes" Jan 20 
18:36:41 crc kubenswrapper[4773]: I0120 18:36:41.457619 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5fd624a-2fa6-4887-83e0-779057846c71" path="/var/lib/kubelet/pods/e5fd624a-2fa6-4887-83e0-779057846c71/volumes" Jan 20 18:36:41 crc kubenswrapper[4773]: I0120 18:36:41.779363 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-wdwbg"] Jan 20 18:36:41 crc kubenswrapper[4773]: I0120 18:36:41.780309 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wdwbg" Jan 20 18:36:41 crc kubenswrapper[4773]: I0120 18:36:41.783369 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Jan 20 18:36:41 crc kubenswrapper[4773]: I0120 18:36:41.802351 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wdwbg"] Jan 20 18:36:41 crc kubenswrapper[4773]: I0120 18:36:41.847727 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2n8jw\" (UniqueName: \"kubernetes.io/projected/379f8421-1b6c-45c5-ae56-051b42ff6410-kube-api-access-2n8jw\") pod \"redhat-marketplace-wdwbg\" (UID: \"379f8421-1b6c-45c5-ae56-051b42ff6410\") " pod="openshift-marketplace/redhat-marketplace-wdwbg" Jan 20 18:36:41 crc kubenswrapper[4773]: I0120 18:36:41.847815 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/379f8421-1b6c-45c5-ae56-051b42ff6410-utilities\") pod \"redhat-marketplace-wdwbg\" (UID: \"379f8421-1b6c-45c5-ae56-051b42ff6410\") " pod="openshift-marketplace/redhat-marketplace-wdwbg" Jan 20 18:36:41 crc kubenswrapper[4773]: I0120 18:36:41.847861 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/379f8421-1b6c-45c5-ae56-051b42ff6410-catalog-content\") pod \"redhat-marketplace-wdwbg\" (UID: \"379f8421-1b6c-45c5-ae56-051b42ff6410\") " pod="openshift-marketplace/redhat-marketplace-wdwbg" Jan 20 18:36:41 crc kubenswrapper[4773]: I0120 18:36:41.949682 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2n8jw\" (UniqueName: \"kubernetes.io/projected/379f8421-1b6c-45c5-ae56-051b42ff6410-kube-api-access-2n8jw\") pod \"redhat-marketplace-wdwbg\" (UID: \"379f8421-1b6c-45c5-ae56-051b42ff6410\") " pod="openshift-marketplace/redhat-marketplace-wdwbg" Jan 20 18:36:41 crc kubenswrapper[4773]: I0120 18:36:41.949736 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/379f8421-1b6c-45c5-ae56-051b42ff6410-utilities\") pod \"redhat-marketplace-wdwbg\" (UID: \"379f8421-1b6c-45c5-ae56-051b42ff6410\") " pod="openshift-marketplace/redhat-marketplace-wdwbg" Jan 20 18:36:41 crc kubenswrapper[4773]: I0120 18:36:41.949766 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/379f8421-1b6c-45c5-ae56-051b42ff6410-catalog-content\") pod \"redhat-marketplace-wdwbg\" (UID: \"379f8421-1b6c-45c5-ae56-051b42ff6410\") " pod="openshift-marketplace/redhat-marketplace-wdwbg" Jan 20 18:36:41 crc kubenswrapper[4773]: I0120 18:36:41.950521 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/379f8421-1b6c-45c5-ae56-051b42ff6410-catalog-content\") pod \"redhat-marketplace-wdwbg\" (UID: \"379f8421-1b6c-45c5-ae56-051b42ff6410\") " pod="openshift-marketplace/redhat-marketplace-wdwbg" Jan 20 18:36:41 crc kubenswrapper[4773]: I0120 18:36:41.950565 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/379f8421-1b6c-45c5-ae56-051b42ff6410-utilities\") pod \"redhat-marketplace-wdwbg\" (UID: \"379f8421-1b6c-45c5-ae56-051b42ff6410\") " pod="openshift-marketplace/redhat-marketplace-wdwbg" Jan 20 18:36:41 crc kubenswrapper[4773]: I0120 18:36:41.973068 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2n8jw\" (UniqueName: \"kubernetes.io/projected/379f8421-1b6c-45c5-ae56-051b42ff6410-kube-api-access-2n8jw\") pod \"redhat-marketplace-wdwbg\" (UID: \"379f8421-1b6c-45c5-ae56-051b42ff6410\") " pod="openshift-marketplace/redhat-marketplace-wdwbg" Jan 20 18:36:42 crc kubenswrapper[4773]: I0120 18:36:42.109308 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wdwbg" Jan 20 18:36:42 crc kubenswrapper[4773]: I0120 18:36:42.187391 4773 generic.go:334] "Generic (PLEG): container finished" podID="5da64480-a8e7-4ab9-b438-dfe067f94091" containerID="d9f7e3449020892e7ad4a475051e5957ae64148987fa051d3306ccdce3086869" exitCode=0 Jan 20 18:36:42 crc kubenswrapper[4773]: I0120 18:36:42.187541 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lnll4" event={"ID":"5da64480-a8e7-4ab9-b438-dfe067f94091","Type":"ContainerDied","Data":"d9f7e3449020892e7ad4a475051e5957ae64148987fa051d3306ccdce3086869"} Jan 20 18:36:42 crc kubenswrapper[4773]: I0120 18:36:42.187596 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lnll4" event={"ID":"5da64480-a8e7-4ab9-b438-dfe067f94091","Type":"ContainerStarted","Data":"d4c0b8b1d14beca59598a88d65621e676143c836faa0137fb77d64ac6458284c"} Jan 20 18:36:42 crc kubenswrapper[4773]: I0120 18:36:42.523673 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wdwbg"] Jan 20 18:36:43 crc kubenswrapper[4773]: I0120 18:36:43.175322 4773 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-operators-r24nn"] Jan 20 18:36:43 crc kubenswrapper[4773]: I0120 18:36:43.178401 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-r24nn" Jan 20 18:36:43 crc kubenswrapper[4773]: I0120 18:36:43.183666 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Jan 20 18:36:43 crc kubenswrapper[4773]: I0120 18:36:43.186454 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-r24nn"] Jan 20 18:36:43 crc kubenswrapper[4773]: I0120 18:36:43.201799 4773 generic.go:334] "Generic (PLEG): container finished" podID="379f8421-1b6c-45c5-ae56-051b42ff6410" containerID="4a23108bb87b7b34f2c6a6518788bcca556bbe160ca79537ca82165cbb3dfb8f" exitCode=0 Jan 20 18:36:43 crc kubenswrapper[4773]: I0120 18:36:43.201856 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wdwbg" event={"ID":"379f8421-1b6c-45c5-ae56-051b42ff6410","Type":"ContainerDied","Data":"4a23108bb87b7b34f2c6a6518788bcca556bbe160ca79537ca82165cbb3dfb8f"} Jan 20 18:36:43 crc kubenswrapper[4773]: I0120 18:36:43.201879 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wdwbg" event={"ID":"379f8421-1b6c-45c5-ae56-051b42ff6410","Type":"ContainerStarted","Data":"250c3a8173a2d1bf9e0b3726862b2b3ea37f90f03a172273b24b5f16a6378d3c"} Jan 20 18:36:43 crc kubenswrapper[4773]: I0120 18:36:43.209088 4773 generic.go:334] "Generic (PLEG): container finished" podID="5da64480-a8e7-4ab9-b438-dfe067f94091" containerID="07365db81d19a57af32492ba798cf44af6edc2c8c1a6bd0c2614bc04e6e9066a" exitCode=0 Jan 20 18:36:43 crc kubenswrapper[4773]: I0120 18:36:43.209143 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lnll4" 
event={"ID":"5da64480-a8e7-4ab9-b438-dfe067f94091","Type":"ContainerDied","Data":"07365db81d19a57af32492ba798cf44af6edc2c8c1a6bd0c2614bc04e6e9066a"} Jan 20 18:36:43 crc kubenswrapper[4773]: I0120 18:36:43.266354 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ckqw\" (UniqueName: \"kubernetes.io/projected/7962399c-d4d0-44f1-a788-bd4cb5a758d7-kube-api-access-8ckqw\") pod \"redhat-operators-r24nn\" (UID: \"7962399c-d4d0-44f1-a788-bd4cb5a758d7\") " pod="openshift-marketplace/redhat-operators-r24nn" Jan 20 18:36:43 crc kubenswrapper[4773]: I0120 18:36:43.266398 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7962399c-d4d0-44f1-a788-bd4cb5a758d7-catalog-content\") pod \"redhat-operators-r24nn\" (UID: \"7962399c-d4d0-44f1-a788-bd4cb5a758d7\") " pod="openshift-marketplace/redhat-operators-r24nn" Jan 20 18:36:43 crc kubenswrapper[4773]: I0120 18:36:43.266441 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7962399c-d4d0-44f1-a788-bd4cb5a758d7-utilities\") pod \"redhat-operators-r24nn\" (UID: \"7962399c-d4d0-44f1-a788-bd4cb5a758d7\") " pod="openshift-marketplace/redhat-operators-r24nn" Jan 20 18:36:43 crc kubenswrapper[4773]: I0120 18:36:43.367447 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ckqw\" (UniqueName: \"kubernetes.io/projected/7962399c-d4d0-44f1-a788-bd4cb5a758d7-kube-api-access-8ckqw\") pod \"redhat-operators-r24nn\" (UID: \"7962399c-d4d0-44f1-a788-bd4cb5a758d7\") " pod="openshift-marketplace/redhat-operators-r24nn" Jan 20 18:36:43 crc kubenswrapper[4773]: I0120 18:36:43.367495 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/7962399c-d4d0-44f1-a788-bd4cb5a758d7-catalog-content\") pod \"redhat-operators-r24nn\" (UID: \"7962399c-d4d0-44f1-a788-bd4cb5a758d7\") " pod="openshift-marketplace/redhat-operators-r24nn" Jan 20 18:36:43 crc kubenswrapper[4773]: I0120 18:36:43.367530 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7962399c-d4d0-44f1-a788-bd4cb5a758d7-utilities\") pod \"redhat-operators-r24nn\" (UID: \"7962399c-d4d0-44f1-a788-bd4cb5a758d7\") " pod="openshift-marketplace/redhat-operators-r24nn" Jan 20 18:36:43 crc kubenswrapper[4773]: I0120 18:36:43.367952 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7962399c-d4d0-44f1-a788-bd4cb5a758d7-utilities\") pod \"redhat-operators-r24nn\" (UID: \"7962399c-d4d0-44f1-a788-bd4cb5a758d7\") " pod="openshift-marketplace/redhat-operators-r24nn" Jan 20 18:36:43 crc kubenswrapper[4773]: I0120 18:36:43.368384 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7962399c-d4d0-44f1-a788-bd4cb5a758d7-catalog-content\") pod \"redhat-operators-r24nn\" (UID: \"7962399c-d4d0-44f1-a788-bd4cb5a758d7\") " pod="openshift-marketplace/redhat-operators-r24nn" Jan 20 18:36:43 crc kubenswrapper[4773]: I0120 18:36:43.391247 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ckqw\" (UniqueName: \"kubernetes.io/projected/7962399c-d4d0-44f1-a788-bd4cb5a758d7-kube-api-access-8ckqw\") pod \"redhat-operators-r24nn\" (UID: \"7962399c-d4d0-44f1-a788-bd4cb5a758d7\") " pod="openshift-marketplace/redhat-operators-r24nn" Jan 20 18:36:43 crc kubenswrapper[4773]: I0120 18:36:43.506251 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-r24nn" Jan 20 18:36:43 crc kubenswrapper[4773]: I0120 18:36:43.871381 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-r24nn"] Jan 20 18:36:43 crc kubenswrapper[4773]: W0120 18:36:43.879673 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7962399c_d4d0_44f1_a788_bd4cb5a758d7.slice/crio-ac06b8d79127fd6944eed58ba09b3b4743239b6f8d2424ef74e324d3fb005bb5 WatchSource:0}: Error finding container ac06b8d79127fd6944eed58ba09b3b4743239b6f8d2424ef74e324d3fb005bb5: Status 404 returned error can't find the container with id ac06b8d79127fd6944eed58ba09b3b4743239b6f8d2424ef74e324d3fb005bb5 Jan 20 18:36:44 crc kubenswrapper[4773]: I0120 18:36:44.177510 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-cfwbf"] Jan 20 18:36:44 crc kubenswrapper[4773]: I0120 18:36:44.183534 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-cfwbf" Jan 20 18:36:44 crc kubenswrapper[4773]: I0120 18:36:44.190001 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Jan 20 18:36:44 crc kubenswrapper[4773]: I0120 18:36:44.195690 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cfwbf"] Jan 20 18:36:44 crc kubenswrapper[4773]: I0120 18:36:44.216121 4773 generic.go:334] "Generic (PLEG): container finished" podID="379f8421-1b6c-45c5-ae56-051b42ff6410" containerID="fd1f902118fb9db32d3da8ab9596feec0155aa7e95e11ea6a0363437b4776f3e" exitCode=0 Jan 20 18:36:44 crc kubenswrapper[4773]: I0120 18:36:44.216303 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wdwbg" event={"ID":"379f8421-1b6c-45c5-ae56-051b42ff6410","Type":"ContainerDied","Data":"fd1f902118fb9db32d3da8ab9596feec0155aa7e95e11ea6a0363437b4776f3e"} Jan 20 18:36:44 crc kubenswrapper[4773]: I0120 18:36:44.219816 4773 generic.go:334] "Generic (PLEG): container finished" podID="7962399c-d4d0-44f1-a788-bd4cb5a758d7" containerID="485255b30ab14edef9260ad2522abe87424691af4e1eb171e6c5173a8679fb83" exitCode=0 Jan 20 18:36:44 crc kubenswrapper[4773]: I0120 18:36:44.220464 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r24nn" event={"ID":"7962399c-d4d0-44f1-a788-bd4cb5a758d7","Type":"ContainerDied","Data":"485255b30ab14edef9260ad2522abe87424691af4e1eb171e6c5173a8679fb83"} Jan 20 18:36:44 crc kubenswrapper[4773]: I0120 18:36:44.220581 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r24nn" event={"ID":"7962399c-d4d0-44f1-a788-bd4cb5a758d7","Type":"ContainerStarted","Data":"ac06b8d79127fd6944eed58ba09b3b4743239b6f8d2424ef74e324d3fb005bb5"} Jan 20 18:36:44 crc kubenswrapper[4773]: I0120 18:36:44.225372 4773 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lnll4" event={"ID":"5da64480-a8e7-4ab9-b438-dfe067f94091","Type":"ContainerStarted","Data":"2a01cf48237122fbba8c740a3574c7640592732f28ea2f3c94c797478d8e1570"} Jan 20 18:36:44 crc kubenswrapper[4773]: I0120 18:36:44.254909 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-lnll4" podStartSLOduration=2.823319655 podStartE2EDuration="4.254896373s" podCreationTimestamp="2026-01-20 18:36:40 +0000 UTC" firstStartedPulling="2026-01-20 18:36:42.190053026 +0000 UTC m=+395.111866050" lastFinishedPulling="2026-01-20 18:36:43.621629714 +0000 UTC m=+396.543442768" observedRunningTime="2026-01-20 18:36:44.254717529 +0000 UTC m=+397.176530573" watchObservedRunningTime="2026-01-20 18:36:44.254896373 +0000 UTC m=+397.176709397" Jan 20 18:36:44 crc kubenswrapper[4773]: I0120 18:36:44.283650 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnnw9\" (UniqueName: \"kubernetes.io/projected/a8ccd26b-7e5c-4655-a9cb-764a2d7d35d3-kube-api-access-dnnw9\") pod \"community-operators-cfwbf\" (UID: \"a8ccd26b-7e5c-4655-a9cb-764a2d7d35d3\") " pod="openshift-marketplace/community-operators-cfwbf" Jan 20 18:36:44 crc kubenswrapper[4773]: I0120 18:36:44.283710 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8ccd26b-7e5c-4655-a9cb-764a2d7d35d3-utilities\") pod \"community-operators-cfwbf\" (UID: \"a8ccd26b-7e5c-4655-a9cb-764a2d7d35d3\") " pod="openshift-marketplace/community-operators-cfwbf" Jan 20 18:36:44 crc kubenswrapper[4773]: I0120 18:36:44.283748 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8ccd26b-7e5c-4655-a9cb-764a2d7d35d3-catalog-content\") pod 
\"community-operators-cfwbf\" (UID: \"a8ccd26b-7e5c-4655-a9cb-764a2d7d35d3\") " pod="openshift-marketplace/community-operators-cfwbf" Jan 20 18:36:44 crc kubenswrapper[4773]: I0120 18:36:44.385224 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dnnw9\" (UniqueName: \"kubernetes.io/projected/a8ccd26b-7e5c-4655-a9cb-764a2d7d35d3-kube-api-access-dnnw9\") pod \"community-operators-cfwbf\" (UID: \"a8ccd26b-7e5c-4655-a9cb-764a2d7d35d3\") " pod="openshift-marketplace/community-operators-cfwbf" Jan 20 18:36:44 crc kubenswrapper[4773]: I0120 18:36:44.385277 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8ccd26b-7e5c-4655-a9cb-764a2d7d35d3-utilities\") pod \"community-operators-cfwbf\" (UID: \"a8ccd26b-7e5c-4655-a9cb-764a2d7d35d3\") " pod="openshift-marketplace/community-operators-cfwbf" Jan 20 18:36:44 crc kubenswrapper[4773]: I0120 18:36:44.385324 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8ccd26b-7e5c-4655-a9cb-764a2d7d35d3-catalog-content\") pod \"community-operators-cfwbf\" (UID: \"a8ccd26b-7e5c-4655-a9cb-764a2d7d35d3\") " pod="openshift-marketplace/community-operators-cfwbf" Jan 20 18:36:44 crc kubenswrapper[4773]: I0120 18:36:44.385858 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a8ccd26b-7e5c-4655-a9cb-764a2d7d35d3-catalog-content\") pod \"community-operators-cfwbf\" (UID: \"a8ccd26b-7e5c-4655-a9cb-764a2d7d35d3\") " pod="openshift-marketplace/community-operators-cfwbf" Jan 20 18:36:44 crc kubenswrapper[4773]: I0120 18:36:44.385962 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a8ccd26b-7e5c-4655-a9cb-764a2d7d35d3-utilities\") pod \"community-operators-cfwbf\" (UID: 
\"a8ccd26b-7e5c-4655-a9cb-764a2d7d35d3\") " pod="openshift-marketplace/community-operators-cfwbf" Jan 20 18:36:44 crc kubenswrapper[4773]: I0120 18:36:44.407275 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnnw9\" (UniqueName: \"kubernetes.io/projected/a8ccd26b-7e5c-4655-a9cb-764a2d7d35d3-kube-api-access-dnnw9\") pod \"community-operators-cfwbf\" (UID: \"a8ccd26b-7e5c-4655-a9cb-764a2d7d35d3\") " pod="openshift-marketplace/community-operators-cfwbf" Jan 20 18:36:44 crc kubenswrapper[4773]: I0120 18:36:44.511039 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-cfwbf" Jan 20 18:36:44 crc kubenswrapper[4773]: I0120 18:36:44.758901 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-cfwbf"] Jan 20 18:36:44 crc kubenswrapper[4773]: W0120 18:36:44.767877 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8ccd26b_7e5c_4655_a9cb_764a2d7d35d3.slice/crio-90dda15a24c58a3700e173093684115154e9dc47ecfe3b908a2202106e8a7322 WatchSource:0}: Error finding container 90dda15a24c58a3700e173093684115154e9dc47ecfe3b908a2202106e8a7322: Status 404 returned error can't find the container with id 90dda15a24c58a3700e173093684115154e9dc47ecfe3b908a2202106e8a7322 Jan 20 18:36:44 crc kubenswrapper[4773]: I0120 18:36:44.995287 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-p2lbk" Jan 20 18:36:45 crc kubenswrapper[4773]: I0120 18:36:45.054728 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-kr4zh"] Jan 20 18:36:45 crc kubenswrapper[4773]: I0120 18:36:45.233831 4773 generic.go:334] "Generic (PLEG): container finished" podID="a8ccd26b-7e5c-4655-a9cb-764a2d7d35d3" 
containerID="552ac0644b31a05767b5893b59defbc42ee6dff8158f25f2bbb4a7da0e807835" exitCode=0 Jan 20 18:36:45 crc kubenswrapper[4773]: I0120 18:36:45.234082 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cfwbf" event={"ID":"a8ccd26b-7e5c-4655-a9cb-764a2d7d35d3","Type":"ContainerDied","Data":"552ac0644b31a05767b5893b59defbc42ee6dff8158f25f2bbb4a7da0e807835"} Jan 20 18:36:45 crc kubenswrapper[4773]: I0120 18:36:45.238841 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cfwbf" event={"ID":"a8ccd26b-7e5c-4655-a9cb-764a2d7d35d3","Type":"ContainerStarted","Data":"90dda15a24c58a3700e173093684115154e9dc47ecfe3b908a2202106e8a7322"} Jan 20 18:36:45 crc kubenswrapper[4773]: I0120 18:36:45.239199 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wdwbg" event={"ID":"379f8421-1b6c-45c5-ae56-051b42ff6410","Type":"ContainerStarted","Data":"4ddd2085ebf13d395dc92e1fbe5f131161bb777ca68bb10a2a54266e7778169f"} Jan 20 18:36:45 crc kubenswrapper[4773]: I0120 18:36:45.242600 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r24nn" event={"ID":"7962399c-d4d0-44f1-a788-bd4cb5a758d7","Type":"ContainerStarted","Data":"3be6d233735be758353181960bcdc3b7c9187830dd71ad03d1f5404cd35e259a"} Jan 20 18:36:45 crc kubenswrapper[4773]: I0120 18:36:45.283436 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-wdwbg" podStartSLOduration=2.860261285 podStartE2EDuration="4.283420704s" podCreationTimestamp="2026-01-20 18:36:41 +0000 UTC" firstStartedPulling="2026-01-20 18:36:43.203272256 +0000 UTC m=+396.125085280" lastFinishedPulling="2026-01-20 18:36:44.626431675 +0000 UTC m=+397.548244699" observedRunningTime="2026-01-20 18:36:45.281233363 +0000 UTC m=+398.203046397" watchObservedRunningTime="2026-01-20 18:36:45.283420704 +0000 UTC 
m=+398.205233728" Jan 20 18:36:46 crc kubenswrapper[4773]: I0120 18:36:46.249093 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cfwbf" event={"ID":"a8ccd26b-7e5c-4655-a9cb-764a2d7d35d3","Type":"ContainerStarted","Data":"b91b473ee7a3625c5bf90c24fbde333f921c672087d27307e232e1e07fac0731"} Jan 20 18:36:46 crc kubenswrapper[4773]: I0120 18:36:46.251618 4773 generic.go:334] "Generic (PLEG): container finished" podID="7962399c-d4d0-44f1-a788-bd4cb5a758d7" containerID="3be6d233735be758353181960bcdc3b7c9187830dd71ad03d1f5404cd35e259a" exitCode=0 Jan 20 18:36:46 crc kubenswrapper[4773]: I0120 18:36:46.251675 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r24nn" event={"ID":"7962399c-d4d0-44f1-a788-bd4cb5a758d7","Type":"ContainerDied","Data":"3be6d233735be758353181960bcdc3b7c9187830dd71ad03d1f5404cd35e259a"} Jan 20 18:36:47 crc kubenswrapper[4773]: I0120 18:36:47.257368 4773 generic.go:334] "Generic (PLEG): container finished" podID="a8ccd26b-7e5c-4655-a9cb-764a2d7d35d3" containerID="b91b473ee7a3625c5bf90c24fbde333f921c672087d27307e232e1e07fac0731" exitCode=0 Jan 20 18:36:47 crc kubenswrapper[4773]: I0120 18:36:47.257432 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cfwbf" event={"ID":"a8ccd26b-7e5c-4655-a9cb-764a2d7d35d3","Type":"ContainerDied","Data":"b91b473ee7a3625c5bf90c24fbde333f921c672087d27307e232e1e07fac0731"} Jan 20 18:36:47 crc kubenswrapper[4773]: I0120 18:36:47.261400 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-r24nn" event={"ID":"7962399c-d4d0-44f1-a788-bd4cb5a758d7","Type":"ContainerStarted","Data":"dbbf71e25e3224bd3743e609e818a66704a2d9f07db792aec73f085912fa58de"} Jan 20 18:36:47 crc kubenswrapper[4773]: I0120 18:36:47.297515 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/redhat-operators-r24nn" podStartSLOduration=1.844056438 podStartE2EDuration="4.297496571s" podCreationTimestamp="2026-01-20 18:36:43 +0000 UTC" firstStartedPulling="2026-01-20 18:36:44.22221502 +0000 UTC m=+397.144028044" lastFinishedPulling="2026-01-20 18:36:46.675655133 +0000 UTC m=+399.597468177" observedRunningTime="2026-01-20 18:36:47.295083344 +0000 UTC m=+400.216896368" watchObservedRunningTime="2026-01-20 18:36:47.297496571 +0000 UTC m=+400.219309585" Jan 20 18:36:50 crc kubenswrapper[4773]: I0120 18:36:50.281474 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-cfwbf" event={"ID":"a8ccd26b-7e5c-4655-a9cb-764a2d7d35d3","Type":"ContainerStarted","Data":"93607e4df958788ebb45f47062e62f14c9ee37bf6b263a952e3b8aed59c3c6f5"} Jan 20 18:36:50 crc kubenswrapper[4773]: I0120 18:36:50.299534 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-cfwbf" podStartSLOduration=3.308369548 podStartE2EDuration="6.29951701s" podCreationTimestamp="2026-01-20 18:36:44 +0000 UTC" firstStartedPulling="2026-01-20 18:36:45.235361648 +0000 UTC m=+398.157174672" lastFinishedPulling="2026-01-20 18:36:48.22650911 +0000 UTC m=+401.148322134" observedRunningTime="2026-01-20 18:36:50.295830642 +0000 UTC m=+403.217643666" watchObservedRunningTime="2026-01-20 18:36:50.29951701 +0000 UTC m=+403.221330024" Jan 20 18:36:51 crc kubenswrapper[4773]: I0120 18:36:51.133517 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-lnll4" Jan 20 18:36:51 crc kubenswrapper[4773]: I0120 18:36:51.133813 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-lnll4" Jan 20 18:36:51 crc kubenswrapper[4773]: I0120 18:36:51.184605 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/certified-operators-lnll4" Jan 20 18:36:51 crc kubenswrapper[4773]: I0120 18:36:51.324490 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-lnll4" Jan 20 18:36:52 crc kubenswrapper[4773]: I0120 18:36:52.110344 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-wdwbg" Jan 20 18:36:52 crc kubenswrapper[4773]: I0120 18:36:52.110401 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-wdwbg" Jan 20 18:36:52 crc kubenswrapper[4773]: I0120 18:36:52.150919 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-wdwbg" Jan 20 18:36:52 crc kubenswrapper[4773]: I0120 18:36:52.334730 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-wdwbg" Jan 20 18:36:53 crc kubenswrapper[4773]: I0120 18:36:53.507255 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-r24nn" Jan 20 18:36:53 crc kubenswrapper[4773]: I0120 18:36:53.507737 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-r24nn" Jan 20 18:36:53 crc kubenswrapper[4773]: I0120 18:36:53.556016 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-r24nn" Jan 20 18:36:54 crc kubenswrapper[4773]: I0120 18:36:54.342023 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-r24nn" Jan 20 18:36:54 crc kubenswrapper[4773]: I0120 18:36:54.511816 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-cfwbf" Jan 20 18:36:54 crc kubenswrapper[4773]: I0120 
18:36:54.511864 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-cfwbf" Jan 20 18:36:54 crc kubenswrapper[4773]: I0120 18:36:54.552611 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-cfwbf" Jan 20 18:36:55 crc kubenswrapper[4773]: I0120 18:36:55.346752 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-cfwbf" Jan 20 18:36:58 crc kubenswrapper[4773]: I0120 18:36:58.171391 4773 patch_prober.go:28] interesting pod/machine-config-daemon-sq4x7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 18:36:58 crc kubenswrapper[4773]: I0120 18:36:58.171811 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 18:36:58 crc kubenswrapper[4773]: I0120 18:36:58.171879 4773 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" Jan 20 18:36:58 crc kubenswrapper[4773]: I0120 18:36:58.172646 4773 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6fc4791a7cdd1fbda5ec9ded78a8be5e2c44fe7359d840cfe4a8ada84728d5d6"} pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 20 18:36:58 crc kubenswrapper[4773]: I0120 18:36:58.172719 4773 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" containerID="cri-o://6fc4791a7cdd1fbda5ec9ded78a8be5e2c44fe7359d840cfe4a8ada84728d5d6" gracePeriod=600 Jan 20 18:37:00 crc kubenswrapper[4773]: I0120 18:37:00.333385 4773 generic.go:334] "Generic (PLEG): container finished" podID="1ddd934f-f012-4083-b5e6-b99711071621" containerID="6fc4791a7cdd1fbda5ec9ded78a8be5e2c44fe7359d840cfe4a8ada84728d5d6" exitCode=0 Jan 20 18:37:00 crc kubenswrapper[4773]: I0120 18:37:00.333442 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" event={"ID":"1ddd934f-f012-4083-b5e6-b99711071621","Type":"ContainerDied","Data":"6fc4791a7cdd1fbda5ec9ded78a8be5e2c44fe7359d840cfe4a8ada84728d5d6"} Jan 20 18:37:00 crc kubenswrapper[4773]: I0120 18:37:00.333851 4773 scope.go:117] "RemoveContainer" containerID="3e8755cb7894ae2c1832f3b0b8d8a6e33d3d336d00ce68a2afe004d82304dd24" Jan 20 18:37:01 crc kubenswrapper[4773]: I0120 18:37:01.340202 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" event={"ID":"1ddd934f-f012-4083-b5e6-b99711071621","Type":"ContainerStarted","Data":"8ab3757bc284c8c9f1f813b678bd9bbed50bf491e47d29da19e04261db6c0c92"} Jan 20 18:37:10 crc kubenswrapper[4773]: I0120 18:37:10.285653 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" podUID="f751520b-bf3d-4226-8850-4b3346c43a6f" containerName="registry" containerID="cri-o://cc2e80ac4554c3a93472ea7caaec60741425f26f2058c6b98e87e53a2d2665aa" gracePeriod=30 Jan 20 18:37:10 crc kubenswrapper[4773]: I0120 18:37:10.619102 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:37:10 crc kubenswrapper[4773]: I0120 18:37:10.759326 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f751520b-bf3d-4226-8850-4b3346c43a6f-trusted-ca\") pod \"f751520b-bf3d-4226-8850-4b3346c43a6f\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " Jan 20 18:37:10 crc kubenswrapper[4773]: I0120 18:37:10.759405 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f751520b-bf3d-4226-8850-4b3346c43a6f-registry-tls\") pod \"f751520b-bf3d-4226-8850-4b3346c43a6f\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " Jan 20 18:37:10 crc kubenswrapper[4773]: I0120 18:37:10.759454 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f751520b-bf3d-4226-8850-4b3346c43a6f-bound-sa-token\") pod \"f751520b-bf3d-4226-8850-4b3346c43a6f\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " Jan 20 18:37:10 crc kubenswrapper[4773]: I0120 18:37:10.759484 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ss7d9\" (UniqueName: \"kubernetes.io/projected/f751520b-bf3d-4226-8850-4b3346c43a6f-kube-api-access-ss7d9\") pod \"f751520b-bf3d-4226-8850-4b3346c43a6f\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " Jan 20 18:37:10 crc kubenswrapper[4773]: I0120 18:37:10.759510 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f751520b-bf3d-4226-8850-4b3346c43a6f-registry-certificates\") pod \"f751520b-bf3d-4226-8850-4b3346c43a6f\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " Jan 20 18:37:10 crc kubenswrapper[4773]: I0120 18:37:10.759534 4773 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f751520b-bf3d-4226-8850-4b3346c43a6f-installation-pull-secrets\") pod \"f751520b-bf3d-4226-8850-4b3346c43a6f\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " Jan 20 18:37:10 crc kubenswrapper[4773]: I0120 18:37:10.759585 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f751520b-bf3d-4226-8850-4b3346c43a6f-ca-trust-extracted\") pod \"f751520b-bf3d-4226-8850-4b3346c43a6f\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " Jan 20 18:37:10 crc kubenswrapper[4773]: I0120 18:37:10.759756 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"f751520b-bf3d-4226-8850-4b3346c43a6f\" (UID: \"f751520b-bf3d-4226-8850-4b3346c43a6f\") " Jan 20 18:37:10 crc kubenswrapper[4773]: I0120 18:37:10.760536 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f751520b-bf3d-4226-8850-4b3346c43a6f-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "f751520b-bf3d-4226-8850-4b3346c43a6f" (UID: "f751520b-bf3d-4226-8850-4b3346c43a6f"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:37:10 crc kubenswrapper[4773]: I0120 18:37:10.760577 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f751520b-bf3d-4226-8850-4b3346c43a6f-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "f751520b-bf3d-4226-8850-4b3346c43a6f" (UID: "f751520b-bf3d-4226-8850-4b3346c43a6f"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:37:10 crc kubenswrapper[4773]: I0120 18:37:10.765240 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f751520b-bf3d-4226-8850-4b3346c43a6f-kube-api-access-ss7d9" (OuterVolumeSpecName: "kube-api-access-ss7d9") pod "f751520b-bf3d-4226-8850-4b3346c43a6f" (UID: "f751520b-bf3d-4226-8850-4b3346c43a6f"). InnerVolumeSpecName "kube-api-access-ss7d9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:37:10 crc kubenswrapper[4773]: I0120 18:37:10.766496 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f751520b-bf3d-4226-8850-4b3346c43a6f-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "f751520b-bf3d-4226-8850-4b3346c43a6f" (UID: "f751520b-bf3d-4226-8850-4b3346c43a6f"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:37:10 crc kubenswrapper[4773]: I0120 18:37:10.766757 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f751520b-bf3d-4226-8850-4b3346c43a6f-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "f751520b-bf3d-4226-8850-4b3346c43a6f" (UID: "f751520b-bf3d-4226-8850-4b3346c43a6f"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:37:10 crc kubenswrapper[4773]: I0120 18:37:10.766987 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f751520b-bf3d-4226-8850-4b3346c43a6f-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "f751520b-bf3d-4226-8850-4b3346c43a6f" (UID: "f751520b-bf3d-4226-8850-4b3346c43a6f"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:37:10 crc kubenswrapper[4773]: I0120 18:37:10.772376 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "f751520b-bf3d-4226-8850-4b3346c43a6f" (UID: "f751520b-bf3d-4226-8850-4b3346c43a6f"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 20 18:37:10 crc kubenswrapper[4773]: I0120 18:37:10.791543 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f751520b-bf3d-4226-8850-4b3346c43a6f-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "f751520b-bf3d-4226-8850-4b3346c43a6f" (UID: "f751520b-bf3d-4226-8850-4b3346c43a6f"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:37:10 crc kubenswrapper[4773]: I0120 18:37:10.860646 4773 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f751520b-bf3d-4226-8850-4b3346c43a6f-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 20 18:37:10 crc kubenswrapper[4773]: I0120 18:37:10.860686 4773 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f751520b-bf3d-4226-8850-4b3346c43a6f-registry-tls\") on node \"crc\" DevicePath \"\"" Jan 20 18:37:10 crc kubenswrapper[4773]: I0120 18:37:10.860698 4773 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f751520b-bf3d-4226-8850-4b3346c43a6f-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 20 18:37:10 crc kubenswrapper[4773]: I0120 18:37:10.860711 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ss7d9\" (UniqueName: 
\"kubernetes.io/projected/f751520b-bf3d-4226-8850-4b3346c43a6f-kube-api-access-ss7d9\") on node \"crc\" DevicePath \"\"" Jan 20 18:37:10 crc kubenswrapper[4773]: I0120 18:37:10.860722 4773 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f751520b-bf3d-4226-8850-4b3346c43a6f-registry-certificates\") on node \"crc\" DevicePath \"\"" Jan 20 18:37:10 crc kubenswrapper[4773]: I0120 18:37:10.860735 4773 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f751520b-bf3d-4226-8850-4b3346c43a6f-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Jan 20 18:37:10 crc kubenswrapper[4773]: I0120 18:37:10.860745 4773 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f751520b-bf3d-4226-8850-4b3346c43a6f-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Jan 20 18:37:11 crc kubenswrapper[4773]: I0120 18:37:11.390484 4773 generic.go:334] "Generic (PLEG): container finished" podID="f751520b-bf3d-4226-8850-4b3346c43a6f" containerID="cc2e80ac4554c3a93472ea7caaec60741425f26f2058c6b98e87e53a2d2665aa" exitCode=0 Jan 20 18:37:11 crc kubenswrapper[4773]: I0120 18:37:11.390551 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" event={"ID":"f751520b-bf3d-4226-8850-4b3346c43a6f","Type":"ContainerDied","Data":"cc2e80ac4554c3a93472ea7caaec60741425f26f2058c6b98e87e53a2d2665aa"} Jan 20 18:37:11 crc kubenswrapper[4773]: I0120 18:37:11.390824 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" event={"ID":"f751520b-bf3d-4226-8850-4b3346c43a6f","Type":"ContainerDied","Data":"4717aabd05ca8421c098accb226b89152753529be1fa867b484287b5c5a81ae7"} Jan 20 18:37:11 crc kubenswrapper[4773]: I0120 18:37:11.390573 4773 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-kr4zh" Jan 20 18:37:11 crc kubenswrapper[4773]: I0120 18:37:11.390844 4773 scope.go:117] "RemoveContainer" containerID="cc2e80ac4554c3a93472ea7caaec60741425f26f2058c6b98e87e53a2d2665aa" Jan 20 18:37:11 crc kubenswrapper[4773]: I0120 18:37:11.422669 4773 scope.go:117] "RemoveContainer" containerID="cc2e80ac4554c3a93472ea7caaec60741425f26f2058c6b98e87e53a2d2665aa" Jan 20 18:37:11 crc kubenswrapper[4773]: E0120 18:37:11.423312 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc2e80ac4554c3a93472ea7caaec60741425f26f2058c6b98e87e53a2d2665aa\": container with ID starting with cc2e80ac4554c3a93472ea7caaec60741425f26f2058c6b98e87e53a2d2665aa not found: ID does not exist" containerID="cc2e80ac4554c3a93472ea7caaec60741425f26f2058c6b98e87e53a2d2665aa" Jan 20 18:37:11 crc kubenswrapper[4773]: I0120 18:37:11.423361 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc2e80ac4554c3a93472ea7caaec60741425f26f2058c6b98e87e53a2d2665aa"} err="failed to get container status \"cc2e80ac4554c3a93472ea7caaec60741425f26f2058c6b98e87e53a2d2665aa\": rpc error: code = NotFound desc = could not find container \"cc2e80ac4554c3a93472ea7caaec60741425f26f2058c6b98e87e53a2d2665aa\": container with ID starting with cc2e80ac4554c3a93472ea7caaec60741425f26f2058c6b98e87e53a2d2665aa not found: ID does not exist" Jan 20 18:37:11 crc kubenswrapper[4773]: I0120 18:37:11.435572 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-kr4zh"] Jan 20 18:37:11 crc kubenswrapper[4773]: I0120 18:37:11.440004 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-kr4zh"] Jan 20 18:37:11 crc kubenswrapper[4773]: I0120 18:37:11.457221 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="f751520b-bf3d-4226-8850-4b3346c43a6f" path="/var/lib/kubelet/pods/f751520b-bf3d-4226-8850-4b3346c43a6f/volumes" Jan 20 18:39:28 crc kubenswrapper[4773]: I0120 18:39:28.169995 4773 patch_prober.go:28] interesting pod/machine-config-daemon-sq4x7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 18:39:28 crc kubenswrapper[4773]: I0120 18:39:28.170654 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 18:39:58 crc kubenswrapper[4773]: I0120 18:39:58.170607 4773 patch_prober.go:28] interesting pod/machine-config-daemon-sq4x7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 18:39:58 crc kubenswrapper[4773]: I0120 18:39:58.171114 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 18:40:07 crc kubenswrapper[4773]: I0120 18:40:07.618612 4773 scope.go:117] "RemoveContainer" containerID="12809ae984dc039517fe1d4003dbfb41a5b5900acaed078c32869d8cdfb24334" Jan 20 18:40:28 crc kubenswrapper[4773]: I0120 18:40:28.170573 4773 patch_prober.go:28] interesting pod/machine-config-daemon-sq4x7 container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 18:40:28 crc kubenswrapper[4773]: I0120 18:40:28.171680 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 18:40:28 crc kubenswrapper[4773]: I0120 18:40:28.171756 4773 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" Jan 20 18:40:28 crc kubenswrapper[4773]: I0120 18:40:28.172639 4773 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8ab3757bc284c8c9f1f813b678bd9bbed50bf491e47d29da19e04261db6c0c92"} pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 20 18:40:28 crc kubenswrapper[4773]: I0120 18:40:28.172716 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" containerID="cri-o://8ab3757bc284c8c9f1f813b678bd9bbed50bf491e47d29da19e04261db6c0c92" gracePeriod=600 Jan 20 18:40:28 crc kubenswrapper[4773]: I0120 18:40:28.501647 4773 generic.go:334] "Generic (PLEG): container finished" podID="1ddd934f-f012-4083-b5e6-b99711071621" containerID="8ab3757bc284c8c9f1f813b678bd9bbed50bf491e47d29da19e04261db6c0c92" exitCode=0 Jan 20 18:40:28 crc kubenswrapper[4773]: I0120 18:40:28.501727 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" event={"ID":"1ddd934f-f012-4083-b5e6-b99711071621","Type":"ContainerDied","Data":"8ab3757bc284c8c9f1f813b678bd9bbed50bf491e47d29da19e04261db6c0c92"} Jan 20 18:40:28 crc kubenswrapper[4773]: I0120 18:40:28.502247 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" event={"ID":"1ddd934f-f012-4083-b5e6-b99711071621","Type":"ContainerStarted","Data":"714571a77485b95d4127b785d445e091e7d21c1d67336a6816b862641584bfce"} Jan 20 18:40:28 crc kubenswrapper[4773]: I0120 18:40:28.502269 4773 scope.go:117] "RemoveContainer" containerID="6fc4791a7cdd1fbda5ec9ded78a8be5e2c44fe7359d840cfe4a8ada84728d5d6" Jan 20 18:41:43 crc kubenswrapper[4773]: I0120 18:41:43.224813 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-mtkdb"] Jan 20 18:41:43 crc kubenswrapper[4773]: E0120 18:41:43.225849 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f751520b-bf3d-4226-8850-4b3346c43a6f" containerName="registry" Jan 20 18:41:43 crc kubenswrapper[4773]: I0120 18:41:43.225865 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="f751520b-bf3d-4226-8850-4b3346c43a6f" containerName="registry" Jan 20 18:41:43 crc kubenswrapper[4773]: I0120 18:41:43.226007 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="f751520b-bf3d-4226-8850-4b3346c43a6f" containerName="registry" Jan 20 18:41:43 crc kubenswrapper[4773]: I0120 18:41:43.226480 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-mtkdb" Jan 20 18:41:43 crc kubenswrapper[4773]: I0120 18:41:43.228222 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Jan 20 18:41:43 crc kubenswrapper[4773]: I0120 18:41:43.233656 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-mtkdb"] Jan 20 18:41:43 crc kubenswrapper[4773]: I0120 18:41:43.234355 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Jan 20 18:41:43 crc kubenswrapper[4773]: I0120 18:41:43.234567 4773 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-bqkk8" Jan 20 18:41:43 crc kubenswrapper[4773]: I0120 18:41:43.275012 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-tgrsg"] Jan 20 18:41:43 crc kubenswrapper[4773]: I0120 18:41:43.275675 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-tgrsg" Jan 20 18:41:43 crc kubenswrapper[4773]: I0120 18:41:43.298195 4773 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-6g44h" Jan 20 18:41:43 crc kubenswrapper[4773]: I0120 18:41:43.308096 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-cf2ql"] Jan 20 18:41:43 crc kubenswrapper[4773]: I0120 18:41:43.308813 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-cf2ql" Jan 20 18:41:43 crc kubenswrapper[4773]: I0120 18:41:43.318332 4773 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-vv925" Jan 20 18:41:43 crc kubenswrapper[4773]: I0120 18:41:43.323924 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-tgrsg"] Jan 20 18:41:43 crc kubenswrapper[4773]: I0120 18:41:43.329395 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-cf2ql"] Jan 20 18:41:43 crc kubenswrapper[4773]: I0120 18:41:43.412164 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2bdj\" (UniqueName: \"kubernetes.io/projected/4380dd47-7110-43ea-af85-02675b558a8d-kube-api-access-p2bdj\") pod \"cert-manager-webhook-687f57d79b-cf2ql\" (UID: \"4380dd47-7110-43ea-af85-02675b558a8d\") " pod="cert-manager/cert-manager-webhook-687f57d79b-cf2ql" Jan 20 18:41:43 crc kubenswrapper[4773]: I0120 18:41:43.412590 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmnht\" (UniqueName: \"kubernetes.io/projected/5a2416cd-d7d8-4aa5-b7ef-1b61446a4072-kube-api-access-lmnht\") pod \"cert-manager-858654f9db-tgrsg\" (UID: \"5a2416cd-d7d8-4aa5-b7ef-1b61446a4072\") " pod="cert-manager/cert-manager-858654f9db-tgrsg" Jan 20 18:41:43 crc kubenswrapper[4773]: I0120 18:41:43.412652 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlt7q\" (UniqueName: \"kubernetes.io/projected/c249258b-878c-45b8-9886-6fee2afec18c-kube-api-access-dlt7q\") pod \"cert-manager-cainjector-cf98fcc89-mtkdb\" (UID: \"c249258b-878c-45b8-9886-6fee2afec18c\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-mtkdb" Jan 20 18:41:43 crc kubenswrapper[4773]: I0120 18:41:43.513444 4773 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmnht\" (UniqueName: \"kubernetes.io/projected/5a2416cd-d7d8-4aa5-b7ef-1b61446a4072-kube-api-access-lmnht\") pod \"cert-manager-858654f9db-tgrsg\" (UID: \"5a2416cd-d7d8-4aa5-b7ef-1b61446a4072\") " pod="cert-manager/cert-manager-858654f9db-tgrsg" Jan 20 18:41:43 crc kubenswrapper[4773]: I0120 18:41:43.513696 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlt7q\" (UniqueName: \"kubernetes.io/projected/c249258b-878c-45b8-9886-6fee2afec18c-kube-api-access-dlt7q\") pod \"cert-manager-cainjector-cf98fcc89-mtkdb\" (UID: \"c249258b-878c-45b8-9886-6fee2afec18c\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-mtkdb" Jan 20 18:41:43 crc kubenswrapper[4773]: I0120 18:41:43.513792 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2bdj\" (UniqueName: \"kubernetes.io/projected/4380dd47-7110-43ea-af85-02675b558a8d-kube-api-access-p2bdj\") pod \"cert-manager-webhook-687f57d79b-cf2ql\" (UID: \"4380dd47-7110-43ea-af85-02675b558a8d\") " pod="cert-manager/cert-manager-webhook-687f57d79b-cf2ql" Jan 20 18:41:43 crc kubenswrapper[4773]: I0120 18:41:43.532132 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2bdj\" (UniqueName: \"kubernetes.io/projected/4380dd47-7110-43ea-af85-02675b558a8d-kube-api-access-p2bdj\") pod \"cert-manager-webhook-687f57d79b-cf2ql\" (UID: \"4380dd47-7110-43ea-af85-02675b558a8d\") " pod="cert-manager/cert-manager-webhook-687f57d79b-cf2ql" Jan 20 18:41:43 crc kubenswrapper[4773]: I0120 18:41:43.532206 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlt7q\" (UniqueName: \"kubernetes.io/projected/c249258b-878c-45b8-9886-6fee2afec18c-kube-api-access-dlt7q\") pod \"cert-manager-cainjector-cf98fcc89-mtkdb\" (UID: \"c249258b-878c-45b8-9886-6fee2afec18c\") " 
pod="cert-manager/cert-manager-cainjector-cf98fcc89-mtkdb" Jan 20 18:41:43 crc kubenswrapper[4773]: I0120 18:41:43.532904 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmnht\" (UniqueName: \"kubernetes.io/projected/5a2416cd-d7d8-4aa5-b7ef-1b61446a4072-kube-api-access-lmnht\") pod \"cert-manager-858654f9db-tgrsg\" (UID: \"5a2416cd-d7d8-4aa5-b7ef-1b61446a4072\") " pod="cert-manager/cert-manager-858654f9db-tgrsg" Jan 20 18:41:43 crc kubenswrapper[4773]: I0120 18:41:43.589899 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-mtkdb" Jan 20 18:41:43 crc kubenswrapper[4773]: I0120 18:41:43.605043 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-tgrsg" Jan 20 18:41:43 crc kubenswrapper[4773]: I0120 18:41:43.626343 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-cf2ql" Jan 20 18:41:43 crc kubenswrapper[4773]: I0120 18:41:43.860050 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-cf2ql"] Jan 20 18:41:43 crc kubenswrapper[4773]: W0120 18:41:43.865867 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4380dd47_7110_43ea_af85_02675b558a8d.slice/crio-46a03610c837970274f8a55cc30070ec105b63c813b6061e66fd9f385491aa58 WatchSource:0}: Error finding container 46a03610c837970274f8a55cc30070ec105b63c813b6061e66fd9f385491aa58: Status 404 returned error can't find the container with id 46a03610c837970274f8a55cc30070ec105b63c813b6061e66fd9f385491aa58 Jan 20 18:41:43 crc kubenswrapper[4773]: I0120 18:41:43.868608 4773 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 20 18:41:43 crc kubenswrapper[4773]: I0120 18:41:43.944732 4773 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-cf2ql" event={"ID":"4380dd47-7110-43ea-af85-02675b558a8d","Type":"ContainerStarted","Data":"46a03610c837970274f8a55cc30070ec105b63c813b6061e66fd9f385491aa58"} Jan 20 18:41:44 crc kubenswrapper[4773]: I0120 18:41:44.005345 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-mtkdb"] Jan 20 18:41:44 crc kubenswrapper[4773]: W0120 18:41:44.014116 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc249258b_878c_45b8_9886_6fee2afec18c.slice/crio-c786dfe8d29e3fb8b9c97c31d4bd253be66169af05cd97c5334f7610d55f7dd5 WatchSource:0}: Error finding container c786dfe8d29e3fb8b9c97c31d4bd253be66169af05cd97c5334f7610d55f7dd5: Status 404 returned error can't find the container with id c786dfe8d29e3fb8b9c97c31d4bd253be66169af05cd97c5334f7610d55f7dd5 Jan 20 18:41:44 crc kubenswrapper[4773]: I0120 18:41:44.016735 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-tgrsg"] Jan 20 18:41:44 crc kubenswrapper[4773]: I0120 18:41:44.953128 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-mtkdb" event={"ID":"c249258b-878c-45b8-9886-6fee2afec18c","Type":"ContainerStarted","Data":"c786dfe8d29e3fb8b9c97c31d4bd253be66169af05cd97c5334f7610d55f7dd5"} Jan 20 18:41:44 crc kubenswrapper[4773]: I0120 18:41:44.954831 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-tgrsg" event={"ID":"5a2416cd-d7d8-4aa5-b7ef-1b61446a4072","Type":"ContainerStarted","Data":"28cf9898e70669c725b9c2810417ee7dc210bc16a3009d39f1cdd8ab2d8bf58c"} Jan 20 18:41:47 crc kubenswrapper[4773]: I0120 18:41:47.972082 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-mtkdb" 
event={"ID":"c249258b-878c-45b8-9886-6fee2afec18c","Type":"ContainerStarted","Data":"9a30b9c50f7aeeaa6ae0d52604a4d4c8af3ce57e87e93619a72a5f2128e1afc8"} Jan 20 18:41:47 crc kubenswrapper[4773]: I0120 18:41:47.976394 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-tgrsg" event={"ID":"5a2416cd-d7d8-4aa5-b7ef-1b61446a4072","Type":"ContainerStarted","Data":"9cb2086f691a8de509ff9cbcdf82f2e873c82528627934d8778da4604d10c780"} Jan 20 18:41:47 crc kubenswrapper[4773]: I0120 18:41:47.978068 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-cf2ql" event={"ID":"4380dd47-7110-43ea-af85-02675b558a8d","Type":"ContainerStarted","Data":"b33be19286c19c9f58023e7d1593d6a556295876b278ebd9597c889115a3ca9a"} Jan 20 18:41:47 crc kubenswrapper[4773]: I0120 18:41:47.994323 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-mtkdb" podStartSLOduration=2.198339306 podStartE2EDuration="4.994290543s" podCreationTimestamp="2026-01-20 18:41:43 +0000 UTC" firstStartedPulling="2026-01-20 18:41:44.017682972 +0000 UTC m=+696.939495996" lastFinishedPulling="2026-01-20 18:41:46.813634209 +0000 UTC m=+699.735447233" observedRunningTime="2026-01-20 18:41:47.988233859 +0000 UTC m=+700.910046893" watchObservedRunningTime="2026-01-20 18:41:47.994290543 +0000 UTC m=+700.916103607" Jan 20 18:41:48 crc kubenswrapper[4773]: I0120 18:41:48.024365 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-cf2ql" podStartSLOduration=2.12748582 podStartE2EDuration="5.024345864s" podCreationTimestamp="2026-01-20 18:41:43 +0000 UTC" firstStartedPulling="2026-01-20 18:41:43.868249633 +0000 UTC m=+696.790062667" lastFinishedPulling="2026-01-20 18:41:46.765109687 +0000 UTC m=+699.686922711" observedRunningTime="2026-01-20 18:41:48.022708244 +0000 UTC m=+700.944521278" 
watchObservedRunningTime="2026-01-20 18:41:48.024345864 +0000 UTC m=+700.946158888" Jan 20 18:41:48 crc kubenswrapper[4773]: I0120 18:41:48.024873 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-tgrsg" podStartSLOduration=1.2577275700000001 podStartE2EDuration="5.024867186s" podCreationTimestamp="2026-01-20 18:41:43 +0000 UTC" firstStartedPulling="2026-01-20 18:41:44.025253523 +0000 UTC m=+696.947066547" lastFinishedPulling="2026-01-20 18:41:47.792393139 +0000 UTC m=+700.714206163" observedRunningTime="2026-01-20 18:41:48.009379355 +0000 UTC m=+700.931192379" watchObservedRunningTime="2026-01-20 18:41:48.024867186 +0000 UTC m=+700.946680210" Jan 20 18:41:48 crc kubenswrapper[4773]: I0120 18:41:48.627203 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-cf2ql" Jan 20 18:41:53 crc kubenswrapper[4773]: I0120 18:41:53.629116 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-cf2ql" Jan 20 18:41:58 crc kubenswrapper[4773]: I0120 18:41:58.185451 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-qt89w"] Jan 20 18:41:58 crc kubenswrapper[4773]: I0120 18:41:58.185908 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" podUID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" containerName="ovn-controller" containerID="cri-o://dc96ff67e107830b7e2fe8c3c7460ca9c7b22dfadab07ea69968162e9a1b4455" gracePeriod=30 Jan 20 18:41:58 crc kubenswrapper[4773]: I0120 18:41:58.185974 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" podUID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" containerName="kube-rbac-proxy-ovn-metrics" 
containerID="cri-o://6d36ecbc289d1bc2ebcba42b6cfe938692dd9b7e2b4e1adac18ac86a273d6bf5" gracePeriod=30 Jan 20 18:41:58 crc kubenswrapper[4773]: I0120 18:41:58.185967 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" podUID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" containerName="nbdb" containerID="cri-o://aeac2ae7ead2dd1144c00709ea7162dfe1e360dc304726b031970bc989f11467" gracePeriod=30 Jan 20 18:41:58 crc kubenswrapper[4773]: I0120 18:41:58.186053 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" podUID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" containerName="sbdb" containerID="cri-o://68d98d68d006df3d69e253299966bef755778e0cdab71a9ff91d2829ae761672" gracePeriod=30 Jan 20 18:41:58 crc kubenswrapper[4773]: I0120 18:41:58.186065 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" podUID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" containerName="ovn-acl-logging" containerID="cri-o://7767620b251e4c63c9397573c7915bd85831b833817ac6b939fc47d6304dc2bc" gracePeriod=30 Jan 20 18:41:58 crc kubenswrapper[4773]: I0120 18:41:58.186050 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" podUID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" containerName="kube-rbac-proxy-node" containerID="cri-o://a000cfa332372a402d4d81eafc6f06b988196a3a847236a94d8166f0d745b07e" gracePeriod=30 Jan 20 18:41:58 crc kubenswrapper[4773]: I0120 18:41:58.186098 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" podUID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" containerName="northd" containerID="cri-o://8c6fa189877752f110d0f9539d08a9693872f9ee0350c63384acae1dea6afbc1" gracePeriod=30 Jan 20 18:41:58 crc kubenswrapper[4773]: I0120 18:41:58.279154 4773 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" podUID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" containerName="ovnkube-controller" containerID="cri-o://5d2aab5769291bf8517a6be58643ec33bb4a9c92e32ef5c6a6be6258a94a21e0" gracePeriod=30 Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.041419 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qt89w_f354424d-7f22-42d6-8bd9-00e32e78c3d3/ovnkube-controller/3.log" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.047019 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qt89w_f354424d-7f22-42d6-8bd9-00e32e78c3d3/ovn-acl-logging/0.log" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.047762 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qt89w_f354424d-7f22-42d6-8bd9-00e32e78c3d3/ovn-controller/0.log" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.048675 4773 generic.go:334] "Generic (PLEG): container finished" podID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" containerID="5d2aab5769291bf8517a6be58643ec33bb4a9c92e32ef5c6a6be6258a94a21e0" exitCode=0 Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.048718 4773 generic.go:334] "Generic (PLEG): container finished" podID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" containerID="68d98d68d006df3d69e253299966bef755778e0cdab71a9ff91d2829ae761672" exitCode=0 Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.048728 4773 generic.go:334] "Generic (PLEG): container finished" podID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" containerID="aeac2ae7ead2dd1144c00709ea7162dfe1e360dc304726b031970bc989f11467" exitCode=0 Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.048736 4773 generic.go:334] "Generic (PLEG): container finished" podID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" 
containerID="8c6fa189877752f110d0f9539d08a9693872f9ee0350c63384acae1dea6afbc1" exitCode=0 Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.048745 4773 generic.go:334] "Generic (PLEG): container finished" podID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" containerID="6d36ecbc289d1bc2ebcba42b6cfe938692dd9b7e2b4e1adac18ac86a273d6bf5" exitCode=0 Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.048741 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" event={"ID":"f354424d-7f22-42d6-8bd9-00e32e78c3d3","Type":"ContainerDied","Data":"5d2aab5769291bf8517a6be58643ec33bb4a9c92e32ef5c6a6be6258a94a21e0"} Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.048753 4773 generic.go:334] "Generic (PLEG): container finished" podID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" containerID="a000cfa332372a402d4d81eafc6f06b988196a3a847236a94d8166f0d745b07e" exitCode=0 Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.048762 4773 generic.go:334] "Generic (PLEG): container finished" podID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" containerID="7767620b251e4c63c9397573c7915bd85831b833817ac6b939fc47d6304dc2bc" exitCode=143 Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.048771 4773 generic.go:334] "Generic (PLEG): container finished" podID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" containerID="dc96ff67e107830b7e2fe8c3c7460ca9c7b22dfadab07ea69968162e9a1b4455" exitCode=143 Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.048776 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" event={"ID":"f354424d-7f22-42d6-8bd9-00e32e78c3d3","Type":"ContainerDied","Data":"68d98d68d006df3d69e253299966bef755778e0cdab71a9ff91d2829ae761672"} Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.048788 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" 
event={"ID":"f354424d-7f22-42d6-8bd9-00e32e78c3d3","Type":"ContainerDied","Data":"aeac2ae7ead2dd1144c00709ea7162dfe1e360dc304726b031970bc989f11467"} Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.048798 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" event={"ID":"f354424d-7f22-42d6-8bd9-00e32e78c3d3","Type":"ContainerDied","Data":"8c6fa189877752f110d0f9539d08a9693872f9ee0350c63384acae1dea6afbc1"} Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.048807 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" event={"ID":"f354424d-7f22-42d6-8bd9-00e32e78c3d3","Type":"ContainerDied","Data":"6d36ecbc289d1bc2ebcba42b6cfe938692dd9b7e2b4e1adac18ac86a273d6bf5"} Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.048816 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" event={"ID":"f354424d-7f22-42d6-8bd9-00e32e78c3d3","Type":"ContainerDied","Data":"a000cfa332372a402d4d81eafc6f06b988196a3a847236a94d8166f0d745b07e"} Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.048825 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" event={"ID":"f354424d-7f22-42d6-8bd9-00e32e78c3d3","Type":"ContainerDied","Data":"7767620b251e4c63c9397573c7915bd85831b833817ac6b939fc47d6304dc2bc"} Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.048833 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" event={"ID":"f354424d-7f22-42d6-8bd9-00e32e78c3d3","Type":"ContainerDied","Data":"dc96ff67e107830b7e2fe8c3c7460ca9c7b22dfadab07ea69968162e9a1b4455"} Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.048833 4773 scope.go:117] "RemoveContainer" containerID="2769f74bbb44f1c101c7e8101d7d9653de865bbf128573473d1780c9571bee67" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.050814 4773 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bccxn_061a607e-1868-4fcf-b3ea-d51157511d41/kube-multus/2.log" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.051379 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bccxn_061a607e-1868-4fcf-b3ea-d51157511d41/kube-multus/1.log" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.051418 4773 generic.go:334] "Generic (PLEG): container finished" podID="061a607e-1868-4fcf-b3ea-d51157511d41" containerID="12757148bdfa862c997ae9700dd354a77024b7a40e5d5398f8af800d1a220e65" exitCode=2 Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.051454 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bccxn" event={"ID":"061a607e-1868-4fcf-b3ea-d51157511d41","Type":"ContainerDied","Data":"12757148bdfa862c997ae9700dd354a77024b7a40e5d5398f8af800d1a220e65"} Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.052063 4773 scope.go:117] "RemoveContainer" containerID="12757148bdfa862c997ae9700dd354a77024b7a40e5d5398f8af800d1a220e65" Jan 20 18:41:59 crc kubenswrapper[4773]: E0120 18:41:59.052455 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-bccxn_openshift-multus(061a607e-1868-4fcf-b3ea-d51157511d41)\"" pod="openshift-multus/multus-bccxn" podUID="061a607e-1868-4fcf-b3ea-d51157511d41" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.055921 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qt89w_f354424d-7f22-42d6-8bd9-00e32e78c3d3/ovn-acl-logging/0.log" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.056590 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qt89w_f354424d-7f22-42d6-8bd9-00e32e78c3d3/ovn-controller/0.log" Jan 20 18:41:59 crc kubenswrapper[4773]: 
I0120 18:41:59.057412 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.071770 4773 scope.go:117] "RemoveContainer" containerID="dc586816975c68b2e0607a33f40a1ef6b74f4a1267fb305584da3158ea91bdc7" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.166007 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-8v62z"] Jan 20 18:41:59 crc kubenswrapper[4773]: E0120 18:41:59.166300 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" containerName="ovn-acl-logging" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.166318 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" containerName="ovn-acl-logging" Jan 20 18:41:59 crc kubenswrapper[4773]: E0120 18:41:59.166333 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" containerName="ovnkube-controller" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.166341 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" containerName="ovnkube-controller" Jan 20 18:41:59 crc kubenswrapper[4773]: E0120 18:41:59.166349 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" containerName="ovnkube-controller" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.166360 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" containerName="ovnkube-controller" Jan 20 18:41:59 crc kubenswrapper[4773]: E0120 18:41:59.166373 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" containerName="ovnkube-controller" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.166380 4773 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" containerName="ovnkube-controller" Jan 20 18:41:59 crc kubenswrapper[4773]: E0120 18:41:59.166388 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" containerName="kube-rbac-proxy-ovn-metrics" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.166397 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" containerName="kube-rbac-proxy-ovn-metrics" Jan 20 18:41:59 crc kubenswrapper[4773]: E0120 18:41:59.166412 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" containerName="kube-rbac-proxy-node" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.166419 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" containerName="kube-rbac-proxy-node" Jan 20 18:41:59 crc kubenswrapper[4773]: E0120 18:41:59.166429 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" containerName="nbdb" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.166436 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" containerName="nbdb" Jan 20 18:41:59 crc kubenswrapper[4773]: E0120 18:41:59.166445 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" containerName="ovnkube-controller" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.166452 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" containerName="ovnkube-controller" Jan 20 18:41:59 crc kubenswrapper[4773]: E0120 18:41:59.166461 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" containerName="northd" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.166468 4773 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" containerName="northd" Jan 20 18:41:59 crc kubenswrapper[4773]: E0120 18:41:59.166479 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" containerName="sbdb" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.166486 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" containerName="sbdb" Jan 20 18:41:59 crc kubenswrapper[4773]: E0120 18:41:59.166496 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" containerName="ovnkube-controller" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.166503 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" containerName="ovnkube-controller" Jan 20 18:41:59 crc kubenswrapper[4773]: E0120 18:41:59.166513 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" containerName="ovn-controller" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.166520 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" containerName="ovn-controller" Jan 20 18:41:59 crc kubenswrapper[4773]: E0120 18:41:59.166530 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" containerName="kubecfg-setup" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.166536 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" containerName="kubecfg-setup" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.166640 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" containerName="ovn-controller" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.166654 4773 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" containerName="northd" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.166677 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" containerName="ovnkube-controller" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.166685 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" containerName="nbdb" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.166696 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" containerName="kube-rbac-proxy-ovn-metrics" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.166706 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" containerName="ovnkube-controller" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.166714 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" containerName="ovn-acl-logging" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.166725 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" containerName="kube-rbac-proxy-node" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.166733 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" containerName="sbdb" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.166741 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" containerName="ovnkube-controller" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.166751 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" containerName="ovnkube-controller" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.166988 4773 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" containerName="ovnkube-controller" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.169001 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-8v62z" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.178073 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-host-var-lib-cni-networks-ovn-kubernetes\") pod \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.178108 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-host-cni-bin\") pod \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.178146 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f354424d-7f22-42d6-8bd9-00e32e78c3d3-env-overrides\") pod \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.178170 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-host-kubelet\") pod \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.178190 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-var-lib-openvswitch\") pod \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.178224 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-run-ovn\") pod \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.178274 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f354424d-7f22-42d6-8bd9-00e32e78c3d3-ovnkube-script-lib\") pod \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.178282 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "f354424d-7f22-42d6-8bd9-00e32e78c3d3" (UID: "f354424d-7f22-42d6-8bd9-00e32e78c3d3"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.178311 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-run-systemd\") pod \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.178306 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "f354424d-7f22-42d6-8bd9-00e32e78c3d3" (UID: "f354424d-7f22-42d6-8bd9-00e32e78c3d3"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.178303 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "f354424d-7f22-42d6-8bd9-00e32e78c3d3" (UID: "f354424d-7f22-42d6-8bd9-00e32e78c3d3"). InnerVolumeSpecName "host-cni-bin". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.178371 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-systemd-units\") pod \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.178399 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f354424d-7f22-42d6-8bd9-00e32e78c3d3-ovnkube-config\") pod \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.178390 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "f354424d-7f22-42d6-8bd9-00e32e78c3d3" (UID: "f354424d-7f22-42d6-8bd9-00e32e78c3d3"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.178437 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-node-log\") pod \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.178445 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "f354424d-7f22-42d6-8bd9-00e32e78c3d3" (UID: "f354424d-7f22-42d6-8bd9-00e32e78c3d3"). InnerVolumeSpecName "var-lib-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.178457 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-run-openvswitch\") pod \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.178488 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-host-run-netns\") pod \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.178505 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-host-cni-netd\") pod \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.178525 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f354424d-7f22-42d6-8bd9-00e32e78c3d3-ovn-node-metrics-cert\") pod \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.178549 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-host-slash\") pod \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.178562 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" 
(UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-etc-openvswitch\") pod \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.178579 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-host-run-ovn-kubernetes\") pod \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.178607 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-log-socket\") pod \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.178627 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9flh4\" (UniqueName: \"kubernetes.io/projected/f354424d-7f22-42d6-8bd9-00e32e78c3d3-kube-api-access-9flh4\") pod \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\" (UID: \"f354424d-7f22-42d6-8bd9-00e32e78c3d3\") " Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.178655 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f354424d-7f22-42d6-8bd9-00e32e78c3d3-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "f354424d-7f22-42d6-8bd9-00e32e78c3d3" (UID: "f354424d-7f22-42d6-8bd9-00e32e78c3d3"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.178684 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "f354424d-7f22-42d6-8bd9-00e32e78c3d3" (UID: "f354424d-7f22-42d6-8bd9-00e32e78c3d3"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.178773 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "f354424d-7f22-42d6-8bd9-00e32e78c3d3" (UID: "f354424d-7f22-42d6-8bd9-00e32e78c3d3"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.178783 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "f354424d-7f22-42d6-8bd9-00e32e78c3d3" (UID: "f354424d-7f22-42d6-8bd9-00e32e78c3d3"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.178792 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f354424d-7f22-42d6-8bd9-00e32e78c3d3-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "f354424d-7f22-42d6-8bd9-00e32e78c3d3" (UID: "f354424d-7f22-42d6-8bd9-00e32e78c3d3"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.178817 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-host-slash" (OuterVolumeSpecName: "host-slash") pod "f354424d-7f22-42d6-8bd9-00e32e78c3d3" (UID: "f354424d-7f22-42d6-8bd9-00e32e78c3d3"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.178840 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "f354424d-7f22-42d6-8bd9-00e32e78c3d3" (UID: "f354424d-7f22-42d6-8bd9-00e32e78c3d3"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.178845 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-log-socket" (OuterVolumeSpecName: "log-socket") pod "f354424d-7f22-42d6-8bd9-00e32e78c3d3" (UID: "f354424d-7f22-42d6-8bd9-00e32e78c3d3"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.178858 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "f354424d-7f22-42d6-8bd9-00e32e78c3d3" (UID: "f354424d-7f22-42d6-8bd9-00e32e78c3d3"). InnerVolumeSpecName "host-cni-netd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.178853 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "f354424d-7f22-42d6-8bd9-00e32e78c3d3" (UID: "f354424d-7f22-42d6-8bd9-00e32e78c3d3"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.178869 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-node-log" (OuterVolumeSpecName: "node-log") pod "f354424d-7f22-42d6-8bd9-00e32e78c3d3" (UID: "f354424d-7f22-42d6-8bd9-00e32e78c3d3"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.179845 4773 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/f354424d-7f22-42d6-8bd9-00e32e78c3d3-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.179867 4773 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-systemd-units\") on node \"crc\" DevicePath \"\"" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.179879 4773 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-node-log\") on node \"crc\" DevicePath \"\"" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.179891 4773 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-run-openvswitch\") on node \"crc\" 
DevicePath \"\"" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.179902 4773 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-host-cni-netd\") on node \"crc\" DevicePath \"\"" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.179913 4773 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-host-run-netns\") on node \"crc\" DevicePath \"\"" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.179924 4773 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-host-slash\") on node \"crc\" DevicePath \"\"" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.179951 4773 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.179966 4773 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.179977 4773 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-log-socket\") on node \"crc\" DevicePath \"\"" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.179990 4773 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 
18:41:59.180004 4773 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-host-cni-bin\") on node \"crc\" DevicePath \"\"" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.180016 4773 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/f354424d-7f22-42d6-8bd9-00e32e78c3d3-env-overrides\") on node \"crc\" DevicePath \"\"" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.180027 4773 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-host-kubelet\") on node \"crc\" DevicePath \"\"" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.180037 4773 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.180049 4773 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-run-ovn\") on node \"crc\" DevicePath \"\"" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.180255 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f354424d-7f22-42d6-8bd9-00e32e78c3d3-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "f354424d-7f22-42d6-8bd9-00e32e78c3d3" (UID: "f354424d-7f22-42d6-8bd9-00e32e78c3d3"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.186092 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f354424d-7f22-42d6-8bd9-00e32e78c3d3-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "f354424d-7f22-42d6-8bd9-00e32e78c3d3" (UID: "f354424d-7f22-42d6-8bd9-00e32e78c3d3"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.194142 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f354424d-7f22-42d6-8bd9-00e32e78c3d3-kube-api-access-9flh4" (OuterVolumeSpecName: "kube-api-access-9flh4") pod "f354424d-7f22-42d6-8bd9-00e32e78c3d3" (UID: "f354424d-7f22-42d6-8bd9-00e32e78c3d3"). InnerVolumeSpecName "kube-api-access-9flh4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.196492 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "f354424d-7f22-42d6-8bd9-00e32e78c3d3" (UID: "f354424d-7f22-42d6-8bd9-00e32e78c3d3"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.281250 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b635b367-c3be-4a4c-910d-e6806f1fa8c2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.281299 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b635b367-c3be-4a4c-910d-e6806f1fa8c2-host-cni-netd\") pod \"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.281618 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b635b367-c3be-4a4c-910d-e6806f1fa8c2-ovn-node-metrics-cert\") pod \"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.281691 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lq5pj\" (UniqueName: \"kubernetes.io/projected/b635b367-c3be-4a4c-910d-e6806f1fa8c2-kube-api-access-lq5pj\") pod \"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.281723 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/b635b367-c3be-4a4c-910d-e6806f1fa8c2-log-socket\") pod \"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.281749 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b635b367-c3be-4a4c-910d-e6806f1fa8c2-host-cni-bin\") pod \"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.281774 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b635b367-c3be-4a4c-910d-e6806f1fa8c2-systemd-units\") pod \"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.281800 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b635b367-c3be-4a4c-910d-e6806f1fa8c2-host-run-netns\") pod \"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.281819 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b635b367-c3be-4a4c-910d-e6806f1fa8c2-ovnkube-script-lib\") pod \"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.281847 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" 
(UniqueName: \"kubernetes.io/host-path/b635b367-c3be-4a4c-910d-e6806f1fa8c2-host-kubelet\") pod \"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.281899 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b635b367-c3be-4a4c-910d-e6806f1fa8c2-node-log\") pod \"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.282018 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b635b367-c3be-4a4c-910d-e6806f1fa8c2-host-slash\") pod \"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.282032 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b635b367-c3be-4a4c-910d-e6806f1fa8c2-run-ovn\") pod \"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.282071 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b635b367-c3be-4a4c-910d-e6806f1fa8c2-var-lib-openvswitch\") pod \"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.282148 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/b635b367-c3be-4a4c-910d-e6806f1fa8c2-run-systemd\") pod \"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.282176 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b635b367-c3be-4a4c-910d-e6806f1fa8c2-run-openvswitch\") pod \"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.282203 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b635b367-c3be-4a4c-910d-e6806f1fa8c2-ovnkube-config\") pod \"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.282253 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b635b367-c3be-4a4c-910d-e6806f1fa8c2-etc-openvswitch\") pod \"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.282314 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b635b367-c3be-4a4c-910d-e6806f1fa8c2-host-run-ovn-kubernetes\") pod \"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.282334 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b635b367-c3be-4a4c-910d-e6806f1fa8c2-env-overrides\") pod \"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.282416 4773 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/f354424d-7f22-42d6-8bd9-00e32e78c3d3-run-systemd\") on node \"crc\" DevicePath \"\"" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.282434 4773 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/f354424d-7f22-42d6-8bd9-00e32e78c3d3-ovnkube-config\") on node \"crc\" DevicePath \"\"" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.282447 4773 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/f354424d-7f22-42d6-8bd9-00e32e78c3d3-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.282457 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9flh4\" (UniqueName: \"kubernetes.io/projected/f354424d-7f22-42d6-8bd9-00e32e78c3d3-kube-api-access-9flh4\") on node \"crc\" DevicePath \"\"" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.383503 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b635b367-c3be-4a4c-910d-e6806f1fa8c2-host-run-ovn-kubernetes\") pod \"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.383541 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b635b367-c3be-4a4c-910d-e6806f1fa8c2-env-overrides\") pod 
\"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.383571 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b635b367-c3be-4a4c-910d-e6806f1fa8c2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.383587 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b635b367-c3be-4a4c-910d-e6806f1fa8c2-host-cni-netd\") pod \"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.383613 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b635b367-c3be-4a4c-910d-e6806f1fa8c2-ovn-node-metrics-cert\") pod \"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.383637 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lq5pj\" (UniqueName: \"kubernetes.io/projected/b635b367-c3be-4a4c-910d-e6806f1fa8c2-kube-api-access-lq5pj\") pod \"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.383655 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b635b367-c3be-4a4c-910d-e6806f1fa8c2-log-socket\") pod 
\"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.383677 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b635b367-c3be-4a4c-910d-e6806f1fa8c2-host-cni-bin\") pod \"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.383694 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b635b367-c3be-4a4c-910d-e6806f1fa8c2-systemd-units\") pod \"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.383729 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b635b367-c3be-4a4c-910d-e6806f1fa8c2-host-run-netns\") pod \"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.383744 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b635b367-c3be-4a4c-910d-e6806f1fa8c2-ovnkube-script-lib\") pod \"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.383763 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b635b367-c3be-4a4c-910d-e6806f1fa8c2-host-kubelet\") pod \"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-8v62z" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.383754 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b635b367-c3be-4a4c-910d-e6806f1fa8c2-host-run-ovn-kubernetes\") pod \"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.383777 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b635b367-c3be-4a4c-910d-e6806f1fa8c2-node-log\") pod \"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.383839 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b635b367-c3be-4a4c-910d-e6806f1fa8c2-host-run-netns\") pod \"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.383868 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b635b367-c3be-4a4c-910d-e6806f1fa8c2-systemd-units\") pod \"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.383909 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b635b367-c3be-4a4c-910d-e6806f1fa8c2-host-slash\") pod \"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 
18:41:59.383954 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b635b367-c3be-4a4c-910d-e6806f1fa8c2-run-ovn\") pod \"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.383983 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b635b367-c3be-4a4c-910d-e6806f1fa8c2-log-socket\") pod \"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.383821 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b635b367-c3be-4a4c-910d-e6806f1fa8c2-node-log\") pod \"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.384020 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b635b367-c3be-4a4c-910d-e6806f1fa8c2-var-lib-openvswitch\") pod \"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.384038 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b635b367-c3be-4a4c-910d-e6806f1fa8c2-host-slash\") pod \"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.384059 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/b635b367-c3be-4a4c-910d-e6806f1fa8c2-host-cni-bin\") pod \"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.384065 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b635b367-c3be-4a4c-910d-e6806f1fa8c2-run-ovn\") pod \"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.383990 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b635b367-c3be-4a4c-910d-e6806f1fa8c2-var-lib-openvswitch\") pod \"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.384095 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b635b367-c3be-4a4c-910d-e6806f1fa8c2-host-cni-netd\") pod \"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.384118 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b635b367-c3be-4a4c-910d-e6806f1fa8c2-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.384121 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/b635b367-c3be-4a4c-910d-e6806f1fa8c2-run-systemd\") pod \"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.384142 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b635b367-c3be-4a4c-910d-e6806f1fa8c2-run-systemd\") pod \"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.384154 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b635b367-c3be-4a4c-910d-e6806f1fa8c2-run-openvswitch\") pod \"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.384178 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b635b367-c3be-4a4c-910d-e6806f1fa8c2-host-kubelet\") pod \"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.384188 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b635b367-c3be-4a4c-910d-e6806f1fa8c2-ovnkube-config\") pod \"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.384210 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b635b367-c3be-4a4c-910d-e6806f1fa8c2-etc-openvswitch\") pod 
\"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.384213 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b635b367-c3be-4a4c-910d-e6806f1fa8c2-run-openvswitch\") pod \"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.384346 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b635b367-c3be-4a4c-910d-e6806f1fa8c2-etc-openvswitch\") pod \"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.384378 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b635b367-c3be-4a4c-910d-e6806f1fa8c2-env-overrides\") pod \"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.384612 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b635b367-c3be-4a4c-910d-e6806f1fa8c2-ovnkube-script-lib\") pod \"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.384757 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b635b367-c3be-4a4c-910d-e6806f1fa8c2-ovnkube-config\") pod \"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-8v62z" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.396619 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b635b367-c3be-4a4c-910d-e6806f1fa8c2-ovn-node-metrics-cert\") pod \"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.398711 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lq5pj\" (UniqueName: \"kubernetes.io/projected/b635b367-c3be-4a4c-910d-e6806f1fa8c2-kube-api-access-lq5pj\") pod \"ovnkube-node-8v62z\" (UID: \"b635b367-c3be-4a4c-910d-e6806f1fa8c2\") " pod="openshift-ovn-kubernetes/ovnkube-node-8v62z" Jan 20 18:41:59 crc kubenswrapper[4773]: I0120 18:41:59.484201 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-8v62z" Jan 20 18:42:00 crc kubenswrapper[4773]: I0120 18:42:00.056482 4773 generic.go:334] "Generic (PLEG): container finished" podID="b635b367-c3be-4a4c-910d-e6806f1fa8c2" containerID="5202eb208f6bcd0c12462bd64c86657a3a2e05f456081fc12024e21c61c6fafa" exitCode=0 Jan 20 18:42:00 crc kubenswrapper[4773]: I0120 18:42:00.056578 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8v62z" event={"ID":"b635b367-c3be-4a4c-910d-e6806f1fa8c2","Type":"ContainerDied","Data":"5202eb208f6bcd0c12462bd64c86657a3a2e05f456081fc12024e21c61c6fafa"} Jan 20 18:42:00 crc kubenswrapper[4773]: I0120 18:42:00.056809 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8v62z" event={"ID":"b635b367-c3be-4a4c-910d-e6806f1fa8c2","Type":"ContainerStarted","Data":"f841f89d19e69ccc55df86f1f23cba4354c74821862c09e5a2a2e721d71f0226"} Jan 20 18:42:00 crc kubenswrapper[4773]: I0120 18:42:00.058418 4773 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-multus_multus-bccxn_061a607e-1868-4fcf-b3ea-d51157511d41/kube-multus/2.log" Jan 20 18:42:00 crc kubenswrapper[4773]: I0120 18:42:00.067982 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qt89w_f354424d-7f22-42d6-8bd9-00e32e78c3d3/ovn-acl-logging/0.log" Jan 20 18:42:00 crc kubenswrapper[4773]: I0120 18:42:00.068547 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-qt89w_f354424d-7f22-42d6-8bd9-00e32e78c3d3/ovn-controller/0.log" Jan 20 18:42:00 crc kubenswrapper[4773]: I0120 18:42:00.069280 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" event={"ID":"f354424d-7f22-42d6-8bd9-00e32e78c3d3","Type":"ContainerDied","Data":"410001a1d3881fa68033cb522fb1036ff5be18d13872f61c3fe53b410c458aa8"} Jan 20 18:42:00 crc kubenswrapper[4773]: I0120 18:42:00.069320 4773 scope.go:117] "RemoveContainer" containerID="5d2aab5769291bf8517a6be58643ec33bb4a9c92e32ef5c6a6be6258a94a21e0" Jan 20 18:42:00 crc kubenswrapper[4773]: I0120 18:42:00.069467 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-qt89w" Jan 20 18:42:00 crc kubenswrapper[4773]: I0120 18:42:00.088668 4773 scope.go:117] "RemoveContainer" containerID="68d98d68d006df3d69e253299966bef755778e0cdab71a9ff91d2829ae761672" Jan 20 18:42:00 crc kubenswrapper[4773]: I0120 18:42:00.105790 4773 scope.go:117] "RemoveContainer" containerID="aeac2ae7ead2dd1144c00709ea7162dfe1e360dc304726b031970bc989f11467" Jan 20 18:42:00 crc kubenswrapper[4773]: I0120 18:42:00.122454 4773 scope.go:117] "RemoveContainer" containerID="8c6fa189877752f110d0f9539d08a9693872f9ee0350c63384acae1dea6afbc1" Jan 20 18:42:00 crc kubenswrapper[4773]: I0120 18:42:00.143124 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-qt89w"] Jan 20 18:42:00 crc kubenswrapper[4773]: I0120 18:42:00.146706 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-qt89w"] Jan 20 18:42:00 crc kubenswrapper[4773]: I0120 18:42:00.147095 4773 scope.go:117] "RemoveContainer" containerID="6d36ecbc289d1bc2ebcba42b6cfe938692dd9b7e2b4e1adac18ac86a273d6bf5" Jan 20 18:42:00 crc kubenswrapper[4773]: I0120 18:42:00.164263 4773 scope.go:117] "RemoveContainer" containerID="a000cfa332372a402d4d81eafc6f06b988196a3a847236a94d8166f0d745b07e" Jan 20 18:42:00 crc kubenswrapper[4773]: I0120 18:42:00.178161 4773 scope.go:117] "RemoveContainer" containerID="7767620b251e4c63c9397573c7915bd85831b833817ac6b939fc47d6304dc2bc" Jan 20 18:42:00 crc kubenswrapper[4773]: I0120 18:42:00.195212 4773 scope.go:117] "RemoveContainer" containerID="dc96ff67e107830b7e2fe8c3c7460ca9c7b22dfadab07ea69968162e9a1b4455" Jan 20 18:42:00 crc kubenswrapper[4773]: I0120 18:42:00.209292 4773 scope.go:117] "RemoveContainer" containerID="a67308b5676ba4c68391eb7239a1cbfd33b666ec32cea7a4c8fcc285992644f3" Jan 20 18:42:01 crc kubenswrapper[4773]: I0120 18:42:01.078758 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-8v62z" event={"ID":"b635b367-c3be-4a4c-910d-e6806f1fa8c2","Type":"ContainerStarted","Data":"a1d309f906242bc9d36f344d7cd7ef9ee8576929e5b70b0864734554998c1a8c"} Jan 20 18:42:01 crc kubenswrapper[4773]: I0120 18:42:01.079105 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8v62z" event={"ID":"b635b367-c3be-4a4c-910d-e6806f1fa8c2","Type":"ContainerStarted","Data":"ff2809d43fc44c116c17c14d3d0e4adf9714d79f66c190c5dafc76abf6ba41fb"} Jan 20 18:42:01 crc kubenswrapper[4773]: I0120 18:42:01.079125 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8v62z" event={"ID":"b635b367-c3be-4a4c-910d-e6806f1fa8c2","Type":"ContainerStarted","Data":"560aa7391debccc63ef5c53bd12cc1d3a266fb61c7cc61c6d7cea18bc76cfcbd"} Jan 20 18:42:01 crc kubenswrapper[4773]: I0120 18:42:01.079135 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8v62z" event={"ID":"b635b367-c3be-4a4c-910d-e6806f1fa8c2","Type":"ContainerStarted","Data":"9716b731e54a2d3a0a72a9e9980c73e66c442d562912aad434720cbfdcb3a3a9"} Jan 20 18:42:01 crc kubenswrapper[4773]: I0120 18:42:01.079145 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8v62z" event={"ID":"b635b367-c3be-4a4c-910d-e6806f1fa8c2","Type":"ContainerStarted","Data":"98c7c76a029003f426c8c93eb026a64780491014bbfe37fa8f6744b54b5bd91b"} Jan 20 18:42:01 crc kubenswrapper[4773]: I0120 18:42:01.079157 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8v62z" event={"ID":"b635b367-c3be-4a4c-910d-e6806f1fa8c2","Type":"ContainerStarted","Data":"0b606de5b7543afaf650b56755321438f47d8b3b4562b6cf5debb0159790000e"} Jan 20 18:42:01 crc kubenswrapper[4773]: I0120 18:42:01.458971 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f354424d-7f22-42d6-8bd9-00e32e78c3d3" 
path="/var/lib/kubelet/pods/f354424d-7f22-42d6-8bd9-00e32e78c3d3/volumes" Jan 20 18:42:04 crc kubenswrapper[4773]: I0120 18:42:04.117920 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8v62z" event={"ID":"b635b367-c3be-4a4c-910d-e6806f1fa8c2","Type":"ContainerStarted","Data":"7484afb2b06bc09ea22344323c2eeeb1b389047c6d94d3699e9241b7d5e1e71d"} Jan 20 18:42:06 crc kubenswrapper[4773]: I0120 18:42:06.130847 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8v62z" event={"ID":"b635b367-c3be-4a4c-910d-e6806f1fa8c2","Type":"ContainerStarted","Data":"a3a35f9675871c14221068e27003457dec3edf918e6a77e10b6e2b7174ad7c2a"} Jan 20 18:42:06 crc kubenswrapper[4773]: I0120 18:42:06.131444 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-8v62z" Jan 20 18:42:06 crc kubenswrapper[4773]: I0120 18:42:06.131463 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-8v62z" Jan 20 18:42:06 crc kubenswrapper[4773]: I0120 18:42:06.157739 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-8v62z" podStartSLOduration=7.15772332 podStartE2EDuration="7.15772332s" podCreationTimestamp="2026-01-20 18:41:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:42:06.155077267 +0000 UTC m=+719.076890321" watchObservedRunningTime="2026-01-20 18:42:06.15772332 +0000 UTC m=+719.079536344" Jan 20 18:42:06 crc kubenswrapper[4773]: I0120 18:42:06.162605 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8v62z" Jan 20 18:42:07 crc kubenswrapper[4773]: I0120 18:42:07.136616 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-ovn-kubernetes/ovnkube-node-8v62z" Jan 20 18:42:07 crc kubenswrapper[4773]: I0120 18:42:07.160531 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8v62z" Jan 20 18:42:14 crc kubenswrapper[4773]: I0120 18:42:14.446732 4773 scope.go:117] "RemoveContainer" containerID="12757148bdfa862c997ae9700dd354a77024b7a40e5d5398f8af800d1a220e65" Jan 20 18:42:14 crc kubenswrapper[4773]: E0120 18:42:14.447785 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-bccxn_openshift-multus(061a607e-1868-4fcf-b3ea-d51157511d41)\"" pod="openshift-multus/multus-bccxn" podUID="061a607e-1868-4fcf-b3ea-d51157511d41" Jan 20 18:42:28 crc kubenswrapper[4773]: I0120 18:42:28.170088 4773 patch_prober.go:28] interesting pod/machine-config-daemon-sq4x7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 18:42:28 crc kubenswrapper[4773]: I0120 18:42:28.170666 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 18:42:29 crc kubenswrapper[4773]: I0120 18:42:29.447642 4773 scope.go:117] "RemoveContainer" containerID="12757148bdfa862c997ae9700dd354a77024b7a40e5d5398f8af800d1a220e65" Jan 20 18:42:29 crc kubenswrapper[4773]: I0120 18:42:29.519592 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8v62z" Jan 20 18:42:31 crc kubenswrapper[4773]: I0120 18:42:31.515088 
4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bccxn_061a607e-1868-4fcf-b3ea-d51157511d41/kube-multus/2.log" Jan 20 18:42:31 crc kubenswrapper[4773]: I0120 18:42:31.515531 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bccxn" event={"ID":"061a607e-1868-4fcf-b3ea-d51157511d41","Type":"ContainerStarted","Data":"c77ae071f927cd88aa64db8f3abf5d005c367450d23cbc29cab36cef26ece07f"} Jan 20 18:42:34 crc kubenswrapper[4773]: I0120 18:42:34.944190 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713cdtcd"] Jan 20 18:42:34 crc kubenswrapper[4773]: I0120 18:42:34.951019 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713cdtcd" Jan 20 18:42:34 crc kubenswrapper[4773]: I0120 18:42:34.956134 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Jan 20 18:42:34 crc kubenswrapper[4773]: I0120 18:42:34.957371 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7c79fd0a-1d41-44db-8ee4-d5781d77e848-bundle\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713cdtcd\" (UID: \"7c79fd0a-1d41-44db-8ee4-d5781d77e848\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713cdtcd" Jan 20 18:42:34 crc kubenswrapper[4773]: I0120 18:42:34.957467 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvzr7\" (UniqueName: \"kubernetes.io/projected/7c79fd0a-1d41-44db-8ee4-d5781d77e848-kube-api-access-vvzr7\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713cdtcd\" (UID: \"7c79fd0a-1d41-44db-8ee4-d5781d77e848\") " 
pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713cdtcd" Jan 20 18:42:34 crc kubenswrapper[4773]: I0120 18:42:34.957495 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7c79fd0a-1d41-44db-8ee4-d5781d77e848-util\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713cdtcd\" (UID: \"7c79fd0a-1d41-44db-8ee4-d5781d77e848\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713cdtcd" Jan 20 18:42:34 crc kubenswrapper[4773]: I0120 18:42:34.975065 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713cdtcd"] Jan 20 18:42:35 crc kubenswrapper[4773]: I0120 18:42:35.058149 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7c79fd0a-1d41-44db-8ee4-d5781d77e848-bundle\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713cdtcd\" (UID: \"7c79fd0a-1d41-44db-8ee4-d5781d77e848\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713cdtcd" Jan 20 18:42:35 crc kubenswrapper[4773]: I0120 18:42:35.058213 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvzr7\" (UniqueName: \"kubernetes.io/projected/7c79fd0a-1d41-44db-8ee4-d5781d77e848-kube-api-access-vvzr7\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713cdtcd\" (UID: \"7c79fd0a-1d41-44db-8ee4-d5781d77e848\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713cdtcd" Jan 20 18:42:35 crc kubenswrapper[4773]: I0120 18:42:35.058234 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7c79fd0a-1d41-44db-8ee4-d5781d77e848-util\") pod 
\"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713cdtcd\" (UID: \"7c79fd0a-1d41-44db-8ee4-d5781d77e848\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713cdtcd" Jan 20 18:42:35 crc kubenswrapper[4773]: I0120 18:42:35.058780 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7c79fd0a-1d41-44db-8ee4-d5781d77e848-util\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713cdtcd\" (UID: \"7c79fd0a-1d41-44db-8ee4-d5781d77e848\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713cdtcd" Jan 20 18:42:35 crc kubenswrapper[4773]: I0120 18:42:35.058781 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7c79fd0a-1d41-44db-8ee4-d5781d77e848-bundle\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713cdtcd\" (UID: \"7c79fd0a-1d41-44db-8ee4-d5781d77e848\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713cdtcd" Jan 20 18:42:35 crc kubenswrapper[4773]: I0120 18:42:35.077951 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvzr7\" (UniqueName: \"kubernetes.io/projected/7c79fd0a-1d41-44db-8ee4-d5781d77e848-kube-api-access-vvzr7\") pod \"53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713cdtcd\" (UID: \"7c79fd0a-1d41-44db-8ee4-d5781d77e848\") " pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713cdtcd" Jan 20 18:42:35 crc kubenswrapper[4773]: I0120 18:42:35.297598 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713cdtcd" Jan 20 18:42:35 crc kubenswrapper[4773]: I0120 18:42:35.708861 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713cdtcd"] Jan 20 18:42:36 crc kubenswrapper[4773]: I0120 18:42:36.549219 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713cdtcd" event={"ID":"7c79fd0a-1d41-44db-8ee4-d5781d77e848","Type":"ContainerStarted","Data":"02e86264329143bc774f750712bb4cf75f1a1d97aac372a64a9b23990a2955da"} Jan 20 18:42:36 crc kubenswrapper[4773]: I0120 18:42:36.549484 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713cdtcd" event={"ID":"7c79fd0a-1d41-44db-8ee4-d5781d77e848","Type":"ContainerStarted","Data":"d46197f82e8a656149ec058f82753003d495e34c37ab184b666a9f1929fd41f9"} Jan 20 18:42:38 crc kubenswrapper[4773]: I0120 18:42:38.560342 4773 generic.go:334] "Generic (PLEG): container finished" podID="7c79fd0a-1d41-44db-8ee4-d5781d77e848" containerID="02e86264329143bc774f750712bb4cf75f1a1d97aac372a64a9b23990a2955da" exitCode=0 Jan 20 18:42:38 crc kubenswrapper[4773]: I0120 18:42:38.560536 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713cdtcd" event={"ID":"7c79fd0a-1d41-44db-8ee4-d5781d77e848","Type":"ContainerDied","Data":"02e86264329143bc774f750712bb4cf75f1a1d97aac372a64a9b23990a2955da"} Jan 20 18:42:40 crc kubenswrapper[4773]: I0120 18:42:40.574525 4773 generic.go:334] "Generic (PLEG): container finished" podID="7c79fd0a-1d41-44db-8ee4-d5781d77e848" containerID="de9628e801a8e3e5a8d60037c53c68268d9f50f78b2c45d59569034e60ec6a03" exitCode=0 Jan 20 18:42:40 crc kubenswrapper[4773]: I0120 18:42:40.574602 4773 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713cdtcd" event={"ID":"7c79fd0a-1d41-44db-8ee4-d5781d77e848","Type":"ContainerDied","Data":"de9628e801a8e3e5a8d60037c53c68268d9f50f78b2c45d59569034e60ec6a03"} Jan 20 18:42:41 crc kubenswrapper[4773]: I0120 18:42:41.582803 4773 generic.go:334] "Generic (PLEG): container finished" podID="7c79fd0a-1d41-44db-8ee4-d5781d77e848" containerID="915ffdd09d2d3d224217f7fd2081300d46578fa9c61fc8af3b6eb9d81b2e1d7c" exitCode=0 Jan 20 18:42:41 crc kubenswrapper[4773]: I0120 18:42:41.582857 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713cdtcd" event={"ID":"7c79fd0a-1d41-44db-8ee4-d5781d77e848","Type":"ContainerDied","Data":"915ffdd09d2d3d224217f7fd2081300d46578fa9c61fc8af3b6eb9d81b2e1d7c"} Jan 20 18:42:42 crc kubenswrapper[4773]: I0120 18:42:42.002172 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-n4nm5"] Jan 20 18:42:42 crc kubenswrapper[4773]: I0120 18:42:42.003846 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-n4nm5" Jan 20 18:42:42 crc kubenswrapper[4773]: I0120 18:42:42.024834 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n4nm5"] Jan 20 18:42:42 crc kubenswrapper[4773]: I0120 18:42:42.082562 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6959fd3-1296-4909-b0c1-0803e1e7b098-catalog-content\") pod \"redhat-operators-n4nm5\" (UID: \"e6959fd3-1296-4909-b0c1-0803e1e7b098\") " pod="openshift-marketplace/redhat-operators-n4nm5" Jan 20 18:42:42 crc kubenswrapper[4773]: I0120 18:42:42.082614 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsnqc\" (UniqueName: \"kubernetes.io/projected/e6959fd3-1296-4909-b0c1-0803e1e7b098-kube-api-access-bsnqc\") pod \"redhat-operators-n4nm5\" (UID: \"e6959fd3-1296-4909-b0c1-0803e1e7b098\") " pod="openshift-marketplace/redhat-operators-n4nm5" Jan 20 18:42:42 crc kubenswrapper[4773]: I0120 18:42:42.082651 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6959fd3-1296-4909-b0c1-0803e1e7b098-utilities\") pod \"redhat-operators-n4nm5\" (UID: \"e6959fd3-1296-4909-b0c1-0803e1e7b098\") " pod="openshift-marketplace/redhat-operators-n4nm5" Jan 20 18:42:42 crc kubenswrapper[4773]: I0120 18:42:42.183829 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6959fd3-1296-4909-b0c1-0803e1e7b098-utilities\") pod \"redhat-operators-n4nm5\" (UID: \"e6959fd3-1296-4909-b0c1-0803e1e7b098\") " pod="openshift-marketplace/redhat-operators-n4nm5" Jan 20 18:42:42 crc kubenswrapper[4773]: I0120 18:42:42.183954 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6959fd3-1296-4909-b0c1-0803e1e7b098-catalog-content\") pod \"redhat-operators-n4nm5\" (UID: \"e6959fd3-1296-4909-b0c1-0803e1e7b098\") " pod="openshift-marketplace/redhat-operators-n4nm5" Jan 20 18:42:42 crc kubenswrapper[4773]: I0120 18:42:42.183992 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bsnqc\" (UniqueName: \"kubernetes.io/projected/e6959fd3-1296-4909-b0c1-0803e1e7b098-kube-api-access-bsnqc\") pod \"redhat-operators-n4nm5\" (UID: \"e6959fd3-1296-4909-b0c1-0803e1e7b098\") " pod="openshift-marketplace/redhat-operators-n4nm5" Jan 20 18:42:42 crc kubenswrapper[4773]: I0120 18:42:42.184366 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6959fd3-1296-4909-b0c1-0803e1e7b098-utilities\") pod \"redhat-operators-n4nm5\" (UID: \"e6959fd3-1296-4909-b0c1-0803e1e7b098\") " pod="openshift-marketplace/redhat-operators-n4nm5" Jan 20 18:42:42 crc kubenswrapper[4773]: I0120 18:42:42.184808 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6959fd3-1296-4909-b0c1-0803e1e7b098-catalog-content\") pod \"redhat-operators-n4nm5\" (UID: \"e6959fd3-1296-4909-b0c1-0803e1e7b098\") " pod="openshift-marketplace/redhat-operators-n4nm5" Jan 20 18:42:42 crc kubenswrapper[4773]: I0120 18:42:42.217236 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsnqc\" (UniqueName: \"kubernetes.io/projected/e6959fd3-1296-4909-b0c1-0803e1e7b098-kube-api-access-bsnqc\") pod \"redhat-operators-n4nm5\" (UID: \"e6959fd3-1296-4909-b0c1-0803e1e7b098\") " pod="openshift-marketplace/redhat-operators-n4nm5" Jan 20 18:42:42 crc kubenswrapper[4773]: I0120 18:42:42.319476 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-n4nm5" Jan 20 18:42:42 crc kubenswrapper[4773]: I0120 18:42:42.726113 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-n4nm5"] Jan 20 18:42:42 crc kubenswrapper[4773]: W0120 18:42:42.741220 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6959fd3_1296_4909_b0c1_0803e1e7b098.slice/crio-e9481dd1fada9d39b60204838bac0a2b76e59756c820f74b2a680486ffce533a WatchSource:0}: Error finding container e9481dd1fada9d39b60204838bac0a2b76e59756c820f74b2a680486ffce533a: Status 404 returned error can't find the container with id e9481dd1fada9d39b60204838bac0a2b76e59756c820f74b2a680486ffce533a Jan 20 18:42:42 crc kubenswrapper[4773]: I0120 18:42:42.803102 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713cdtcd" Jan 20 18:42:42 crc kubenswrapper[4773]: I0120 18:42:42.996461 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7c79fd0a-1d41-44db-8ee4-d5781d77e848-bundle\") pod \"7c79fd0a-1d41-44db-8ee4-d5781d77e848\" (UID: \"7c79fd0a-1d41-44db-8ee4-d5781d77e848\") " Jan 20 18:42:42 crc kubenswrapper[4773]: I0120 18:42:42.996517 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vvzr7\" (UniqueName: \"kubernetes.io/projected/7c79fd0a-1d41-44db-8ee4-d5781d77e848-kube-api-access-vvzr7\") pod \"7c79fd0a-1d41-44db-8ee4-d5781d77e848\" (UID: \"7c79fd0a-1d41-44db-8ee4-d5781d77e848\") " Jan 20 18:42:42 crc kubenswrapper[4773]: I0120 18:42:42.996545 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7c79fd0a-1d41-44db-8ee4-d5781d77e848-util\") pod 
\"7c79fd0a-1d41-44db-8ee4-d5781d77e848\" (UID: \"7c79fd0a-1d41-44db-8ee4-d5781d77e848\") " Jan 20 18:42:42 crc kubenswrapper[4773]: I0120 18:42:42.997512 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c79fd0a-1d41-44db-8ee4-d5781d77e848-bundle" (OuterVolumeSpecName: "bundle") pod "7c79fd0a-1d41-44db-8ee4-d5781d77e848" (UID: "7c79fd0a-1d41-44db-8ee4-d5781d77e848"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:42:43 crc kubenswrapper[4773]: I0120 18:42:43.000804 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c79fd0a-1d41-44db-8ee4-d5781d77e848-kube-api-access-vvzr7" (OuterVolumeSpecName: "kube-api-access-vvzr7") pod "7c79fd0a-1d41-44db-8ee4-d5781d77e848" (UID: "7c79fd0a-1d41-44db-8ee4-d5781d77e848"). InnerVolumeSpecName "kube-api-access-vvzr7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:42:43 crc kubenswrapper[4773]: I0120 18:42:43.007512 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7c79fd0a-1d41-44db-8ee4-d5781d77e848-util" (OuterVolumeSpecName: "util") pod "7c79fd0a-1d41-44db-8ee4-d5781d77e848" (UID: "7c79fd0a-1d41-44db-8ee4-d5781d77e848"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:42:43 crc kubenswrapper[4773]: I0120 18:42:43.097760 4773 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7c79fd0a-1d41-44db-8ee4-d5781d77e848-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 18:42:43 crc kubenswrapper[4773]: I0120 18:42:43.098040 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vvzr7\" (UniqueName: \"kubernetes.io/projected/7c79fd0a-1d41-44db-8ee4-d5781d77e848-kube-api-access-vvzr7\") on node \"crc\" DevicePath \"\"" Jan 20 18:42:43 crc kubenswrapper[4773]: I0120 18:42:43.098146 4773 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7c79fd0a-1d41-44db-8ee4-d5781d77e848-util\") on node \"crc\" DevicePath \"\"" Jan 20 18:42:43 crc kubenswrapper[4773]: I0120 18:42:43.538568 4773 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Jan 20 18:42:43 crc kubenswrapper[4773]: I0120 18:42:43.594425 4773 generic.go:334] "Generic (PLEG): container finished" podID="e6959fd3-1296-4909-b0c1-0803e1e7b098" containerID="db6475baee2b330e6cee2961ede1d25c44221d7af154400d2652eb5c6d58e558" exitCode=0 Jan 20 18:42:43 crc kubenswrapper[4773]: I0120 18:42:43.594481 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n4nm5" event={"ID":"e6959fd3-1296-4909-b0c1-0803e1e7b098","Type":"ContainerDied","Data":"db6475baee2b330e6cee2961ede1d25c44221d7af154400d2652eb5c6d58e558"} Jan 20 18:42:43 crc kubenswrapper[4773]: I0120 18:42:43.594526 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n4nm5" event={"ID":"e6959fd3-1296-4909-b0c1-0803e1e7b098","Type":"ContainerStarted","Data":"e9481dd1fada9d39b60204838bac0a2b76e59756c820f74b2a680486ffce533a"} Jan 20 18:42:43 crc kubenswrapper[4773]: I0120 18:42:43.598569 4773 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713cdtcd" event={"ID":"7c79fd0a-1d41-44db-8ee4-d5781d77e848","Type":"ContainerDied","Data":"d46197f82e8a656149ec058f82753003d495e34c37ab184b666a9f1929fd41f9"} Jan 20 18:42:43 crc kubenswrapper[4773]: I0120 18:42:43.598615 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d46197f82e8a656149ec058f82753003d495e34c37ab184b666a9f1929fd41f9" Jan 20 18:42:43 crc kubenswrapper[4773]: I0120 18:42:43.598615 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713cdtcd" Jan 20 18:42:45 crc kubenswrapper[4773]: I0120 18:42:45.609430 4773 generic.go:334] "Generic (PLEG): container finished" podID="e6959fd3-1296-4909-b0c1-0803e1e7b098" containerID="5e754401cfd5accf3b020f80444b146055208f2c923e46a1f3deb9641bd7242a" exitCode=0 Jan 20 18:42:45 crc kubenswrapper[4773]: I0120 18:42:45.609758 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n4nm5" event={"ID":"e6959fd3-1296-4909-b0c1-0803e1e7b098","Type":"ContainerDied","Data":"5e754401cfd5accf3b020f80444b146055208f2c923e46a1f3deb9641bd7242a"} Jan 20 18:42:45 crc kubenswrapper[4773]: I0120 18:42:45.655614 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-646758c888-bwjbw"] Jan 20 18:42:45 crc kubenswrapper[4773]: E0120 18:42:45.655863 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c79fd0a-1d41-44db-8ee4-d5781d77e848" containerName="extract" Jan 20 18:42:45 crc kubenswrapper[4773]: I0120 18:42:45.655883 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c79fd0a-1d41-44db-8ee4-d5781d77e848" containerName="extract" Jan 20 18:42:45 crc kubenswrapper[4773]: E0120 18:42:45.655898 4773 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="7c79fd0a-1d41-44db-8ee4-d5781d77e848" containerName="pull" Jan 20 18:42:45 crc kubenswrapper[4773]: I0120 18:42:45.655907 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c79fd0a-1d41-44db-8ee4-d5781d77e848" containerName="pull" Jan 20 18:42:45 crc kubenswrapper[4773]: E0120 18:42:45.655949 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c79fd0a-1d41-44db-8ee4-d5781d77e848" containerName="util" Jan 20 18:42:45 crc kubenswrapper[4773]: I0120 18:42:45.655961 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c79fd0a-1d41-44db-8ee4-d5781d77e848" containerName="util" Jan 20 18:42:45 crc kubenswrapper[4773]: I0120 18:42:45.656093 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c79fd0a-1d41-44db-8ee4-d5781d77e848" containerName="extract" Jan 20 18:42:45 crc kubenswrapper[4773]: I0120 18:42:45.656521 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-646758c888-bwjbw" Jan 20 18:42:45 crc kubenswrapper[4773]: I0120 18:42:45.658395 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Jan 20 18:42:45 crc kubenswrapper[4773]: I0120 18:42:45.658618 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Jan 20 18:42:45 crc kubenswrapper[4773]: I0120 18:42:45.659805 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-6z7v9" Jan 20 18:42:45 crc kubenswrapper[4773]: I0120 18:42:45.670067 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-646758c888-bwjbw"] Jan 20 18:42:45 crc kubenswrapper[4773]: I0120 18:42:45.833402 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8x57\" (UniqueName: 
\"kubernetes.io/projected/a0e928f6-ac84-4903-ab0e-08557dea077f-kube-api-access-c8x57\") pod \"nmstate-operator-646758c888-bwjbw\" (UID: \"a0e928f6-ac84-4903-ab0e-08557dea077f\") " pod="openshift-nmstate/nmstate-operator-646758c888-bwjbw" Jan 20 18:42:45 crc kubenswrapper[4773]: I0120 18:42:45.935197 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c8x57\" (UniqueName: \"kubernetes.io/projected/a0e928f6-ac84-4903-ab0e-08557dea077f-kube-api-access-c8x57\") pod \"nmstate-operator-646758c888-bwjbw\" (UID: \"a0e928f6-ac84-4903-ab0e-08557dea077f\") " pod="openshift-nmstate/nmstate-operator-646758c888-bwjbw" Jan 20 18:42:45 crc kubenswrapper[4773]: I0120 18:42:45.954063 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8x57\" (UniqueName: \"kubernetes.io/projected/a0e928f6-ac84-4903-ab0e-08557dea077f-kube-api-access-c8x57\") pod \"nmstate-operator-646758c888-bwjbw\" (UID: \"a0e928f6-ac84-4903-ab0e-08557dea077f\") " pod="openshift-nmstate/nmstate-operator-646758c888-bwjbw" Jan 20 18:42:45 crc kubenswrapper[4773]: I0120 18:42:45.970785 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-646758c888-bwjbw" Jan 20 18:42:46 crc kubenswrapper[4773]: I0120 18:42:46.377992 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-646758c888-bwjbw"] Jan 20 18:42:46 crc kubenswrapper[4773]: I0120 18:42:46.616283 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n4nm5" event={"ID":"e6959fd3-1296-4909-b0c1-0803e1e7b098","Type":"ContainerStarted","Data":"d1df9d70f10270992bed9e5be25ff4a7e9bb7cccf65bb0859efc558329329c47"} Jan 20 18:42:46 crc kubenswrapper[4773]: I0120 18:42:46.617143 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-646758c888-bwjbw" event={"ID":"a0e928f6-ac84-4903-ab0e-08557dea077f","Type":"ContainerStarted","Data":"1bf72a9437e6bde709d32f7b15f7385fec46d8d0694cfb8db5733c191ce29ac8"} Jan 20 18:42:46 crc kubenswrapper[4773]: I0120 18:42:46.633156 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-n4nm5" podStartSLOduration=3.198467814 podStartE2EDuration="5.633139823s" podCreationTimestamp="2026-01-20 18:42:41 +0000 UTC" firstStartedPulling="2026-01-20 18:42:43.596962348 +0000 UTC m=+756.518775392" lastFinishedPulling="2026-01-20 18:42:46.031634377 +0000 UTC m=+758.953447401" observedRunningTime="2026-01-20 18:42:46.632596281 +0000 UTC m=+759.554409305" watchObservedRunningTime="2026-01-20 18:42:46.633139823 +0000 UTC m=+759.554952847" Jan 20 18:42:48 crc kubenswrapper[4773]: I0120 18:42:48.627813 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-646758c888-bwjbw" event={"ID":"a0e928f6-ac84-4903-ab0e-08557dea077f","Type":"ContainerStarted","Data":"72a178d0f105384856984ab8cbfb4affcf3963a8e0e1bd5f2c9da1e94f3016e8"} Jan 20 18:42:48 crc kubenswrapper[4773]: I0120 18:42:48.644067 4773 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-nmstate/nmstate-operator-646758c888-bwjbw" podStartSLOduration=1.5989066250000001 podStartE2EDuration="3.644053941s" podCreationTimestamp="2026-01-20 18:42:45 +0000 UTC" firstStartedPulling="2026-01-20 18:42:46.382940307 +0000 UTC m=+759.304753331" lastFinishedPulling="2026-01-20 18:42:48.428087623 +0000 UTC m=+761.349900647" observedRunningTime="2026-01-20 18:42:48.641154402 +0000 UTC m=+761.562967426" watchObservedRunningTime="2026-01-20 18:42:48.644053941 +0000 UTC m=+761.565866965" Jan 20 18:42:52 crc kubenswrapper[4773]: I0120 18:42:52.319770 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-n4nm5" Jan 20 18:42:52 crc kubenswrapper[4773]: I0120 18:42:52.320305 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-n4nm5" Jan 20 18:42:52 crc kubenswrapper[4773]: I0120 18:42:52.368443 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-n4nm5" Jan 20 18:42:52 crc kubenswrapper[4773]: I0120 18:42:52.700808 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-n4nm5" Jan 20 18:42:53 crc kubenswrapper[4773]: I0120 18:42:53.192802 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-n4nm5"] Jan 20 18:42:54 crc kubenswrapper[4773]: I0120 18:42:54.662415 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-n4nm5" podUID="e6959fd3-1296-4909-b0c1-0803e1e7b098" containerName="registry-server" containerID="cri-o://d1df9d70f10270992bed9e5be25ff4a7e9bb7cccf65bb0859efc558329329c47" gracePeriod=2 Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.180955 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-54757c584b-kxbnw"] Jan 20 18:42:55 crc 
kubenswrapper[4773]: I0120 18:42:55.183434 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-54757c584b-kxbnw" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.186525 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-d5t26" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.187768 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-8474b5b9d8-8q4jc"] Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.188911 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-8q4jc" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.191063 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-q9h42"] Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.191223 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.191825 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-q9h42" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.202226 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-54757c584b-kxbnw"] Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.208735 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-8474b5b9d8-8q4jc"] Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.298608 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7754f76f8b-z2s7z"] Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.299471 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-z2s7z" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.301052 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.301636 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.301883 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-jwqzz" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.341764 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7754f76f8b-z2s7z"] Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.348796 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgtc2\" (UniqueName: \"kubernetes.io/projected/ef435627-8918-4451-8d3a-23e494e29f56-kube-api-access-tgtc2\") pod \"nmstate-handler-q9h42\" (UID: \"ef435627-8918-4451-8d3a-23e494e29f56\") " pod="openshift-nmstate/nmstate-handler-q9h42" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.348884 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/ef435627-8918-4451-8d3a-23e494e29f56-dbus-socket\") pod \"nmstate-handler-q9h42\" (UID: \"ef435627-8918-4451-8d3a-23e494e29f56\") " pod="openshift-nmstate/nmstate-handler-q9h42" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.348908 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/ef435627-8918-4451-8d3a-23e494e29f56-nmstate-lock\") pod \"nmstate-handler-q9h42\" (UID: \"ef435627-8918-4451-8d3a-23e494e29f56\") " pod="openshift-nmstate/nmstate-handler-q9h42" Jan 
20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.348964 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/9380b21a-b971-4bb9-9572-d795f171b941-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-8q4jc\" (UID: \"9380b21a-b971-4bb9-9572-d795f171b941\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-8q4jc" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.348985 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cj7n4\" (UniqueName: \"kubernetes.io/projected/42822a21-0834-4fc7-aab5-4dcdf46f2786-kube-api-access-cj7n4\") pod \"nmstate-metrics-54757c584b-kxbnw\" (UID: \"42822a21-0834-4fc7-aab5-4dcdf46f2786\") " pod="openshift-nmstate/nmstate-metrics-54757c584b-kxbnw" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.349004 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/ef435627-8918-4451-8d3a-23e494e29f56-ovs-socket\") pod \"nmstate-handler-q9h42\" (UID: \"ef435627-8918-4451-8d3a-23e494e29f56\") " pod="openshift-nmstate/nmstate-handler-q9h42" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.349024 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svsz4\" (UniqueName: \"kubernetes.io/projected/9380b21a-b971-4bb9-9572-d795f171b941-kube-api-access-svsz4\") pod \"nmstate-webhook-8474b5b9d8-8q4jc\" (UID: \"9380b21a-b971-4bb9-9572-d795f171b941\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-8q4jc" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.450172 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/ef435627-8918-4451-8d3a-23e494e29f56-ovs-socket\") pod \"nmstate-handler-q9h42\" (UID: 
\"ef435627-8918-4451-8d3a-23e494e29f56\") " pod="openshift-nmstate/nmstate-handler-q9h42" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.450235 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svsz4\" (UniqueName: \"kubernetes.io/projected/9380b21a-b971-4bb9-9572-d795f171b941-kube-api-access-svsz4\") pod \"nmstate-webhook-8474b5b9d8-8q4jc\" (UID: \"9380b21a-b971-4bb9-9572-d795f171b941\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-8q4jc" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.450263 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgtc2\" (UniqueName: \"kubernetes.io/projected/ef435627-8918-4451-8d3a-23e494e29f56-kube-api-access-tgtc2\") pod \"nmstate-handler-q9h42\" (UID: \"ef435627-8918-4451-8d3a-23e494e29f56\") " pod="openshift-nmstate/nmstate-handler-q9h42" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.450296 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/431c5397-9244-4083-9659-59210fd6d5c0-nginx-conf\") pod \"nmstate-console-plugin-7754f76f8b-z2s7z\" (UID: \"431c5397-9244-4083-9659-59210fd6d5c0\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-z2s7z" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.450319 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/431c5397-9244-4083-9659-59210fd6d5c0-plugin-serving-cert\") pod \"nmstate-console-plugin-7754f76f8b-z2s7z\" (UID: \"431c5397-9244-4083-9659-59210fd6d5c0\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-z2s7z" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.450351 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: 
\"kubernetes.io/host-path/ef435627-8918-4451-8d3a-23e494e29f56-dbus-socket\") pod \"nmstate-handler-q9h42\" (UID: \"ef435627-8918-4451-8d3a-23e494e29f56\") " pod="openshift-nmstate/nmstate-handler-q9h42" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.450374 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/ef435627-8918-4451-8d3a-23e494e29f56-nmstate-lock\") pod \"nmstate-handler-q9h42\" (UID: \"ef435627-8918-4451-8d3a-23e494e29f56\") " pod="openshift-nmstate/nmstate-handler-q9h42" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.450400 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skppb\" (UniqueName: \"kubernetes.io/projected/431c5397-9244-4083-9659-59210fd6d5c0-kube-api-access-skppb\") pod \"nmstate-console-plugin-7754f76f8b-z2s7z\" (UID: \"431c5397-9244-4083-9659-59210fd6d5c0\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-z2s7z" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.450422 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/9380b21a-b971-4bb9-9572-d795f171b941-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-8q4jc\" (UID: \"9380b21a-b971-4bb9-9572-d795f171b941\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-8q4jc" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.450441 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cj7n4\" (UniqueName: \"kubernetes.io/projected/42822a21-0834-4fc7-aab5-4dcdf46f2786-kube-api-access-cj7n4\") pod \"nmstate-metrics-54757c584b-kxbnw\" (UID: \"42822a21-0834-4fc7-aab5-4dcdf46f2786\") " pod="openshift-nmstate/nmstate-metrics-54757c584b-kxbnw" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.450808 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/ef435627-8918-4451-8d3a-23e494e29f56-ovs-socket\") pod \"nmstate-handler-q9h42\" (UID: \"ef435627-8918-4451-8d3a-23e494e29f56\") " pod="openshift-nmstate/nmstate-handler-q9h42" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.451104 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/ef435627-8918-4451-8d3a-23e494e29f56-dbus-socket\") pod \"nmstate-handler-q9h42\" (UID: \"ef435627-8918-4451-8d3a-23e494e29f56\") " pod="openshift-nmstate/nmstate-handler-q9h42" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.451159 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/ef435627-8918-4451-8d3a-23e494e29f56-nmstate-lock\") pod \"nmstate-handler-q9h42\" (UID: \"ef435627-8918-4451-8d3a-23e494e29f56\") " pod="openshift-nmstate/nmstate-handler-q9h42" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.457560 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/9380b21a-b971-4bb9-9572-d795f171b941-tls-key-pair\") pod \"nmstate-webhook-8474b5b9d8-8q4jc\" (UID: \"9380b21a-b971-4bb9-9572-d795f171b941\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-8q4jc" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.473349 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svsz4\" (UniqueName: \"kubernetes.io/projected/9380b21a-b971-4bb9-9572-d795f171b941-kube-api-access-svsz4\") pod \"nmstate-webhook-8474b5b9d8-8q4jc\" (UID: \"9380b21a-b971-4bb9-9572-d795f171b941\") " pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-8q4jc" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.473513 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cj7n4\" (UniqueName: 
\"kubernetes.io/projected/42822a21-0834-4fc7-aab5-4dcdf46f2786-kube-api-access-cj7n4\") pod \"nmstate-metrics-54757c584b-kxbnw\" (UID: \"42822a21-0834-4fc7-aab5-4dcdf46f2786\") " pod="openshift-nmstate/nmstate-metrics-54757c584b-kxbnw" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.480150 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgtc2\" (UniqueName: \"kubernetes.io/projected/ef435627-8918-4451-8d3a-23e494e29f56-kube-api-access-tgtc2\") pod \"nmstate-handler-q9h42\" (UID: \"ef435627-8918-4451-8d3a-23e494e29f56\") " pod="openshift-nmstate/nmstate-handler-q9h42" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.500202 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-5d5d894b64-qtlgh"] Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.500924 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5d5d894b64-qtlgh" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.507769 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-54757c584b-kxbnw" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.514014 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5d5d894b64-qtlgh"] Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.524587 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-8q4jc" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.529369 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-q9h42" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.552039 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z66vg\" (UniqueName: \"kubernetes.io/projected/e2c9dff2-2e8c-4b93-8bd0-b9eacce8ac7d-kube-api-access-z66vg\") pod \"console-5d5d894b64-qtlgh\" (UID: \"e2c9dff2-2e8c-4b93-8bd0-b9eacce8ac7d\") " pod="openshift-console/console-5d5d894b64-qtlgh" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.552086 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/431c5397-9244-4083-9659-59210fd6d5c0-nginx-conf\") pod \"nmstate-console-plugin-7754f76f8b-z2s7z\" (UID: \"431c5397-9244-4083-9659-59210fd6d5c0\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-z2s7z" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.552119 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/431c5397-9244-4083-9659-59210fd6d5c0-plugin-serving-cert\") pod \"nmstate-console-plugin-7754f76f8b-z2s7z\" (UID: \"431c5397-9244-4083-9659-59210fd6d5c0\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-z2s7z" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.552203 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e2c9dff2-2e8c-4b93-8bd0-b9eacce8ac7d-oauth-serving-cert\") pod \"console-5d5d894b64-qtlgh\" (UID: \"e2c9dff2-2e8c-4b93-8bd0-b9eacce8ac7d\") " pod="openshift-console/console-5d5d894b64-qtlgh" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.552284 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/e2c9dff2-2e8c-4b93-8bd0-b9eacce8ac7d-service-ca\") pod \"console-5d5d894b64-qtlgh\" (UID: \"e2c9dff2-2e8c-4b93-8bd0-b9eacce8ac7d\") " pod="openshift-console/console-5d5d894b64-qtlgh" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.552420 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-skppb\" (UniqueName: \"kubernetes.io/projected/431c5397-9244-4083-9659-59210fd6d5c0-kube-api-access-skppb\") pod \"nmstate-console-plugin-7754f76f8b-z2s7z\" (UID: \"431c5397-9244-4083-9659-59210fd6d5c0\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-z2s7z" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.552444 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e2c9dff2-2e8c-4b93-8bd0-b9eacce8ac7d-trusted-ca-bundle\") pod \"console-5d5d894b64-qtlgh\" (UID: \"e2c9dff2-2e8c-4b93-8bd0-b9eacce8ac7d\") " pod="openshift-console/console-5d5d894b64-qtlgh" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.552492 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e2c9dff2-2e8c-4b93-8bd0-b9eacce8ac7d-console-oauth-config\") pod \"console-5d5d894b64-qtlgh\" (UID: \"e2c9dff2-2e8c-4b93-8bd0-b9eacce8ac7d\") " pod="openshift-console/console-5d5d894b64-qtlgh" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.552530 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e2c9dff2-2e8c-4b93-8bd0-b9eacce8ac7d-console-serving-cert\") pod \"console-5d5d894b64-qtlgh\" (UID: \"e2c9dff2-2e8c-4b93-8bd0-b9eacce8ac7d\") " pod="openshift-console/console-5d5d894b64-qtlgh" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.552573 4773 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e2c9dff2-2e8c-4b93-8bd0-b9eacce8ac7d-console-config\") pod \"console-5d5d894b64-qtlgh\" (UID: \"e2c9dff2-2e8c-4b93-8bd0-b9eacce8ac7d\") " pod="openshift-console/console-5d5d894b64-qtlgh" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.553009 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/431c5397-9244-4083-9659-59210fd6d5c0-nginx-conf\") pod \"nmstate-console-plugin-7754f76f8b-z2s7z\" (UID: \"431c5397-9244-4083-9659-59210fd6d5c0\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-z2s7z" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.559543 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/431c5397-9244-4083-9659-59210fd6d5c0-plugin-serving-cert\") pod \"nmstate-console-plugin-7754f76f8b-z2s7z\" (UID: \"431c5397-9244-4083-9659-59210fd6d5c0\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-z2s7z" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.571424 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-skppb\" (UniqueName: \"kubernetes.io/projected/431c5397-9244-4083-9659-59210fd6d5c0-kube-api-access-skppb\") pod \"nmstate-console-plugin-7754f76f8b-z2s7z\" (UID: \"431c5397-9244-4083-9659-59210fd6d5c0\") " pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-z2s7z" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.614403 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-z2s7z" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.653969 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e2c9dff2-2e8c-4b93-8bd0-b9eacce8ac7d-console-serving-cert\") pod \"console-5d5d894b64-qtlgh\" (UID: \"e2c9dff2-2e8c-4b93-8bd0-b9eacce8ac7d\") " pod="openshift-console/console-5d5d894b64-qtlgh" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.654384 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e2c9dff2-2e8c-4b93-8bd0-b9eacce8ac7d-console-config\") pod \"console-5d5d894b64-qtlgh\" (UID: \"e2c9dff2-2e8c-4b93-8bd0-b9eacce8ac7d\") " pod="openshift-console/console-5d5d894b64-qtlgh" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.654420 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z66vg\" (UniqueName: \"kubernetes.io/projected/e2c9dff2-2e8c-4b93-8bd0-b9eacce8ac7d-kube-api-access-z66vg\") pod \"console-5d5d894b64-qtlgh\" (UID: \"e2c9dff2-2e8c-4b93-8bd0-b9eacce8ac7d\") " pod="openshift-console/console-5d5d894b64-qtlgh" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.654451 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e2c9dff2-2e8c-4b93-8bd0-b9eacce8ac7d-oauth-serving-cert\") pod \"console-5d5d894b64-qtlgh\" (UID: \"e2c9dff2-2e8c-4b93-8bd0-b9eacce8ac7d\") " pod="openshift-console/console-5d5d894b64-qtlgh" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.654470 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e2c9dff2-2e8c-4b93-8bd0-b9eacce8ac7d-service-ca\") pod \"console-5d5d894b64-qtlgh\" (UID: 
\"e2c9dff2-2e8c-4b93-8bd0-b9eacce8ac7d\") " pod="openshift-console/console-5d5d894b64-qtlgh" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.654526 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e2c9dff2-2e8c-4b93-8bd0-b9eacce8ac7d-trusted-ca-bundle\") pod \"console-5d5d894b64-qtlgh\" (UID: \"e2c9dff2-2e8c-4b93-8bd0-b9eacce8ac7d\") " pod="openshift-console/console-5d5d894b64-qtlgh" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.654553 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e2c9dff2-2e8c-4b93-8bd0-b9eacce8ac7d-console-oauth-config\") pod \"console-5d5d894b64-qtlgh\" (UID: \"e2c9dff2-2e8c-4b93-8bd0-b9eacce8ac7d\") " pod="openshift-console/console-5d5d894b64-qtlgh" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.655881 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e2c9dff2-2e8c-4b93-8bd0-b9eacce8ac7d-console-config\") pod \"console-5d5d894b64-qtlgh\" (UID: \"e2c9dff2-2e8c-4b93-8bd0-b9eacce8ac7d\") " pod="openshift-console/console-5d5d894b64-qtlgh" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.655989 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e2c9dff2-2e8c-4b93-8bd0-b9eacce8ac7d-oauth-serving-cert\") pod \"console-5d5d894b64-qtlgh\" (UID: \"e2c9dff2-2e8c-4b93-8bd0-b9eacce8ac7d\") " pod="openshift-console/console-5d5d894b64-qtlgh" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.656650 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e2c9dff2-2e8c-4b93-8bd0-b9eacce8ac7d-service-ca\") pod \"console-5d5d894b64-qtlgh\" (UID: \"e2c9dff2-2e8c-4b93-8bd0-b9eacce8ac7d\") " 
pod="openshift-console/console-5d5d894b64-qtlgh" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.657313 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e2c9dff2-2e8c-4b93-8bd0-b9eacce8ac7d-trusted-ca-bundle\") pod \"console-5d5d894b64-qtlgh\" (UID: \"e2c9dff2-2e8c-4b93-8bd0-b9eacce8ac7d\") " pod="openshift-console/console-5d5d894b64-qtlgh" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.659203 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e2c9dff2-2e8c-4b93-8bd0-b9eacce8ac7d-console-oauth-config\") pod \"console-5d5d894b64-qtlgh\" (UID: \"e2c9dff2-2e8c-4b93-8bd0-b9eacce8ac7d\") " pod="openshift-console/console-5d5d894b64-qtlgh" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.660205 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e2c9dff2-2e8c-4b93-8bd0-b9eacce8ac7d-console-serving-cert\") pod \"console-5d5d894b64-qtlgh\" (UID: \"e2c9dff2-2e8c-4b93-8bd0-b9eacce8ac7d\") " pod="openshift-console/console-5d5d894b64-qtlgh" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.669019 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-q9h42" event={"ID":"ef435627-8918-4451-8d3a-23e494e29f56","Type":"ContainerStarted","Data":"05eb12614a2150580ce00e48bbfc5e9ca06a56577d6ceeeb8f48ddc46ca9d8f4"} Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.673905 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z66vg\" (UniqueName: \"kubernetes.io/projected/e2c9dff2-2e8c-4b93-8bd0-b9eacce8ac7d-kube-api-access-z66vg\") pod \"console-5d5d894b64-qtlgh\" (UID: \"e2c9dff2-2e8c-4b93-8bd0-b9eacce8ac7d\") " pod="openshift-console/console-5d5d894b64-qtlgh" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.830593 4773 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-7754f76f8b-z2s7z"] Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.847431 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5d5d894b64-qtlgh" Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.912742 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-54757c584b-kxbnw"] Jan 20 18:42:55 crc kubenswrapper[4773]: I0120 18:42:55.980060 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-8474b5b9d8-8q4jc"] Jan 20 18:42:56 crc kubenswrapper[4773]: W0120 18:42:56.034609 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode2c9dff2_2e8c_4b93_8bd0_b9eacce8ac7d.slice/crio-bf8b13a2dc90d5418f5bbd47b5643a7b8a0a1d289365d0111336cb3e921adf0d WatchSource:0}: Error finding container bf8b13a2dc90d5418f5bbd47b5643a7b8a0a1d289365d0111336cb3e921adf0d: Status 404 returned error can't find the container with id bf8b13a2dc90d5418f5bbd47b5643a7b8a0a1d289365d0111336cb3e921adf0d Jan 20 18:42:56 crc kubenswrapper[4773]: I0120 18:42:56.035786 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5d5d894b64-qtlgh"] Jan 20 18:42:56 crc kubenswrapper[4773]: I0120 18:42:56.675561 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-z2s7z" event={"ID":"431c5397-9244-4083-9659-59210fd6d5c0","Type":"ContainerStarted","Data":"7661862e1288a88888ee9336fc2c0afe4f6c138e7cbb9fde241f41a6cbc23529"} Jan 20 18:42:56 crc kubenswrapper[4773]: I0120 18:42:56.676548 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-8q4jc" 
event={"ID":"9380b21a-b971-4bb9-9572-d795f171b941","Type":"ContainerStarted","Data":"dec075386fc9aa0b4e343e74f81a008446f76bca5ac0548931ae79c2116d405d"} Jan 20 18:42:56 crc kubenswrapper[4773]: I0120 18:42:56.677725 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5d5d894b64-qtlgh" event={"ID":"e2c9dff2-2e8c-4b93-8bd0-b9eacce8ac7d","Type":"ContainerStarted","Data":"bf8b13a2dc90d5418f5bbd47b5643a7b8a0a1d289365d0111336cb3e921adf0d"} Jan 20 18:42:56 crc kubenswrapper[4773]: I0120 18:42:56.678861 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-54757c584b-kxbnw" event={"ID":"42822a21-0834-4fc7-aab5-4dcdf46f2786","Type":"ContainerStarted","Data":"6c2149ba53eca7d401577818bf45432410d0c9927772528b9057f4e2713ab2a8"} Jan 20 18:42:57 crc kubenswrapper[4773]: I0120 18:42:57.690522 4773 generic.go:334] "Generic (PLEG): container finished" podID="e6959fd3-1296-4909-b0c1-0803e1e7b098" containerID="d1df9d70f10270992bed9e5be25ff4a7e9bb7cccf65bb0859efc558329329c47" exitCode=0 Jan 20 18:42:57 crc kubenswrapper[4773]: I0120 18:42:57.690632 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n4nm5" event={"ID":"e6959fd3-1296-4909-b0c1-0803e1e7b098","Type":"ContainerDied","Data":"d1df9d70f10270992bed9e5be25ff4a7e9bb7cccf65bb0859efc558329329c47"} Jan 20 18:42:57 crc kubenswrapper[4773]: I0120 18:42:57.694343 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5d5d894b64-qtlgh" event={"ID":"e2c9dff2-2e8c-4b93-8bd0-b9eacce8ac7d","Type":"ContainerStarted","Data":"603f7f7853baaee0e82b5c825ecc6d119dc16435e5e1f7e5cd8652f387eca8f9"} Jan 20 18:42:57 crc kubenswrapper[4773]: I0120 18:42:57.717743 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5d5d894b64-qtlgh" podStartSLOduration=2.717724844 podStartE2EDuration="2.717724844s" podCreationTimestamp="2026-01-20 18:42:55 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:42:57.71502021 +0000 UTC m=+770.636833254" watchObservedRunningTime="2026-01-20 18:42:57.717724844 +0000 UTC m=+770.639537868" Jan 20 18:42:57 crc kubenswrapper[4773]: I0120 18:42:57.773827 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-n4nm5" Jan 20 18:42:57 crc kubenswrapper[4773]: I0120 18:42:57.884343 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bsnqc\" (UniqueName: \"kubernetes.io/projected/e6959fd3-1296-4909-b0c1-0803e1e7b098-kube-api-access-bsnqc\") pod \"e6959fd3-1296-4909-b0c1-0803e1e7b098\" (UID: \"e6959fd3-1296-4909-b0c1-0803e1e7b098\") " Jan 20 18:42:57 crc kubenswrapper[4773]: I0120 18:42:57.884482 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6959fd3-1296-4909-b0c1-0803e1e7b098-utilities\") pod \"e6959fd3-1296-4909-b0c1-0803e1e7b098\" (UID: \"e6959fd3-1296-4909-b0c1-0803e1e7b098\") " Jan 20 18:42:57 crc kubenswrapper[4773]: I0120 18:42:57.884546 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6959fd3-1296-4909-b0c1-0803e1e7b098-catalog-content\") pod \"e6959fd3-1296-4909-b0c1-0803e1e7b098\" (UID: \"e6959fd3-1296-4909-b0c1-0803e1e7b098\") " Jan 20 18:42:57 crc kubenswrapper[4773]: I0120 18:42:57.885452 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6959fd3-1296-4909-b0c1-0803e1e7b098-utilities" (OuterVolumeSpecName: "utilities") pod "e6959fd3-1296-4909-b0c1-0803e1e7b098" (UID: "e6959fd3-1296-4909-b0c1-0803e1e7b098"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:42:57 crc kubenswrapper[4773]: I0120 18:42:57.888808 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6959fd3-1296-4909-b0c1-0803e1e7b098-kube-api-access-bsnqc" (OuterVolumeSpecName: "kube-api-access-bsnqc") pod "e6959fd3-1296-4909-b0c1-0803e1e7b098" (UID: "e6959fd3-1296-4909-b0c1-0803e1e7b098"). InnerVolumeSpecName "kube-api-access-bsnqc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:42:57 crc kubenswrapper[4773]: I0120 18:42:57.985615 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bsnqc\" (UniqueName: \"kubernetes.io/projected/e6959fd3-1296-4909-b0c1-0803e1e7b098-kube-api-access-bsnqc\") on node \"crc\" DevicePath \"\"" Jan 20 18:42:57 crc kubenswrapper[4773]: I0120 18:42:57.985650 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e6959fd3-1296-4909-b0c1-0803e1e7b098-utilities\") on node \"crc\" DevicePath \"\"" Jan 20 18:42:58 crc kubenswrapper[4773]: I0120 18:42:58.030247 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e6959fd3-1296-4909-b0c1-0803e1e7b098-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e6959fd3-1296-4909-b0c1-0803e1e7b098" (UID: "e6959fd3-1296-4909-b0c1-0803e1e7b098"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:42:58 crc kubenswrapper[4773]: I0120 18:42:58.086626 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e6959fd3-1296-4909-b0c1-0803e1e7b098-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 20 18:42:58 crc kubenswrapper[4773]: I0120 18:42:58.170278 4773 patch_prober.go:28] interesting pod/machine-config-daemon-sq4x7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 18:42:58 crc kubenswrapper[4773]: I0120 18:42:58.170345 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 18:42:58 crc kubenswrapper[4773]: I0120 18:42:58.702097 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-n4nm5" Jan 20 18:42:58 crc kubenswrapper[4773]: I0120 18:42:58.702085 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-n4nm5" event={"ID":"e6959fd3-1296-4909-b0c1-0803e1e7b098","Type":"ContainerDied","Data":"e9481dd1fada9d39b60204838bac0a2b76e59756c820f74b2a680486ffce533a"} Jan 20 18:42:58 crc kubenswrapper[4773]: I0120 18:42:58.702512 4773 scope.go:117] "RemoveContainer" containerID="d1df9d70f10270992bed9e5be25ff4a7e9bb7cccf65bb0859efc558329329c47" Jan 20 18:42:58 crc kubenswrapper[4773]: I0120 18:42:58.734565 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-n4nm5"] Jan 20 18:42:58 crc kubenswrapper[4773]: I0120 18:42:58.741072 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-n4nm5"] Jan 20 18:42:59 crc kubenswrapper[4773]: I0120 18:42:59.046737 4773 scope.go:117] "RemoveContainer" containerID="5e754401cfd5accf3b020f80444b146055208f2c923e46a1f3deb9641bd7242a" Jan 20 18:42:59 crc kubenswrapper[4773]: I0120 18:42:59.070179 4773 scope.go:117] "RemoveContainer" containerID="db6475baee2b330e6cee2961ede1d25c44221d7af154400d2652eb5c6d58e558" Jan 20 18:42:59 crc kubenswrapper[4773]: I0120 18:42:59.457086 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6959fd3-1296-4909-b0c1-0803e1e7b098" path="/var/lib/kubelet/pods/e6959fd3-1296-4909-b0c1-0803e1e7b098/volumes" Jan 20 18:42:59 crc kubenswrapper[4773]: I0120 18:42:59.710719 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-8q4jc" event={"ID":"9380b21a-b971-4bb9-9572-d795f171b941","Type":"ContainerStarted","Data":"3544b02447bb0e3a305cebd9506121e0a13091893d90d5bc64937d5a0a3db6f3"} Jan 20 18:42:59 crc kubenswrapper[4773]: I0120 18:42:59.711542 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-8q4jc" Jan 20 18:42:59 crc kubenswrapper[4773]: I0120 18:42:59.712709 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-q9h42" event={"ID":"ef435627-8918-4451-8d3a-23e494e29f56","Type":"ContainerStarted","Data":"551b3190f510a02c1f4d02f56e2f205e22a01e44f6c550b1d943e7464107671c"} Jan 20 18:42:59 crc kubenswrapper[4773]: I0120 18:42:59.713264 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-q9h42" Jan 20 18:42:59 crc kubenswrapper[4773]: I0120 18:42:59.714900 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-54757c584b-kxbnw" event={"ID":"42822a21-0834-4fc7-aab5-4dcdf46f2786","Type":"ContainerStarted","Data":"22230080d84ea1a9bcc5237f0dacc857101432fdf7998c686a1871e0552adc0a"} Jan 20 18:42:59 crc kubenswrapper[4773]: I0120 18:42:59.715921 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-z2s7z" event={"ID":"431c5397-9244-4083-9659-59210fd6d5c0","Type":"ContainerStarted","Data":"9c854106b14595e79d5cf713fe491262958c607c17153851a4b5f877fdcb1bc2"} Jan 20 18:42:59 crc kubenswrapper[4773]: I0120 18:42:59.735523 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-8q4jc" podStartSLOduration=1.608427279 podStartE2EDuration="4.735507906s" podCreationTimestamp="2026-01-20 18:42:55 +0000 UTC" firstStartedPulling="2026-01-20 18:42:55.988254287 +0000 UTC m=+768.910067311" lastFinishedPulling="2026-01-20 18:42:59.115334914 +0000 UTC m=+772.037147938" observedRunningTime="2026-01-20 18:42:59.726829479 +0000 UTC m=+772.648642503" watchObservedRunningTime="2026-01-20 18:42:59.735507906 +0000 UTC m=+772.657320930" Jan 20 18:42:59 crc kubenswrapper[4773]: I0120 18:42:59.744073 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-nmstate/nmstate-handler-q9h42" podStartSLOduration=1.254455136 podStartE2EDuration="4.74405208s" podCreationTimestamp="2026-01-20 18:42:55 +0000 UTC" firstStartedPulling="2026-01-20 18:42:55.561864064 +0000 UTC m=+768.483677078" lastFinishedPulling="2026-01-20 18:42:59.051461008 +0000 UTC m=+771.973274022" observedRunningTime="2026-01-20 18:42:59.742958974 +0000 UTC m=+772.664771998" watchObservedRunningTime="2026-01-20 18:42:59.74405208 +0000 UTC m=+772.665865104" Jan 20 18:42:59 crc kubenswrapper[4773]: I0120 18:42:59.756760 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-7754f76f8b-z2s7z" podStartSLOduration=1.5474426829999999 podStartE2EDuration="4.756745093s" podCreationTimestamp="2026-01-20 18:42:55 +0000 UTC" firstStartedPulling="2026-01-20 18:42:55.84053177 +0000 UTC m=+768.762344794" lastFinishedPulling="2026-01-20 18:42:59.04983418 +0000 UTC m=+771.971647204" observedRunningTime="2026-01-20 18:42:59.755147215 +0000 UTC m=+772.676960229" watchObservedRunningTime="2026-01-20 18:42:59.756745093 +0000 UTC m=+772.678558117" Jan 20 18:43:00 crc kubenswrapper[4773]: I0120 18:43:00.011481 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-sxf5z"] Jan 20 18:43:00 crc kubenswrapper[4773]: E0120 18:43:00.012773 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6959fd3-1296-4909-b0c1-0803e1e7b098" containerName="extract-utilities" Jan 20 18:43:00 crc kubenswrapper[4773]: I0120 18:43:00.012910 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6959fd3-1296-4909-b0c1-0803e1e7b098" containerName="extract-utilities" Jan 20 18:43:00 crc kubenswrapper[4773]: E0120 18:43:00.013062 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6959fd3-1296-4909-b0c1-0803e1e7b098" containerName="extract-content" Jan 20 18:43:00 crc kubenswrapper[4773]: I0120 18:43:00.013073 4773 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="e6959fd3-1296-4909-b0c1-0803e1e7b098" containerName="extract-content" Jan 20 18:43:00 crc kubenswrapper[4773]: E0120 18:43:00.013084 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6959fd3-1296-4909-b0c1-0803e1e7b098" containerName="registry-server" Jan 20 18:43:00 crc kubenswrapper[4773]: I0120 18:43:00.013367 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6959fd3-1296-4909-b0c1-0803e1e7b098" containerName="registry-server" Jan 20 18:43:00 crc kubenswrapper[4773]: I0120 18:43:00.013736 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6959fd3-1296-4909-b0c1-0803e1e7b098" containerName="registry-server" Jan 20 18:43:00 crc kubenswrapper[4773]: I0120 18:43:00.014921 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sxf5z" Jan 20 18:43:00 crc kubenswrapper[4773]: I0120 18:43:00.026394 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sxf5z"] Jan 20 18:43:00 crc kubenswrapper[4773]: I0120 18:43:00.124307 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c2883aa-bc8e-4893-807a-f32cbd1ff77d-utilities\") pod \"certified-operators-sxf5z\" (UID: \"4c2883aa-bc8e-4893-807a-f32cbd1ff77d\") " pod="openshift-marketplace/certified-operators-sxf5z" Jan 20 18:43:00 crc kubenswrapper[4773]: I0120 18:43:00.124391 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c2883aa-bc8e-4893-807a-f32cbd1ff77d-catalog-content\") pod \"certified-operators-sxf5z\" (UID: \"4c2883aa-bc8e-4893-807a-f32cbd1ff77d\") " pod="openshift-marketplace/certified-operators-sxf5z" Jan 20 18:43:00 crc kubenswrapper[4773]: I0120 18:43:00.124552 4773 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kp897\" (UniqueName: \"kubernetes.io/projected/4c2883aa-bc8e-4893-807a-f32cbd1ff77d-kube-api-access-kp897\") pod \"certified-operators-sxf5z\" (UID: \"4c2883aa-bc8e-4893-807a-f32cbd1ff77d\") " pod="openshift-marketplace/certified-operators-sxf5z" Jan 20 18:43:00 crc kubenswrapper[4773]: I0120 18:43:00.225945 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c2883aa-bc8e-4893-807a-f32cbd1ff77d-catalog-content\") pod \"certified-operators-sxf5z\" (UID: \"4c2883aa-bc8e-4893-807a-f32cbd1ff77d\") " pod="openshift-marketplace/certified-operators-sxf5z" Jan 20 18:43:00 crc kubenswrapper[4773]: I0120 18:43:00.225997 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kp897\" (UniqueName: \"kubernetes.io/projected/4c2883aa-bc8e-4893-807a-f32cbd1ff77d-kube-api-access-kp897\") pod \"certified-operators-sxf5z\" (UID: \"4c2883aa-bc8e-4893-807a-f32cbd1ff77d\") " pod="openshift-marketplace/certified-operators-sxf5z" Jan 20 18:43:00 crc kubenswrapper[4773]: I0120 18:43:00.226027 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c2883aa-bc8e-4893-807a-f32cbd1ff77d-utilities\") pod \"certified-operators-sxf5z\" (UID: \"4c2883aa-bc8e-4893-807a-f32cbd1ff77d\") " pod="openshift-marketplace/certified-operators-sxf5z" Jan 20 18:43:00 crc kubenswrapper[4773]: I0120 18:43:00.226446 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c2883aa-bc8e-4893-807a-f32cbd1ff77d-utilities\") pod \"certified-operators-sxf5z\" (UID: \"4c2883aa-bc8e-4893-807a-f32cbd1ff77d\") " pod="openshift-marketplace/certified-operators-sxf5z" Jan 20 18:43:00 crc kubenswrapper[4773]: I0120 18:43:00.226515 4773 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c2883aa-bc8e-4893-807a-f32cbd1ff77d-catalog-content\") pod \"certified-operators-sxf5z\" (UID: \"4c2883aa-bc8e-4893-807a-f32cbd1ff77d\") " pod="openshift-marketplace/certified-operators-sxf5z" Jan 20 18:43:00 crc kubenswrapper[4773]: I0120 18:43:00.244827 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kp897\" (UniqueName: \"kubernetes.io/projected/4c2883aa-bc8e-4893-807a-f32cbd1ff77d-kube-api-access-kp897\") pod \"certified-operators-sxf5z\" (UID: \"4c2883aa-bc8e-4893-807a-f32cbd1ff77d\") " pod="openshift-marketplace/certified-operators-sxf5z" Jan 20 18:43:00 crc kubenswrapper[4773]: I0120 18:43:00.341340 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sxf5z" Jan 20 18:43:00 crc kubenswrapper[4773]: I0120 18:43:00.877399 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sxf5z"] Jan 20 18:43:01 crc kubenswrapper[4773]: I0120 18:43:01.732034 4773 generic.go:334] "Generic (PLEG): container finished" podID="4c2883aa-bc8e-4893-807a-f32cbd1ff77d" containerID="5edb3aa68f1546b3903d52b33cf75034d560372134352c11c040ad6717b2a19c" exitCode=0 Jan 20 18:43:01 crc kubenswrapper[4773]: I0120 18:43:01.732198 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sxf5z" event={"ID":"4c2883aa-bc8e-4893-807a-f32cbd1ff77d","Type":"ContainerDied","Data":"5edb3aa68f1546b3903d52b33cf75034d560372134352c11c040ad6717b2a19c"} Jan 20 18:43:01 crc kubenswrapper[4773]: I0120 18:43:01.732620 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sxf5z" event={"ID":"4c2883aa-bc8e-4893-807a-f32cbd1ff77d","Type":"ContainerStarted","Data":"ac9a6107ce857d0ef38abc81812d1f9b30bf91a78fba8fb115e05a1f967c57f9"} Jan 20 18:43:02 crc 
kubenswrapper[4773]: I0120 18:43:02.740652 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-54757c584b-kxbnw" event={"ID":"42822a21-0834-4fc7-aab5-4dcdf46f2786","Type":"ContainerStarted","Data":"1c83f60df914c0960493a33cc8925d97dbd3fc76f7c8f08bb6e2882f04f67b64"} Jan 20 18:43:02 crc kubenswrapper[4773]: I0120 18:43:02.743513 4773 generic.go:334] "Generic (PLEG): container finished" podID="4c2883aa-bc8e-4893-807a-f32cbd1ff77d" containerID="929dc14101950cbbe51e62cda565c93d871b4b5517fe1e6c6ee5b30d456b3539" exitCode=0 Jan 20 18:43:02 crc kubenswrapper[4773]: I0120 18:43:02.743549 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sxf5z" event={"ID":"4c2883aa-bc8e-4893-807a-f32cbd1ff77d","Type":"ContainerDied","Data":"929dc14101950cbbe51e62cda565c93d871b4b5517fe1e6c6ee5b30d456b3539"} Jan 20 18:43:02 crc kubenswrapper[4773]: I0120 18:43:02.760382 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-54757c584b-kxbnw" podStartSLOduration=2.067545015 podStartE2EDuration="7.760364031s" podCreationTimestamp="2026-01-20 18:42:55 +0000 UTC" firstStartedPulling="2026-01-20 18:42:55.931489282 +0000 UTC m=+768.853302306" lastFinishedPulling="2026-01-20 18:43:01.624308298 +0000 UTC m=+774.546121322" observedRunningTime="2026-01-20 18:43:02.755360272 +0000 UTC m=+775.677173296" watchObservedRunningTime="2026-01-20 18:43:02.760364031 +0000 UTC m=+775.682177045" Jan 20 18:43:03 crc kubenswrapper[4773]: I0120 18:43:03.749621 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sxf5z" event={"ID":"4c2883aa-bc8e-4893-807a-f32cbd1ff77d","Type":"ContainerStarted","Data":"c69894290dc792c55a1de66d7c77f1d88173d270eb044dfd4fd8da2469b7ad97"} Jan 20 18:43:03 crc kubenswrapper[4773]: I0120 18:43:03.765879 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-marketplace/certified-operators-sxf5z" podStartSLOduration=3.320281419 podStartE2EDuration="4.765856956s" podCreationTimestamp="2026-01-20 18:42:59 +0000 UTC" firstStartedPulling="2026-01-20 18:43:01.734465448 +0000 UTC m=+774.656278472" lastFinishedPulling="2026-01-20 18:43:03.180040985 +0000 UTC m=+776.101854009" observedRunningTime="2026-01-20 18:43:03.764134946 +0000 UTC m=+776.685947960" watchObservedRunningTime="2026-01-20 18:43:03.765856956 +0000 UTC m=+776.687669980" Jan 20 18:43:05 crc kubenswrapper[4773]: I0120 18:43:05.551013 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-q9h42" Jan 20 18:43:05 crc kubenswrapper[4773]: I0120 18:43:05.848100 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5d5d894b64-qtlgh" Jan 20 18:43:05 crc kubenswrapper[4773]: I0120 18:43:05.848146 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-5d5d894b64-qtlgh" Jan 20 18:43:05 crc kubenswrapper[4773]: I0120 18:43:05.852099 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5d5d894b64-qtlgh" Jan 20 18:43:06 crc kubenswrapper[4773]: I0120 18:43:06.766618 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5d5d894b64-qtlgh" Jan 20 18:43:06 crc kubenswrapper[4773]: I0120 18:43:06.813979 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-9nh6h"] Jan 20 18:43:07 crc kubenswrapper[4773]: I0120 18:43:07.238964 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-2hz4f"] Jan 20 18:43:07 crc kubenswrapper[4773]: I0120 18:43:07.239970 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2hz4f" Jan 20 18:43:07 crc kubenswrapper[4773]: I0120 18:43:07.251038 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2hz4f"] Jan 20 18:43:07 crc kubenswrapper[4773]: I0120 18:43:07.251926 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc61da58-a0a9-4c56-bd9f-d84ac0474556-utilities\") pod \"redhat-marketplace-2hz4f\" (UID: \"cc61da58-a0a9-4c56-bd9f-d84ac0474556\") " pod="openshift-marketplace/redhat-marketplace-2hz4f" Jan 20 18:43:07 crc kubenswrapper[4773]: I0120 18:43:07.251996 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc61da58-a0a9-4c56-bd9f-d84ac0474556-catalog-content\") pod \"redhat-marketplace-2hz4f\" (UID: \"cc61da58-a0a9-4c56-bd9f-d84ac0474556\") " pod="openshift-marketplace/redhat-marketplace-2hz4f" Jan 20 18:43:07 crc kubenswrapper[4773]: I0120 18:43:07.252065 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9r2vq\" (UniqueName: \"kubernetes.io/projected/cc61da58-a0a9-4c56-bd9f-d84ac0474556-kube-api-access-9r2vq\") pod \"redhat-marketplace-2hz4f\" (UID: \"cc61da58-a0a9-4c56-bd9f-d84ac0474556\") " pod="openshift-marketplace/redhat-marketplace-2hz4f" Jan 20 18:43:07 crc kubenswrapper[4773]: I0120 18:43:07.353034 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc61da58-a0a9-4c56-bd9f-d84ac0474556-utilities\") pod \"redhat-marketplace-2hz4f\" (UID: \"cc61da58-a0a9-4c56-bd9f-d84ac0474556\") " pod="openshift-marketplace/redhat-marketplace-2hz4f" Jan 20 18:43:07 crc kubenswrapper[4773]: I0120 18:43:07.353121 4773 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc61da58-a0a9-4c56-bd9f-d84ac0474556-catalog-content\") pod \"redhat-marketplace-2hz4f\" (UID: \"cc61da58-a0a9-4c56-bd9f-d84ac0474556\") " pod="openshift-marketplace/redhat-marketplace-2hz4f" Jan 20 18:43:07 crc kubenswrapper[4773]: I0120 18:43:07.353194 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9r2vq\" (UniqueName: \"kubernetes.io/projected/cc61da58-a0a9-4c56-bd9f-d84ac0474556-kube-api-access-9r2vq\") pod \"redhat-marketplace-2hz4f\" (UID: \"cc61da58-a0a9-4c56-bd9f-d84ac0474556\") " pod="openshift-marketplace/redhat-marketplace-2hz4f" Jan 20 18:43:07 crc kubenswrapper[4773]: I0120 18:43:07.353567 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc61da58-a0a9-4c56-bd9f-d84ac0474556-utilities\") pod \"redhat-marketplace-2hz4f\" (UID: \"cc61da58-a0a9-4c56-bd9f-d84ac0474556\") " pod="openshift-marketplace/redhat-marketplace-2hz4f" Jan 20 18:43:07 crc kubenswrapper[4773]: I0120 18:43:07.353615 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc61da58-a0a9-4c56-bd9f-d84ac0474556-catalog-content\") pod \"redhat-marketplace-2hz4f\" (UID: \"cc61da58-a0a9-4c56-bd9f-d84ac0474556\") " pod="openshift-marketplace/redhat-marketplace-2hz4f" Jan 20 18:43:07 crc kubenswrapper[4773]: I0120 18:43:07.373200 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9r2vq\" (UniqueName: \"kubernetes.io/projected/cc61da58-a0a9-4c56-bd9f-d84ac0474556-kube-api-access-9r2vq\") pod \"redhat-marketplace-2hz4f\" (UID: \"cc61da58-a0a9-4c56-bd9f-d84ac0474556\") " pod="openshift-marketplace/redhat-marketplace-2hz4f" Jan 20 18:43:07 crc kubenswrapper[4773]: I0120 18:43:07.560156 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2hz4f" Jan 20 18:43:07 crc kubenswrapper[4773]: I0120 18:43:07.945328 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2hz4f"] Jan 20 18:43:07 crc kubenswrapper[4773]: W0120 18:43:07.953622 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcc61da58_a0a9_4c56_bd9f_d84ac0474556.slice/crio-267dd5bb405fcce13b8b28256adffec7676556f07e09a1f05c7570efe42272ea WatchSource:0}: Error finding container 267dd5bb405fcce13b8b28256adffec7676556f07e09a1f05c7570efe42272ea: Status 404 returned error can't find the container with id 267dd5bb405fcce13b8b28256adffec7676556f07e09a1f05c7570efe42272ea Jan 20 18:43:08 crc kubenswrapper[4773]: I0120 18:43:08.775071 4773 generic.go:334] "Generic (PLEG): container finished" podID="cc61da58-a0a9-4c56-bd9f-d84ac0474556" containerID="b3aa0afd8b11fc654ac7e463b11879cdb4c04615b668296d4a91a999d82d7a17" exitCode=0 Jan 20 18:43:08 crc kubenswrapper[4773]: I0120 18:43:08.775126 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2hz4f" event={"ID":"cc61da58-a0a9-4c56-bd9f-d84ac0474556","Type":"ContainerDied","Data":"b3aa0afd8b11fc654ac7e463b11879cdb4c04615b668296d4a91a999d82d7a17"} Jan 20 18:43:08 crc kubenswrapper[4773]: I0120 18:43:08.775156 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2hz4f" event={"ID":"cc61da58-a0a9-4c56-bd9f-d84ac0474556","Type":"ContainerStarted","Data":"267dd5bb405fcce13b8b28256adffec7676556f07e09a1f05c7570efe42272ea"} Jan 20 18:43:09 crc kubenswrapper[4773]: I0120 18:43:09.781782 4773 generic.go:334] "Generic (PLEG): container finished" podID="cc61da58-a0a9-4c56-bd9f-d84ac0474556" containerID="a69eec688eb0c9c8472c543f2fc04666bcb116084d3753a04d88806bfaecffb7" exitCode=0 Jan 20 18:43:09 crc kubenswrapper[4773]: I0120 
18:43:09.782059 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2hz4f" event={"ID":"cc61da58-a0a9-4c56-bd9f-d84ac0474556","Type":"ContainerDied","Data":"a69eec688eb0c9c8472c543f2fc04666bcb116084d3753a04d88806bfaecffb7"} Jan 20 18:43:10 crc kubenswrapper[4773]: I0120 18:43:10.341486 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-sxf5z" Jan 20 18:43:10 crc kubenswrapper[4773]: I0120 18:43:10.341722 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-sxf5z" Jan 20 18:43:10 crc kubenswrapper[4773]: I0120 18:43:10.389459 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-sxf5z" Jan 20 18:43:10 crc kubenswrapper[4773]: I0120 18:43:10.790595 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2hz4f" event={"ID":"cc61da58-a0a9-4c56-bd9f-d84ac0474556","Type":"ContainerStarted","Data":"daf455ce32f7a7bcf421f22fb8673a007c84905032baa57de7085577f9ec836b"} Jan 20 18:43:10 crc kubenswrapper[4773]: I0120 18:43:10.816494 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-2hz4f" podStartSLOduration=2.4261868079999998 podStartE2EDuration="3.816472593s" podCreationTimestamp="2026-01-20 18:43:07 +0000 UTC" firstStartedPulling="2026-01-20 18:43:08.777410353 +0000 UTC m=+781.699223377" lastFinishedPulling="2026-01-20 18:43:10.167696128 +0000 UTC m=+783.089509162" observedRunningTime="2026-01-20 18:43:10.815414007 +0000 UTC m=+783.737227041" watchObservedRunningTime="2026-01-20 18:43:10.816472593 +0000 UTC m=+783.738285617" Jan 20 18:43:10 crc kubenswrapper[4773]: I0120 18:43:10.841794 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-sxf5z" Jan 
20 18:43:12 crc kubenswrapper[4773]: I0120 18:43:12.622647 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sxf5z"] Jan 20 18:43:12 crc kubenswrapper[4773]: I0120 18:43:12.802043 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-sxf5z" podUID="4c2883aa-bc8e-4893-807a-f32cbd1ff77d" containerName="registry-server" containerID="cri-o://c69894290dc792c55a1de66d7c77f1d88173d270eb044dfd4fd8da2469b7ad97" gracePeriod=2 Jan 20 18:43:13 crc kubenswrapper[4773]: I0120 18:43:13.808636 4773 generic.go:334] "Generic (PLEG): container finished" podID="4c2883aa-bc8e-4893-807a-f32cbd1ff77d" containerID="c69894290dc792c55a1de66d7c77f1d88173d270eb044dfd4fd8da2469b7ad97" exitCode=0 Jan 20 18:43:13 crc kubenswrapper[4773]: I0120 18:43:13.808727 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sxf5z" event={"ID":"4c2883aa-bc8e-4893-807a-f32cbd1ff77d","Type":"ContainerDied","Data":"c69894290dc792c55a1de66d7c77f1d88173d270eb044dfd4fd8da2469b7ad97"} Jan 20 18:43:14 crc kubenswrapper[4773]: I0120 18:43:14.288486 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-sxf5z" Jan 20 18:43:14 crc kubenswrapper[4773]: I0120 18:43:14.448476 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kp897\" (UniqueName: \"kubernetes.io/projected/4c2883aa-bc8e-4893-807a-f32cbd1ff77d-kube-api-access-kp897\") pod \"4c2883aa-bc8e-4893-807a-f32cbd1ff77d\" (UID: \"4c2883aa-bc8e-4893-807a-f32cbd1ff77d\") " Jan 20 18:43:14 crc kubenswrapper[4773]: I0120 18:43:14.448560 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c2883aa-bc8e-4893-807a-f32cbd1ff77d-catalog-content\") pod \"4c2883aa-bc8e-4893-807a-f32cbd1ff77d\" (UID: \"4c2883aa-bc8e-4893-807a-f32cbd1ff77d\") " Jan 20 18:43:14 crc kubenswrapper[4773]: I0120 18:43:14.448622 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c2883aa-bc8e-4893-807a-f32cbd1ff77d-utilities\") pod \"4c2883aa-bc8e-4893-807a-f32cbd1ff77d\" (UID: \"4c2883aa-bc8e-4893-807a-f32cbd1ff77d\") " Jan 20 18:43:14 crc kubenswrapper[4773]: I0120 18:43:14.449565 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c2883aa-bc8e-4893-807a-f32cbd1ff77d-utilities" (OuterVolumeSpecName: "utilities") pod "4c2883aa-bc8e-4893-807a-f32cbd1ff77d" (UID: "4c2883aa-bc8e-4893-807a-f32cbd1ff77d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:43:14 crc kubenswrapper[4773]: I0120 18:43:14.454114 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c2883aa-bc8e-4893-807a-f32cbd1ff77d-kube-api-access-kp897" (OuterVolumeSpecName: "kube-api-access-kp897") pod "4c2883aa-bc8e-4893-807a-f32cbd1ff77d" (UID: "4c2883aa-bc8e-4893-807a-f32cbd1ff77d"). InnerVolumeSpecName "kube-api-access-kp897". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:43:14 crc kubenswrapper[4773]: I0120 18:43:14.490681 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c2883aa-bc8e-4893-807a-f32cbd1ff77d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4c2883aa-bc8e-4893-807a-f32cbd1ff77d" (UID: "4c2883aa-bc8e-4893-807a-f32cbd1ff77d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:43:14 crc kubenswrapper[4773]: I0120 18:43:14.551017 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4c2883aa-bc8e-4893-807a-f32cbd1ff77d-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 20 18:43:14 crc kubenswrapper[4773]: I0120 18:43:14.551307 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4c2883aa-bc8e-4893-807a-f32cbd1ff77d-utilities\") on node \"crc\" DevicePath \"\"" Jan 20 18:43:14 crc kubenswrapper[4773]: I0120 18:43:14.551317 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kp897\" (UniqueName: \"kubernetes.io/projected/4c2883aa-bc8e-4893-807a-f32cbd1ff77d-kube-api-access-kp897\") on node \"crc\" DevicePath \"\"" Jan 20 18:43:14 crc kubenswrapper[4773]: I0120 18:43:14.819435 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sxf5z" event={"ID":"4c2883aa-bc8e-4893-807a-f32cbd1ff77d","Type":"ContainerDied","Data":"ac9a6107ce857d0ef38abc81812d1f9b30bf91a78fba8fb115e05a1f967c57f9"} Jan 20 18:43:14 crc kubenswrapper[4773]: I0120 18:43:14.819515 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-sxf5z" Jan 20 18:43:14 crc kubenswrapper[4773]: I0120 18:43:14.820788 4773 scope.go:117] "RemoveContainer" containerID="c69894290dc792c55a1de66d7c77f1d88173d270eb044dfd4fd8da2469b7ad97" Jan 20 18:43:14 crc kubenswrapper[4773]: I0120 18:43:14.839612 4773 scope.go:117] "RemoveContainer" containerID="929dc14101950cbbe51e62cda565c93d871b4b5517fe1e6c6ee5b30d456b3539" Jan 20 18:43:14 crc kubenswrapper[4773]: I0120 18:43:14.850637 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sxf5z"] Jan 20 18:43:14 crc kubenswrapper[4773]: I0120 18:43:14.856638 4773 scope.go:117] "RemoveContainer" containerID="5edb3aa68f1546b3903d52b33cf75034d560372134352c11c040ad6717b2a19c" Jan 20 18:43:14 crc kubenswrapper[4773]: I0120 18:43:14.857134 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-sxf5z"] Jan 20 18:43:15 crc kubenswrapper[4773]: I0120 18:43:15.462750 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c2883aa-bc8e-4893-807a-f32cbd1ff77d" path="/var/lib/kubelet/pods/4c2883aa-bc8e-4893-807a-f32cbd1ff77d/volumes" Jan 20 18:43:15 crc kubenswrapper[4773]: I0120 18:43:15.531250 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-8474b5b9d8-8q4jc" Jan 20 18:43:17 crc kubenswrapper[4773]: I0120 18:43:17.561137 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-2hz4f" Jan 20 18:43:17 crc kubenswrapper[4773]: I0120 18:43:17.561558 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-2hz4f" Jan 20 18:43:17 crc kubenswrapper[4773]: I0120 18:43:17.604428 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-2hz4f" Jan 20 18:43:17 crc 
kubenswrapper[4773]: I0120 18:43:17.875493 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-2hz4f" Jan 20 18:43:18 crc kubenswrapper[4773]: I0120 18:43:18.822220 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2hz4f"] Jan 20 18:43:19 crc kubenswrapper[4773]: I0120 18:43:19.850014 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-2hz4f" podUID="cc61da58-a0a9-4c56-bd9f-d84ac0474556" containerName="registry-server" containerID="cri-o://daf455ce32f7a7bcf421f22fb8673a007c84905032baa57de7085577f9ec836b" gracePeriod=2 Jan 20 18:43:20 crc kubenswrapper[4773]: I0120 18:43:20.719969 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2hz4f" Jan 20 18:43:20 crc kubenswrapper[4773]: I0120 18:43:20.801333 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc61da58-a0a9-4c56-bd9f-d84ac0474556-utilities\") pod \"cc61da58-a0a9-4c56-bd9f-d84ac0474556\" (UID: \"cc61da58-a0a9-4c56-bd9f-d84ac0474556\") " Jan 20 18:43:20 crc kubenswrapper[4773]: I0120 18:43:20.801388 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc61da58-a0a9-4c56-bd9f-d84ac0474556-catalog-content\") pod \"cc61da58-a0a9-4c56-bd9f-d84ac0474556\" (UID: \"cc61da58-a0a9-4c56-bd9f-d84ac0474556\") " Jan 20 18:43:20 crc kubenswrapper[4773]: I0120 18:43:20.801443 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9r2vq\" (UniqueName: \"kubernetes.io/projected/cc61da58-a0a9-4c56-bd9f-d84ac0474556-kube-api-access-9r2vq\") pod \"cc61da58-a0a9-4c56-bd9f-d84ac0474556\" (UID: \"cc61da58-a0a9-4c56-bd9f-d84ac0474556\") " Jan 20 18:43:20 crc 
kubenswrapper[4773]: I0120 18:43:20.802158 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc61da58-a0a9-4c56-bd9f-d84ac0474556-utilities" (OuterVolumeSpecName: "utilities") pod "cc61da58-a0a9-4c56-bd9f-d84ac0474556" (UID: "cc61da58-a0a9-4c56-bd9f-d84ac0474556"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:43:20 crc kubenswrapper[4773]: I0120 18:43:20.806702 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc61da58-a0a9-4c56-bd9f-d84ac0474556-kube-api-access-9r2vq" (OuterVolumeSpecName: "kube-api-access-9r2vq") pod "cc61da58-a0a9-4c56-bd9f-d84ac0474556" (UID: "cc61da58-a0a9-4c56-bd9f-d84ac0474556"). InnerVolumeSpecName "kube-api-access-9r2vq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:43:20 crc kubenswrapper[4773]: I0120 18:43:20.825320 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cc61da58-a0a9-4c56-bd9f-d84ac0474556-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "cc61da58-a0a9-4c56-bd9f-d84ac0474556" (UID: "cc61da58-a0a9-4c56-bd9f-d84ac0474556"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:43:20 crc kubenswrapper[4773]: I0120 18:43:20.858366 4773 generic.go:334] "Generic (PLEG): container finished" podID="cc61da58-a0a9-4c56-bd9f-d84ac0474556" containerID="daf455ce32f7a7bcf421f22fb8673a007c84905032baa57de7085577f9ec836b" exitCode=0 Jan 20 18:43:20 crc kubenswrapper[4773]: I0120 18:43:20.858448 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2hz4f" event={"ID":"cc61da58-a0a9-4c56-bd9f-d84ac0474556","Type":"ContainerDied","Data":"daf455ce32f7a7bcf421f22fb8673a007c84905032baa57de7085577f9ec836b"} Jan 20 18:43:20 crc kubenswrapper[4773]: I0120 18:43:20.858493 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2hz4f" event={"ID":"cc61da58-a0a9-4c56-bd9f-d84ac0474556","Type":"ContainerDied","Data":"267dd5bb405fcce13b8b28256adffec7676556f07e09a1f05c7570efe42272ea"} Jan 20 18:43:20 crc kubenswrapper[4773]: I0120 18:43:20.858521 4773 scope.go:117] "RemoveContainer" containerID="daf455ce32f7a7bcf421f22fb8673a007c84905032baa57de7085577f9ec836b" Jan 20 18:43:20 crc kubenswrapper[4773]: I0120 18:43:20.858750 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2hz4f" Jan 20 18:43:20 crc kubenswrapper[4773]: I0120 18:43:20.886137 4773 scope.go:117] "RemoveContainer" containerID="a69eec688eb0c9c8472c543f2fc04666bcb116084d3753a04d88806bfaecffb7" Jan 20 18:43:20 crc kubenswrapper[4773]: I0120 18:43:20.902589 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9r2vq\" (UniqueName: \"kubernetes.io/projected/cc61da58-a0a9-4c56-bd9f-d84ac0474556-kube-api-access-9r2vq\") on node \"crc\" DevicePath \"\"" Jan 20 18:43:20 crc kubenswrapper[4773]: I0120 18:43:20.902674 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/cc61da58-a0a9-4c56-bd9f-d84ac0474556-utilities\") on node \"crc\" DevicePath \"\"" Jan 20 18:43:20 crc kubenswrapper[4773]: I0120 18:43:20.902695 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/cc61da58-a0a9-4c56-bd9f-d84ac0474556-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 20 18:43:20 crc kubenswrapper[4773]: I0120 18:43:20.903976 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2hz4f"] Jan 20 18:43:20 crc kubenswrapper[4773]: I0120 18:43:20.909971 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-2hz4f"] Jan 20 18:43:20 crc kubenswrapper[4773]: I0120 18:43:20.919214 4773 scope.go:117] "RemoveContainer" containerID="b3aa0afd8b11fc654ac7e463b11879cdb4c04615b668296d4a91a999d82d7a17" Jan 20 18:43:20 crc kubenswrapper[4773]: I0120 18:43:20.936611 4773 scope.go:117] "RemoveContainer" containerID="daf455ce32f7a7bcf421f22fb8673a007c84905032baa57de7085577f9ec836b" Jan 20 18:43:20 crc kubenswrapper[4773]: E0120 18:43:20.937074 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"daf455ce32f7a7bcf421f22fb8673a007c84905032baa57de7085577f9ec836b\": container with ID starting with daf455ce32f7a7bcf421f22fb8673a007c84905032baa57de7085577f9ec836b not found: ID does not exist" containerID="daf455ce32f7a7bcf421f22fb8673a007c84905032baa57de7085577f9ec836b" Jan 20 18:43:20 crc kubenswrapper[4773]: I0120 18:43:20.937110 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"daf455ce32f7a7bcf421f22fb8673a007c84905032baa57de7085577f9ec836b"} err="failed to get container status \"daf455ce32f7a7bcf421f22fb8673a007c84905032baa57de7085577f9ec836b\": rpc error: code = NotFound desc = could not find container \"daf455ce32f7a7bcf421f22fb8673a007c84905032baa57de7085577f9ec836b\": container with ID starting with daf455ce32f7a7bcf421f22fb8673a007c84905032baa57de7085577f9ec836b not found: ID does not exist" Jan 20 18:43:20 crc kubenswrapper[4773]: I0120 18:43:20.937136 4773 scope.go:117] "RemoveContainer" containerID="a69eec688eb0c9c8472c543f2fc04666bcb116084d3753a04d88806bfaecffb7" Jan 20 18:43:20 crc kubenswrapper[4773]: E0120 18:43:20.937525 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a69eec688eb0c9c8472c543f2fc04666bcb116084d3753a04d88806bfaecffb7\": container with ID starting with a69eec688eb0c9c8472c543f2fc04666bcb116084d3753a04d88806bfaecffb7 not found: ID does not exist" containerID="a69eec688eb0c9c8472c543f2fc04666bcb116084d3753a04d88806bfaecffb7" Jan 20 18:43:20 crc kubenswrapper[4773]: I0120 18:43:20.937548 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a69eec688eb0c9c8472c543f2fc04666bcb116084d3753a04d88806bfaecffb7"} err="failed to get container status \"a69eec688eb0c9c8472c543f2fc04666bcb116084d3753a04d88806bfaecffb7\": rpc error: code = NotFound desc = could not find container \"a69eec688eb0c9c8472c543f2fc04666bcb116084d3753a04d88806bfaecffb7\": container with ID 
starting with a69eec688eb0c9c8472c543f2fc04666bcb116084d3753a04d88806bfaecffb7 not found: ID does not exist" Jan 20 18:43:20 crc kubenswrapper[4773]: I0120 18:43:20.937566 4773 scope.go:117] "RemoveContainer" containerID="b3aa0afd8b11fc654ac7e463b11879cdb4c04615b668296d4a91a999d82d7a17" Jan 20 18:43:20 crc kubenswrapper[4773]: E0120 18:43:20.938038 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3aa0afd8b11fc654ac7e463b11879cdb4c04615b668296d4a91a999d82d7a17\": container with ID starting with b3aa0afd8b11fc654ac7e463b11879cdb4c04615b668296d4a91a999d82d7a17 not found: ID does not exist" containerID="b3aa0afd8b11fc654ac7e463b11879cdb4c04615b668296d4a91a999d82d7a17" Jan 20 18:43:20 crc kubenswrapper[4773]: I0120 18:43:20.938126 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3aa0afd8b11fc654ac7e463b11879cdb4c04615b668296d4a91a999d82d7a17"} err="failed to get container status \"b3aa0afd8b11fc654ac7e463b11879cdb4c04615b668296d4a91a999d82d7a17\": rpc error: code = NotFound desc = could not find container \"b3aa0afd8b11fc654ac7e463b11879cdb4c04615b668296d4a91a999d82d7a17\": container with ID starting with b3aa0afd8b11fc654ac7e463b11879cdb4c04615b668296d4a91a999d82d7a17 not found: ID does not exist" Jan 20 18:43:21 crc kubenswrapper[4773]: I0120 18:43:21.457511 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc61da58-a0a9-4c56-bd9f-d84ac0474556" path="/var/lib/kubelet/pods/cc61da58-a0a9-4c56-bd9f-d84ac0474556/volumes" Jan 20 18:43:28 crc kubenswrapper[4773]: I0120 18:43:28.170192 4773 patch_prober.go:28] interesting pod/machine-config-daemon-sq4x7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 18:43:28 crc kubenswrapper[4773]: I0120 
18:43:28.170775 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 18:43:28 crc kubenswrapper[4773]: I0120 18:43:28.170821 4773 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" Jan 20 18:43:28 crc kubenswrapper[4773]: I0120 18:43:28.171373 4773 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"714571a77485b95d4127b785d445e091e7d21c1d67336a6816b862641584bfce"} pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 20 18:43:28 crc kubenswrapper[4773]: I0120 18:43:28.171421 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" containerID="cri-o://714571a77485b95d4127b785d445e091e7d21c1d67336a6816b862641584bfce" gracePeriod=600 Jan 20 18:43:28 crc kubenswrapper[4773]: I0120 18:43:28.832227 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcjvg7g"] Jan 20 18:43:28 crc kubenswrapper[4773]: E0120 18:43:28.832967 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc61da58-a0a9-4c56-bd9f-d84ac0474556" containerName="extract-content" Jan 20 18:43:28 crc kubenswrapper[4773]: I0120 18:43:28.832983 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc61da58-a0a9-4c56-bd9f-d84ac0474556" containerName="extract-content" Jan 20 
18:43:28 crc kubenswrapper[4773]: E0120 18:43:28.832992 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc61da58-a0a9-4c56-bd9f-d84ac0474556" containerName="registry-server" Jan 20 18:43:28 crc kubenswrapper[4773]: I0120 18:43:28.832999 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc61da58-a0a9-4c56-bd9f-d84ac0474556" containerName="registry-server" Jan 20 18:43:28 crc kubenswrapper[4773]: E0120 18:43:28.833007 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c2883aa-bc8e-4893-807a-f32cbd1ff77d" containerName="extract-utilities" Jan 20 18:43:28 crc kubenswrapper[4773]: I0120 18:43:28.833014 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c2883aa-bc8e-4893-807a-f32cbd1ff77d" containerName="extract-utilities" Jan 20 18:43:28 crc kubenswrapper[4773]: E0120 18:43:28.833029 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c2883aa-bc8e-4893-807a-f32cbd1ff77d" containerName="registry-server" Jan 20 18:43:28 crc kubenswrapper[4773]: I0120 18:43:28.833035 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c2883aa-bc8e-4893-807a-f32cbd1ff77d" containerName="registry-server" Jan 20 18:43:28 crc kubenswrapper[4773]: E0120 18:43:28.833047 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c2883aa-bc8e-4893-807a-f32cbd1ff77d" containerName="extract-content" Jan 20 18:43:28 crc kubenswrapper[4773]: I0120 18:43:28.833054 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c2883aa-bc8e-4893-807a-f32cbd1ff77d" containerName="extract-content" Jan 20 18:43:28 crc kubenswrapper[4773]: E0120 18:43:28.833067 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc61da58-a0a9-4c56-bd9f-d84ac0474556" containerName="extract-utilities" Jan 20 18:43:28 crc kubenswrapper[4773]: I0120 18:43:28.833073 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc61da58-a0a9-4c56-bd9f-d84ac0474556" containerName="extract-utilities" Jan 20 
18:43:28 crc kubenswrapper[4773]: I0120 18:43:28.833190 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc61da58-a0a9-4c56-bd9f-d84ac0474556" containerName="registry-server" Jan 20 18:43:28 crc kubenswrapper[4773]: I0120 18:43:28.833201 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c2883aa-bc8e-4893-807a-f32cbd1ff77d" containerName="registry-server" Jan 20 18:43:28 crc kubenswrapper[4773]: I0120 18:43:28.834039 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcjvg7g" Jan 20 18:43:28 crc kubenswrapper[4773]: I0120 18:43:28.838469 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Jan 20 18:43:28 crc kubenswrapper[4773]: I0120 18:43:28.842258 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcjvg7g"] Jan 20 18:43:28 crc kubenswrapper[4773]: I0120 18:43:28.910048 4773 generic.go:334] "Generic (PLEG): container finished" podID="1ddd934f-f012-4083-b5e6-b99711071621" containerID="714571a77485b95d4127b785d445e091e7d21c1d67336a6816b862641584bfce" exitCode=0 Jan 20 18:43:28 crc kubenswrapper[4773]: I0120 18:43:28.910091 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" event={"ID":"1ddd934f-f012-4083-b5e6-b99711071621","Type":"ContainerDied","Data":"714571a77485b95d4127b785d445e091e7d21c1d67336a6816b862641584bfce"} Jan 20 18:43:28 crc kubenswrapper[4773]: I0120 18:43:28.910463 4773 scope.go:117] "RemoveContainer" containerID="8ab3757bc284c8c9f1f813b678bd9bbed50bf491e47d29da19e04261db6c0c92" Jan 20 18:43:28 crc kubenswrapper[4773]: I0120 18:43:28.910914 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" 
event={"ID":"1ddd934f-f012-4083-b5e6-b99711071621","Type":"ContainerStarted","Data":"89086664f3aacadd154bb5a0e821ec93e674c41f0d2d3c8a5f423e5e3f0c2f57"} Jan 20 18:43:28 crc kubenswrapper[4773]: I0120 18:43:28.917490 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hbmpx\" (UniqueName: \"kubernetes.io/projected/8e0b8536-2fc2-4203-a22f-a2dc29d0b737-kube-api-access-hbmpx\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcjvg7g\" (UID: \"8e0b8536-2fc2-4203-a22f-a2dc29d0b737\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcjvg7g" Jan 20 18:43:28 crc kubenswrapper[4773]: I0120 18:43:28.917612 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8e0b8536-2fc2-4203-a22f-a2dc29d0b737-bundle\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcjvg7g\" (UID: \"8e0b8536-2fc2-4203-a22f-a2dc29d0b737\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcjvg7g" Jan 20 18:43:28 crc kubenswrapper[4773]: I0120 18:43:28.917960 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8e0b8536-2fc2-4203-a22f-a2dc29d0b737-util\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcjvg7g\" (UID: \"8e0b8536-2fc2-4203-a22f-a2dc29d0b737\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcjvg7g" Jan 20 18:43:29 crc kubenswrapper[4773]: I0120 18:43:29.019023 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hbmpx\" (UniqueName: \"kubernetes.io/projected/8e0b8536-2fc2-4203-a22f-a2dc29d0b737-kube-api-access-hbmpx\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcjvg7g\" (UID: \"8e0b8536-2fc2-4203-a22f-a2dc29d0b737\") " 
pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcjvg7g" Jan 20 18:43:29 crc kubenswrapper[4773]: I0120 18:43:29.019130 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8e0b8536-2fc2-4203-a22f-a2dc29d0b737-bundle\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcjvg7g\" (UID: \"8e0b8536-2fc2-4203-a22f-a2dc29d0b737\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcjvg7g" Jan 20 18:43:29 crc kubenswrapper[4773]: I0120 18:43:29.019176 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8e0b8536-2fc2-4203-a22f-a2dc29d0b737-util\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcjvg7g\" (UID: \"8e0b8536-2fc2-4203-a22f-a2dc29d0b737\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcjvg7g" Jan 20 18:43:29 crc kubenswrapper[4773]: I0120 18:43:29.019751 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8e0b8536-2fc2-4203-a22f-a2dc29d0b737-util\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcjvg7g\" (UID: \"8e0b8536-2fc2-4203-a22f-a2dc29d0b737\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcjvg7g" Jan 20 18:43:29 crc kubenswrapper[4773]: I0120 18:43:29.019836 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8e0b8536-2fc2-4203-a22f-a2dc29d0b737-bundle\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcjvg7g\" (UID: \"8e0b8536-2fc2-4203-a22f-a2dc29d0b737\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcjvg7g" Jan 20 18:43:29 crc kubenswrapper[4773]: I0120 18:43:29.047872 4773 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-hbmpx\" (UniqueName: \"kubernetes.io/projected/8e0b8536-2fc2-4203-a22f-a2dc29d0b737-kube-api-access-hbmpx\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcjvg7g\" (UID: \"8e0b8536-2fc2-4203-a22f-a2dc29d0b737\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcjvg7g" Jan 20 18:43:29 crc kubenswrapper[4773]: I0120 18:43:29.150625 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcjvg7g" Jan 20 18:43:29 crc kubenswrapper[4773]: I0120 18:43:29.369470 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcjvg7g"] Jan 20 18:43:29 crc kubenswrapper[4773]: W0120 18:43:29.375023 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e0b8536_2fc2_4203_a22f_a2dc29d0b737.slice/crio-f93fef1760924c33c57ea61008cf363cfa8ad9ba6ee9ca6195041959d2bec8dd WatchSource:0}: Error finding container f93fef1760924c33c57ea61008cf363cfa8ad9ba6ee9ca6195041959d2bec8dd: Status 404 returned error can't find the container with id f93fef1760924c33c57ea61008cf363cfa8ad9ba6ee9ca6195041959d2bec8dd Jan 20 18:43:29 crc kubenswrapper[4773]: I0120 18:43:29.922500 4773 generic.go:334] "Generic (PLEG): container finished" podID="8e0b8536-2fc2-4203-a22f-a2dc29d0b737" containerID="b8c7934e89e94988867800ba6aa7353813716b12795088fc773b31014106afe8" exitCode=0 Jan 20 18:43:29 crc kubenswrapper[4773]: I0120 18:43:29.922909 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcjvg7g" event={"ID":"8e0b8536-2fc2-4203-a22f-a2dc29d0b737","Type":"ContainerDied","Data":"b8c7934e89e94988867800ba6aa7353813716b12795088fc773b31014106afe8"} Jan 20 18:43:29 crc 
kubenswrapper[4773]: I0120 18:43:29.922994 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcjvg7g" event={"ID":"8e0b8536-2fc2-4203-a22f-a2dc29d0b737","Type":"ContainerStarted","Data":"f93fef1760924c33c57ea61008cf363cfa8ad9ba6ee9ca6195041959d2bec8dd"} Jan 20 18:43:31 crc kubenswrapper[4773]: I0120 18:43:31.882391 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-9nh6h" podUID="ba3736bb-3d36-4a0c-91fa-85f410849312" containerName="console" containerID="cri-o://9a1c0fbbefd03d4c820bbc1556885265bb7e47ee9bb1d58ad84f3c53b88d3367" gracePeriod=15 Jan 20 18:43:31 crc kubenswrapper[4773]: I0120 18:43:31.935230 4773 generic.go:334] "Generic (PLEG): container finished" podID="8e0b8536-2fc2-4203-a22f-a2dc29d0b737" containerID="34f15b6ad0b93f7ef4e8ebecf6f99e7748713602c43c29180183e7a27121e6af" exitCode=0 Jan 20 18:43:31 crc kubenswrapper[4773]: I0120 18:43:31.935277 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcjvg7g" event={"ID":"8e0b8536-2fc2-4203-a22f-a2dc29d0b737","Type":"ContainerDied","Data":"34f15b6ad0b93f7ef4e8ebecf6f99e7748713602c43c29180183e7a27121e6af"} Jan 20 18:43:32 crc kubenswrapper[4773]: I0120 18:43:32.269811 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-9nh6h_ba3736bb-3d36-4a0c-91fa-85f410849312/console/0.log" Jan 20 18:43:32 crc kubenswrapper[4773]: I0120 18:43:32.270409 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-9nh6h" Jan 20 18:43:32 crc kubenswrapper[4773]: I0120 18:43:32.462450 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ba3736bb-3d36-4a0c-91fa-85f410849312-service-ca\") pod \"ba3736bb-3d36-4a0c-91fa-85f410849312\" (UID: \"ba3736bb-3d36-4a0c-91fa-85f410849312\") " Jan 20 18:43:32 crc kubenswrapper[4773]: I0120 18:43:32.462504 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ba3736bb-3d36-4a0c-91fa-85f410849312-console-serving-cert\") pod \"ba3736bb-3d36-4a0c-91fa-85f410849312\" (UID: \"ba3736bb-3d36-4a0c-91fa-85f410849312\") " Jan 20 18:43:32 crc kubenswrapper[4773]: I0120 18:43:32.462547 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8ddbv\" (UniqueName: \"kubernetes.io/projected/ba3736bb-3d36-4a0c-91fa-85f410849312-kube-api-access-8ddbv\") pod \"ba3736bb-3d36-4a0c-91fa-85f410849312\" (UID: \"ba3736bb-3d36-4a0c-91fa-85f410849312\") " Jan 20 18:43:32 crc kubenswrapper[4773]: I0120 18:43:32.462888 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ba3736bb-3d36-4a0c-91fa-85f410849312-console-config\") pod \"ba3736bb-3d36-4a0c-91fa-85f410849312\" (UID: \"ba3736bb-3d36-4a0c-91fa-85f410849312\") " Jan 20 18:43:32 crc kubenswrapper[4773]: I0120 18:43:32.463025 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ba3736bb-3d36-4a0c-91fa-85f410849312-trusted-ca-bundle\") pod \"ba3736bb-3d36-4a0c-91fa-85f410849312\" (UID: \"ba3736bb-3d36-4a0c-91fa-85f410849312\") " Jan 20 18:43:32 crc kubenswrapper[4773]: I0120 18:43:32.463079 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ba3736bb-3d36-4a0c-91fa-85f410849312-console-oauth-config\") pod \"ba3736bb-3d36-4a0c-91fa-85f410849312\" (UID: \"ba3736bb-3d36-4a0c-91fa-85f410849312\") " Jan 20 18:43:32 crc kubenswrapper[4773]: I0120 18:43:32.463156 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ba3736bb-3d36-4a0c-91fa-85f410849312-oauth-serving-cert\") pod \"ba3736bb-3d36-4a0c-91fa-85f410849312\" (UID: \"ba3736bb-3d36-4a0c-91fa-85f410849312\") " Jan 20 18:43:32 crc kubenswrapper[4773]: I0120 18:43:32.463668 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba3736bb-3d36-4a0c-91fa-85f410849312-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "ba3736bb-3d36-4a0c-91fa-85f410849312" (UID: "ba3736bb-3d36-4a0c-91fa-85f410849312"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:43:32 crc kubenswrapper[4773]: I0120 18:43:32.463758 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba3736bb-3d36-4a0c-91fa-85f410849312-service-ca" (OuterVolumeSpecName: "service-ca") pod "ba3736bb-3d36-4a0c-91fa-85f410849312" (UID: "ba3736bb-3d36-4a0c-91fa-85f410849312"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:43:32 crc kubenswrapper[4773]: I0120 18:43:32.463800 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba3736bb-3d36-4a0c-91fa-85f410849312-console-config" (OuterVolumeSpecName: "console-config") pod "ba3736bb-3d36-4a0c-91fa-85f410849312" (UID: "ba3736bb-3d36-4a0c-91fa-85f410849312"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:43:32 crc kubenswrapper[4773]: I0120 18:43:32.463756 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba3736bb-3d36-4a0c-91fa-85f410849312-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "ba3736bb-3d36-4a0c-91fa-85f410849312" (UID: "ba3736bb-3d36-4a0c-91fa-85f410849312"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:43:32 crc kubenswrapper[4773]: I0120 18:43:32.468813 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba3736bb-3d36-4a0c-91fa-85f410849312-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "ba3736bb-3d36-4a0c-91fa-85f410849312" (UID: "ba3736bb-3d36-4a0c-91fa-85f410849312"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:43:32 crc kubenswrapper[4773]: I0120 18:43:32.469248 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ba3736bb-3d36-4a0c-91fa-85f410849312-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "ba3736bb-3d36-4a0c-91fa-85f410849312" (UID: "ba3736bb-3d36-4a0c-91fa-85f410849312"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:43:32 crc kubenswrapper[4773]: I0120 18:43:32.469291 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba3736bb-3d36-4a0c-91fa-85f410849312-kube-api-access-8ddbv" (OuterVolumeSpecName: "kube-api-access-8ddbv") pod "ba3736bb-3d36-4a0c-91fa-85f410849312" (UID: "ba3736bb-3d36-4a0c-91fa-85f410849312"). InnerVolumeSpecName "kube-api-access-8ddbv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:43:32 crc kubenswrapper[4773]: I0120 18:43:32.565125 4773 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ba3736bb-3d36-4a0c-91fa-85f410849312-service-ca\") on node \"crc\" DevicePath \"\"" Jan 20 18:43:32 crc kubenswrapper[4773]: I0120 18:43:32.565204 4773 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ba3736bb-3d36-4a0c-91fa-85f410849312-console-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 20 18:43:32 crc kubenswrapper[4773]: I0120 18:43:32.565232 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8ddbv\" (UniqueName: \"kubernetes.io/projected/ba3736bb-3d36-4a0c-91fa-85f410849312-kube-api-access-8ddbv\") on node \"crc\" DevicePath \"\"" Jan 20 18:43:32 crc kubenswrapper[4773]: I0120 18:43:32.565246 4773 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ba3736bb-3d36-4a0c-91fa-85f410849312-console-config\") on node \"crc\" DevicePath \"\"" Jan 20 18:43:32 crc kubenswrapper[4773]: I0120 18:43:32.565260 4773 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ba3736bb-3d36-4a0c-91fa-85f410849312-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 18:43:32 crc kubenswrapper[4773]: I0120 18:43:32.565273 4773 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ba3736bb-3d36-4a0c-91fa-85f410849312-console-oauth-config\") on node \"crc\" DevicePath \"\"" Jan 20 18:43:32 crc kubenswrapper[4773]: I0120 18:43:32.565286 4773 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ba3736bb-3d36-4a0c-91fa-85f410849312-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 20 18:43:32 crc 
kubenswrapper[4773]: I0120 18:43:32.944360 4773 generic.go:334] "Generic (PLEG): container finished" podID="8e0b8536-2fc2-4203-a22f-a2dc29d0b737" containerID="44de06d7de234234dff221a8c874efef7482abe3382b295c53f3bd9b7efd8e03" exitCode=0 Jan 20 18:43:32 crc kubenswrapper[4773]: I0120 18:43:32.944434 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcjvg7g" event={"ID":"8e0b8536-2fc2-4203-a22f-a2dc29d0b737","Type":"ContainerDied","Data":"44de06d7de234234dff221a8c874efef7482abe3382b295c53f3bd9b7efd8e03"} Jan 20 18:43:32 crc kubenswrapper[4773]: I0120 18:43:32.946370 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-9nh6h_ba3736bb-3d36-4a0c-91fa-85f410849312/console/0.log" Jan 20 18:43:32 crc kubenswrapper[4773]: I0120 18:43:32.946400 4773 generic.go:334] "Generic (PLEG): container finished" podID="ba3736bb-3d36-4a0c-91fa-85f410849312" containerID="9a1c0fbbefd03d4c820bbc1556885265bb7e47ee9bb1d58ad84f3c53b88d3367" exitCode=2 Jan 20 18:43:32 crc kubenswrapper[4773]: I0120 18:43:32.946424 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-9nh6h" event={"ID":"ba3736bb-3d36-4a0c-91fa-85f410849312","Type":"ContainerDied","Data":"9a1c0fbbefd03d4c820bbc1556885265bb7e47ee9bb1d58ad84f3c53b88d3367"} Jan 20 18:43:32 crc kubenswrapper[4773]: I0120 18:43:32.946448 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-9nh6h" event={"ID":"ba3736bb-3d36-4a0c-91fa-85f410849312","Type":"ContainerDied","Data":"e7d6c34d2f903961f01ee5fdb975fd359e3c9fc20a9e900b7c3df54dd10fd2d7"} Jan 20 18:43:32 crc kubenswrapper[4773]: I0120 18:43:32.946465 4773 scope.go:117] "RemoveContainer" containerID="9a1c0fbbefd03d4c820bbc1556885265bb7e47ee9bb1d58ad84f3c53b88d3367" Jan 20 18:43:32 crc kubenswrapper[4773]: I0120 18:43:32.946543 4773 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-console/console-f9d7485db-9nh6h" Jan 20 18:43:32 crc kubenswrapper[4773]: I0120 18:43:32.977295 4773 scope.go:117] "RemoveContainer" containerID="9a1c0fbbefd03d4c820bbc1556885265bb7e47ee9bb1d58ad84f3c53b88d3367" Jan 20 18:43:32 crc kubenswrapper[4773]: E0120 18:43:32.977907 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a1c0fbbefd03d4c820bbc1556885265bb7e47ee9bb1d58ad84f3c53b88d3367\": container with ID starting with 9a1c0fbbefd03d4c820bbc1556885265bb7e47ee9bb1d58ad84f3c53b88d3367 not found: ID does not exist" containerID="9a1c0fbbefd03d4c820bbc1556885265bb7e47ee9bb1d58ad84f3c53b88d3367" Jan 20 18:43:32 crc kubenswrapper[4773]: I0120 18:43:32.978055 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a1c0fbbefd03d4c820bbc1556885265bb7e47ee9bb1d58ad84f3c53b88d3367"} err="failed to get container status \"9a1c0fbbefd03d4c820bbc1556885265bb7e47ee9bb1d58ad84f3c53b88d3367\": rpc error: code = NotFound desc = could not find container \"9a1c0fbbefd03d4c820bbc1556885265bb7e47ee9bb1d58ad84f3c53b88d3367\": container with ID starting with 9a1c0fbbefd03d4c820bbc1556885265bb7e47ee9bb1d58ad84f3c53b88d3367 not found: ID does not exist" Jan 20 18:43:32 crc kubenswrapper[4773]: I0120 18:43:32.988080 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-9nh6h"] Jan 20 18:43:32 crc kubenswrapper[4773]: I0120 18:43:32.992505 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-9nh6h"] Jan 20 18:43:33 crc kubenswrapper[4773]: I0120 18:43:33.455348 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba3736bb-3d36-4a0c-91fa-85f410849312" path="/var/lib/kubelet/pods/ba3736bb-3d36-4a0c-91fa-85f410849312/volumes" Jan 20 18:43:34 crc kubenswrapper[4773]: I0120 18:43:34.161205 4773 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcjvg7g" Jan 20 18:43:34 crc kubenswrapper[4773]: I0120 18:43:34.285839 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8e0b8536-2fc2-4203-a22f-a2dc29d0b737-bundle\") pod \"8e0b8536-2fc2-4203-a22f-a2dc29d0b737\" (UID: \"8e0b8536-2fc2-4203-a22f-a2dc29d0b737\") " Jan 20 18:43:34 crc kubenswrapper[4773]: I0120 18:43:34.286253 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hbmpx\" (UniqueName: \"kubernetes.io/projected/8e0b8536-2fc2-4203-a22f-a2dc29d0b737-kube-api-access-hbmpx\") pod \"8e0b8536-2fc2-4203-a22f-a2dc29d0b737\" (UID: \"8e0b8536-2fc2-4203-a22f-a2dc29d0b737\") " Jan 20 18:43:34 crc kubenswrapper[4773]: I0120 18:43:34.286337 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8e0b8536-2fc2-4203-a22f-a2dc29d0b737-util\") pod \"8e0b8536-2fc2-4203-a22f-a2dc29d0b737\" (UID: \"8e0b8536-2fc2-4203-a22f-a2dc29d0b737\") " Jan 20 18:43:34 crc kubenswrapper[4773]: I0120 18:43:34.286948 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e0b8536-2fc2-4203-a22f-a2dc29d0b737-bundle" (OuterVolumeSpecName: "bundle") pod "8e0b8536-2fc2-4203-a22f-a2dc29d0b737" (UID: "8e0b8536-2fc2-4203-a22f-a2dc29d0b737"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:43:34 crc kubenswrapper[4773]: I0120 18:43:34.290426 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8e0b8536-2fc2-4203-a22f-a2dc29d0b737-kube-api-access-hbmpx" (OuterVolumeSpecName: "kube-api-access-hbmpx") pod "8e0b8536-2fc2-4203-a22f-a2dc29d0b737" (UID: "8e0b8536-2fc2-4203-a22f-a2dc29d0b737"). 
InnerVolumeSpecName "kube-api-access-hbmpx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:43:34 crc kubenswrapper[4773]: I0120 18:43:34.300857 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8e0b8536-2fc2-4203-a22f-a2dc29d0b737-util" (OuterVolumeSpecName: "util") pod "8e0b8536-2fc2-4203-a22f-a2dc29d0b737" (UID: "8e0b8536-2fc2-4203-a22f-a2dc29d0b737"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:43:34 crc kubenswrapper[4773]: I0120 18:43:34.386967 4773 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8e0b8536-2fc2-4203-a22f-a2dc29d0b737-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 18:43:34 crc kubenswrapper[4773]: I0120 18:43:34.387229 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hbmpx\" (UniqueName: \"kubernetes.io/projected/8e0b8536-2fc2-4203-a22f-a2dc29d0b737-kube-api-access-hbmpx\") on node \"crc\" DevicePath \"\"" Jan 20 18:43:34 crc kubenswrapper[4773]: I0120 18:43:34.387240 4773 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8e0b8536-2fc2-4203-a22f-a2dc29d0b737-util\") on node \"crc\" DevicePath \"\"" Jan 20 18:43:34 crc kubenswrapper[4773]: I0120 18:43:34.965434 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcjvg7g" event={"ID":"8e0b8536-2fc2-4203-a22f-a2dc29d0b737","Type":"ContainerDied","Data":"f93fef1760924c33c57ea61008cf363cfa8ad9ba6ee9ca6195041959d2bec8dd"} Jan 20 18:43:34 crc kubenswrapper[4773]: I0120 18:43:34.965721 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f93fef1760924c33c57ea61008cf363cfa8ad9ba6ee9ca6195041959d2bec8dd" Jan 20 18:43:34 crc kubenswrapper[4773]: I0120 18:43:34.965561 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcjvg7g" Jan 20 18:43:44 crc kubenswrapper[4773]: I0120 18:43:44.235582 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-58b89bff97-7jrvm"] Jan 20 18:43:44 crc kubenswrapper[4773]: E0120 18:43:44.236181 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e0b8536-2fc2-4203-a22f-a2dc29d0b737" containerName="util" Jan 20 18:43:44 crc kubenswrapper[4773]: I0120 18:43:44.236196 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e0b8536-2fc2-4203-a22f-a2dc29d0b737" containerName="util" Jan 20 18:43:44 crc kubenswrapper[4773]: E0120 18:43:44.236211 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e0b8536-2fc2-4203-a22f-a2dc29d0b737" containerName="pull" Jan 20 18:43:44 crc kubenswrapper[4773]: I0120 18:43:44.236218 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e0b8536-2fc2-4203-a22f-a2dc29d0b737" containerName="pull" Jan 20 18:43:44 crc kubenswrapper[4773]: E0120 18:43:44.236235 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e0b8536-2fc2-4203-a22f-a2dc29d0b737" containerName="extract" Jan 20 18:43:44 crc kubenswrapper[4773]: I0120 18:43:44.236243 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e0b8536-2fc2-4203-a22f-a2dc29d0b737" containerName="extract" Jan 20 18:43:44 crc kubenswrapper[4773]: E0120 18:43:44.236253 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba3736bb-3d36-4a0c-91fa-85f410849312" containerName="console" Jan 20 18:43:44 crc kubenswrapper[4773]: I0120 18:43:44.236260 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba3736bb-3d36-4a0c-91fa-85f410849312" containerName="console" Jan 20 18:43:44 crc kubenswrapper[4773]: I0120 18:43:44.236386 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e0b8536-2fc2-4203-a22f-a2dc29d0b737" 
containerName="extract" Jan 20 18:43:44 crc kubenswrapper[4773]: I0120 18:43:44.236396 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba3736bb-3d36-4a0c-91fa-85f410849312" containerName="console" Jan 20 18:43:44 crc kubenswrapper[4773]: I0120 18:43:44.236855 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-58b89bff97-7jrvm" Jan 20 18:43:44 crc kubenswrapper[4773]: I0120 18:43:44.238581 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Jan 20 18:43:44 crc kubenswrapper[4773]: I0120 18:43:44.238615 4773 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-qnrhn" Jan 20 18:43:44 crc kubenswrapper[4773]: I0120 18:43:44.238860 4773 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Jan 20 18:43:44 crc kubenswrapper[4773]: I0120 18:43:44.239298 4773 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Jan 20 18:43:44 crc kubenswrapper[4773]: I0120 18:43:44.247702 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Jan 20 18:43:44 crc kubenswrapper[4773]: I0120 18:43:44.249612 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-58b89bff97-7jrvm"] Jan 20 18:43:44 crc kubenswrapper[4773]: I0120 18:43:44.265832 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a93bbf26-2683-4cf0-a45a-1639d6da4e01-apiservice-cert\") pod \"metallb-operator-controller-manager-58b89bff97-7jrvm\" (UID: \"a93bbf26-2683-4cf0-a45a-1639d6da4e01\") " pod="metallb-system/metallb-operator-controller-manager-58b89bff97-7jrvm" Jan 
20 18:43:44 crc kubenswrapper[4773]: I0120 18:43:44.265906 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26b2f\" (UniqueName: \"kubernetes.io/projected/a93bbf26-2683-4cf0-a45a-1639d6da4e01-kube-api-access-26b2f\") pod \"metallb-operator-controller-manager-58b89bff97-7jrvm\" (UID: \"a93bbf26-2683-4cf0-a45a-1639d6da4e01\") " pod="metallb-system/metallb-operator-controller-manager-58b89bff97-7jrvm" Jan 20 18:43:44 crc kubenswrapper[4773]: I0120 18:43:44.266025 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a93bbf26-2683-4cf0-a45a-1639d6da4e01-webhook-cert\") pod \"metallb-operator-controller-manager-58b89bff97-7jrvm\" (UID: \"a93bbf26-2683-4cf0-a45a-1639d6da4e01\") " pod="metallb-system/metallb-operator-controller-manager-58b89bff97-7jrvm" Jan 20 18:43:44 crc kubenswrapper[4773]: I0120 18:43:44.366693 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26b2f\" (UniqueName: \"kubernetes.io/projected/a93bbf26-2683-4cf0-a45a-1639d6da4e01-kube-api-access-26b2f\") pod \"metallb-operator-controller-manager-58b89bff97-7jrvm\" (UID: \"a93bbf26-2683-4cf0-a45a-1639d6da4e01\") " pod="metallb-system/metallb-operator-controller-manager-58b89bff97-7jrvm" Jan 20 18:43:44 crc kubenswrapper[4773]: I0120 18:43:44.366755 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a93bbf26-2683-4cf0-a45a-1639d6da4e01-webhook-cert\") pod \"metallb-operator-controller-manager-58b89bff97-7jrvm\" (UID: \"a93bbf26-2683-4cf0-a45a-1639d6da4e01\") " pod="metallb-system/metallb-operator-controller-manager-58b89bff97-7jrvm" Jan 20 18:43:44 crc kubenswrapper[4773]: I0120 18:43:44.366835 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/a93bbf26-2683-4cf0-a45a-1639d6da4e01-apiservice-cert\") pod \"metallb-operator-controller-manager-58b89bff97-7jrvm\" (UID: \"a93bbf26-2683-4cf0-a45a-1639d6da4e01\") " pod="metallb-system/metallb-operator-controller-manager-58b89bff97-7jrvm" Jan 20 18:43:44 crc kubenswrapper[4773]: I0120 18:43:44.373396 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/a93bbf26-2683-4cf0-a45a-1639d6da4e01-apiservice-cert\") pod \"metallb-operator-controller-manager-58b89bff97-7jrvm\" (UID: \"a93bbf26-2683-4cf0-a45a-1639d6da4e01\") " pod="metallb-system/metallb-operator-controller-manager-58b89bff97-7jrvm" Jan 20 18:43:44 crc kubenswrapper[4773]: I0120 18:43:44.378632 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a93bbf26-2683-4cf0-a45a-1639d6da4e01-webhook-cert\") pod \"metallb-operator-controller-manager-58b89bff97-7jrvm\" (UID: \"a93bbf26-2683-4cf0-a45a-1639d6da4e01\") " pod="metallb-system/metallb-operator-controller-manager-58b89bff97-7jrvm" Jan 20 18:43:44 crc kubenswrapper[4773]: I0120 18:43:44.385361 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26b2f\" (UniqueName: \"kubernetes.io/projected/a93bbf26-2683-4cf0-a45a-1639d6da4e01-kube-api-access-26b2f\") pod \"metallb-operator-controller-manager-58b89bff97-7jrvm\" (UID: \"a93bbf26-2683-4cf0-a45a-1639d6da4e01\") " pod="metallb-system/metallb-operator-controller-manager-58b89bff97-7jrvm" Jan 20 18:43:44 crc kubenswrapper[4773]: I0120 18:43:44.488824 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-6cbdbfd488-hxn9x"] Jan 20 18:43:44 crc kubenswrapper[4773]: I0120 18:43:44.489565 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-6cbdbfd488-hxn9x" Jan 20 18:43:44 crc kubenswrapper[4773]: I0120 18:43:44.491767 4773 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Jan 20 18:43:44 crc kubenswrapper[4773]: I0120 18:43:44.491982 4773 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-bp4hd" Jan 20 18:43:44 crc kubenswrapper[4773]: I0120 18:43:44.493052 4773 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Jan 20 18:43:44 crc kubenswrapper[4773]: I0120 18:43:44.504869 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-6cbdbfd488-hxn9x"] Jan 20 18:43:44 crc kubenswrapper[4773]: I0120 18:43:44.554854 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-58b89bff97-7jrvm" Jan 20 18:43:44 crc kubenswrapper[4773]: I0120 18:43:44.671142 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/cbba9cb2-22ec-4f8c-8550-f3a69901785c-webhook-cert\") pod \"metallb-operator-webhook-server-6cbdbfd488-hxn9x\" (UID: \"cbba9cb2-22ec-4f8c-8550-f3a69901785c\") " pod="metallb-system/metallb-operator-webhook-server-6cbdbfd488-hxn9x" Jan 20 18:43:44 crc kubenswrapper[4773]: I0120 18:43:44.671445 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/cbba9cb2-22ec-4f8c-8550-f3a69901785c-apiservice-cert\") pod \"metallb-operator-webhook-server-6cbdbfd488-hxn9x\" (UID: \"cbba9cb2-22ec-4f8c-8550-f3a69901785c\") " pod="metallb-system/metallb-operator-webhook-server-6cbdbfd488-hxn9x" Jan 20 18:43:44 crc kubenswrapper[4773]: I0120 18:43:44.671504 
4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gt5n5\" (UniqueName: \"kubernetes.io/projected/cbba9cb2-22ec-4f8c-8550-f3a69901785c-kube-api-access-gt5n5\") pod \"metallb-operator-webhook-server-6cbdbfd488-hxn9x\" (UID: \"cbba9cb2-22ec-4f8c-8550-f3a69901785c\") " pod="metallb-system/metallb-operator-webhook-server-6cbdbfd488-hxn9x" Jan 20 18:43:44 crc kubenswrapper[4773]: I0120 18:43:44.772394 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/cbba9cb2-22ec-4f8c-8550-f3a69901785c-webhook-cert\") pod \"metallb-operator-webhook-server-6cbdbfd488-hxn9x\" (UID: \"cbba9cb2-22ec-4f8c-8550-f3a69901785c\") " pod="metallb-system/metallb-operator-webhook-server-6cbdbfd488-hxn9x" Jan 20 18:43:44 crc kubenswrapper[4773]: I0120 18:43:44.772456 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/cbba9cb2-22ec-4f8c-8550-f3a69901785c-apiservice-cert\") pod \"metallb-operator-webhook-server-6cbdbfd488-hxn9x\" (UID: \"cbba9cb2-22ec-4f8c-8550-f3a69901785c\") " pod="metallb-system/metallb-operator-webhook-server-6cbdbfd488-hxn9x" Jan 20 18:43:44 crc kubenswrapper[4773]: I0120 18:43:44.772490 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gt5n5\" (UniqueName: \"kubernetes.io/projected/cbba9cb2-22ec-4f8c-8550-f3a69901785c-kube-api-access-gt5n5\") pod \"metallb-operator-webhook-server-6cbdbfd488-hxn9x\" (UID: \"cbba9cb2-22ec-4f8c-8550-f3a69901785c\") " pod="metallb-system/metallb-operator-webhook-server-6cbdbfd488-hxn9x" Jan 20 18:43:44 crc kubenswrapper[4773]: I0120 18:43:44.779013 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/cbba9cb2-22ec-4f8c-8550-f3a69901785c-webhook-cert\") pod 
\"metallb-operator-webhook-server-6cbdbfd488-hxn9x\" (UID: \"cbba9cb2-22ec-4f8c-8550-f3a69901785c\") " pod="metallb-system/metallb-operator-webhook-server-6cbdbfd488-hxn9x" Jan 20 18:43:44 crc kubenswrapper[4773]: I0120 18:43:44.789235 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gt5n5\" (UniqueName: \"kubernetes.io/projected/cbba9cb2-22ec-4f8c-8550-f3a69901785c-kube-api-access-gt5n5\") pod \"metallb-operator-webhook-server-6cbdbfd488-hxn9x\" (UID: \"cbba9cb2-22ec-4f8c-8550-f3a69901785c\") " pod="metallb-system/metallb-operator-webhook-server-6cbdbfd488-hxn9x" Jan 20 18:43:44 crc kubenswrapper[4773]: I0120 18:43:44.801957 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/cbba9cb2-22ec-4f8c-8550-f3a69901785c-apiservice-cert\") pod \"metallb-operator-webhook-server-6cbdbfd488-hxn9x\" (UID: \"cbba9cb2-22ec-4f8c-8550-f3a69901785c\") " pod="metallb-system/metallb-operator-webhook-server-6cbdbfd488-hxn9x" Jan 20 18:43:44 crc kubenswrapper[4773]: I0120 18:43:44.813262 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-6cbdbfd488-hxn9x" Jan 20 18:43:45 crc kubenswrapper[4773]: I0120 18:43:45.032753 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-58b89bff97-7jrvm"] Jan 20 18:43:45 crc kubenswrapper[4773]: I0120 18:43:45.306694 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-6cbdbfd488-hxn9x"] Jan 20 18:43:45 crc kubenswrapper[4773]: W0120 18:43:45.309646 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcbba9cb2_22ec_4f8c_8550_f3a69901785c.slice/crio-f2953e773561b266ba7d784c947184c65c5005c8a7ffb0df87c5bc4294bbc72a WatchSource:0}: Error finding container f2953e773561b266ba7d784c947184c65c5005c8a7ffb0df87c5bc4294bbc72a: Status 404 returned error can't find the container with id f2953e773561b266ba7d784c947184c65c5005c8a7ffb0df87c5bc4294bbc72a Jan 20 18:43:46 crc kubenswrapper[4773]: I0120 18:43:46.023259 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-6cbdbfd488-hxn9x" event={"ID":"cbba9cb2-22ec-4f8c-8550-f3a69901785c","Type":"ContainerStarted","Data":"f2953e773561b266ba7d784c947184c65c5005c8a7ffb0df87c5bc4294bbc72a"} Jan 20 18:43:46 crc kubenswrapper[4773]: I0120 18:43:46.024669 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-58b89bff97-7jrvm" event={"ID":"a93bbf26-2683-4cf0-a45a-1639d6da4e01","Type":"ContainerStarted","Data":"1ff00c78a12dda61decf1f705cdae7b2aaaf9a2bd9a11bdb90f9209cd788f971"} Jan 20 18:43:50 crc kubenswrapper[4773]: I0120 18:43:50.050695 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-58b89bff97-7jrvm" 
event={"ID":"a93bbf26-2683-4cf0-a45a-1639d6da4e01","Type":"ContainerStarted","Data":"2842fe0fc789e8dc9af0a9e13ee0747e4a399fb4db9f2f5e9372832e1ad2d0d5"} Jan 20 18:43:50 crc kubenswrapper[4773]: I0120 18:43:50.051053 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-58b89bff97-7jrvm" Jan 20 18:43:50 crc kubenswrapper[4773]: I0120 18:43:50.053204 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-6cbdbfd488-hxn9x" event={"ID":"cbba9cb2-22ec-4f8c-8550-f3a69901785c","Type":"ContainerStarted","Data":"38132df97442f0f2cfd9500866f7693c1ed2218bb81bc3780ddda9306fb6d716"} Jan 20 18:43:50 crc kubenswrapper[4773]: I0120 18:43:50.053535 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-6cbdbfd488-hxn9x" Jan 20 18:43:50 crc kubenswrapper[4773]: I0120 18:43:50.080238 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-58b89bff97-7jrvm" podStartSLOduration=1.741248254 podStartE2EDuration="6.080217815s" podCreationTimestamp="2026-01-20 18:43:44 +0000 UTC" firstStartedPulling="2026-01-20 18:43:45.042164387 +0000 UTC m=+817.963977411" lastFinishedPulling="2026-01-20 18:43:49.381133948 +0000 UTC m=+822.302946972" observedRunningTime="2026-01-20 18:43:50.079311003 +0000 UTC m=+823.001124047" watchObservedRunningTime="2026-01-20 18:43:50.080217815 +0000 UTC m=+823.002030849" Jan 20 18:44:04 crc kubenswrapper[4773]: I0120 18:44:04.821311 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-6cbdbfd488-hxn9x" Jan 20 18:44:04 crc kubenswrapper[4773]: I0120 18:44:04.854014 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-6cbdbfd488-hxn9x" podStartSLOduration=16.763260217 
podStartE2EDuration="20.853994708s" podCreationTimestamp="2026-01-20 18:43:44 +0000 UTC" firstStartedPulling="2026-01-20 18:43:45.312754591 +0000 UTC m=+818.234567615" lastFinishedPulling="2026-01-20 18:43:49.403489082 +0000 UTC m=+822.325302106" observedRunningTime="2026-01-20 18:43:50.098459901 +0000 UTC m=+823.020272935" watchObservedRunningTime="2026-01-20 18:44:04.853994708 +0000 UTC m=+837.775807732" Jan 20 18:44:24 crc kubenswrapper[4773]: I0120 18:44:24.557357 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-58b89bff97-7jrvm" Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.228224 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7df86c4f6c-vv9kz"] Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.229417 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-vv9kz" Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.231285 4773 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.233270 4773 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-t5r27" Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.239344 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-l2dcx"] Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.280676 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-l2dcx" Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.289193 4773 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.289592 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.339993 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7df86c4f6c-vv9kz"] Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.376961 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-7xdr9"] Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.378232 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-7xdr9" Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.388966 4773 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.389397 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.389328 4773 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-stb8g" Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.389690 4773 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.411425 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f52c7\" (UniqueName: \"kubernetes.io/projected/859ada1b-1a7b-4032-a974-2ec3571aa069-kube-api-access-f52c7\") pod \"frr-k8s-l2dcx\" (UID: \"859ada1b-1a7b-4032-a974-2ec3571aa069\") " pod="metallb-system/frr-k8s-l2dcx" Jan 20 18:44:25 crc 
kubenswrapper[4773]: I0120 18:44:25.411474 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kg7mt\" (UniqueName: \"kubernetes.io/projected/05a83b70-ac51-4951-92c6-0f90265f2958-kube-api-access-kg7mt\") pod \"frr-k8s-webhook-server-7df86c4f6c-vv9kz\" (UID: \"05a83b70-ac51-4951-92c6-0f90265f2958\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-vv9kz" Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.411506 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/859ada1b-1a7b-4032-a974-2ec3571aa069-metrics-certs\") pod \"frr-k8s-l2dcx\" (UID: \"859ada1b-1a7b-4032-a974-2ec3571aa069\") " pod="metallb-system/frr-k8s-l2dcx" Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.411529 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/859ada1b-1a7b-4032-a974-2ec3571aa069-reloader\") pod \"frr-k8s-l2dcx\" (UID: \"859ada1b-1a7b-4032-a974-2ec3571aa069\") " pod="metallb-system/frr-k8s-l2dcx" Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.411553 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/859ada1b-1a7b-4032-a974-2ec3571aa069-frr-conf\") pod \"frr-k8s-l2dcx\" (UID: \"859ada1b-1a7b-4032-a974-2ec3571aa069\") " pod="metallb-system/frr-k8s-l2dcx" Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.411587 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/859ada1b-1a7b-4032-a974-2ec3571aa069-metrics\") pod \"frr-k8s-l2dcx\" (UID: \"859ada1b-1a7b-4032-a974-2ec3571aa069\") " pod="metallb-system/frr-k8s-l2dcx" Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.411604 4773 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/05a83b70-ac51-4951-92c6-0f90265f2958-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-vv9kz\" (UID: \"05a83b70-ac51-4951-92c6-0f90265f2958\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-vv9kz" Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.411632 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/859ada1b-1a7b-4032-a974-2ec3571aa069-frr-sockets\") pod \"frr-k8s-l2dcx\" (UID: \"859ada1b-1a7b-4032-a974-2ec3571aa069\") " pod="metallb-system/frr-k8s-l2dcx" Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.411690 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/859ada1b-1a7b-4032-a974-2ec3571aa069-frr-startup\") pod \"frr-k8s-l2dcx\" (UID: \"859ada1b-1a7b-4032-a974-2ec3571aa069\") " pod="metallb-system/frr-k8s-l2dcx" Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.422107 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-6968d8fdc4-2nprh"] Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.425382 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-6968d8fdc4-2nprh" Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.430183 4773 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.430225 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-6968d8fdc4-2nprh"] Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.512484 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/ce74c2a6-61b7-4fb4-883f-e86bf4b5c604-memberlist\") pod \"speaker-7xdr9\" (UID: \"ce74c2a6-61b7-4fb4-883f-e86bf4b5c604\") " pod="metallb-system/speaker-7xdr9" Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.512533 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/859ada1b-1a7b-4032-a974-2ec3571aa069-frr-startup\") pod \"frr-k8s-l2dcx\" (UID: \"859ada1b-1a7b-4032-a974-2ec3571aa069\") " pod="metallb-system/frr-k8s-l2dcx" Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.512558 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f52c7\" (UniqueName: \"kubernetes.io/projected/859ada1b-1a7b-4032-a974-2ec3571aa069-kube-api-access-f52c7\") pod \"frr-k8s-l2dcx\" (UID: \"859ada1b-1a7b-4032-a974-2ec3571aa069\") " pod="metallb-system/frr-k8s-l2dcx" Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.512576 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kg7mt\" (UniqueName: \"kubernetes.io/projected/05a83b70-ac51-4951-92c6-0f90265f2958-kube-api-access-kg7mt\") pod \"frr-k8s-webhook-server-7df86c4f6c-vv9kz\" (UID: \"05a83b70-ac51-4951-92c6-0f90265f2958\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-vv9kz" Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 
18:44:25.512597 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/ce74c2a6-61b7-4fb4-883f-e86bf4b5c604-metallb-excludel2\") pod \"speaker-7xdr9\" (UID: \"ce74c2a6-61b7-4fb4-883f-e86bf4b5c604\") " pod="metallb-system/speaker-7xdr9" Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.512613 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/859ada1b-1a7b-4032-a974-2ec3571aa069-metrics-certs\") pod \"frr-k8s-l2dcx\" (UID: \"859ada1b-1a7b-4032-a974-2ec3571aa069\") " pod="metallb-system/frr-k8s-l2dcx" Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.512631 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/859ada1b-1a7b-4032-a974-2ec3571aa069-reloader\") pod \"frr-k8s-l2dcx\" (UID: \"859ada1b-1a7b-4032-a974-2ec3571aa069\") " pod="metallb-system/frr-k8s-l2dcx" Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.512649 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ce74c2a6-61b7-4fb4-883f-e86bf4b5c604-metrics-certs\") pod \"speaker-7xdr9\" (UID: \"ce74c2a6-61b7-4fb4-883f-e86bf4b5c604\") " pod="metallb-system/speaker-7xdr9" Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.512666 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/859ada1b-1a7b-4032-a974-2ec3571aa069-frr-conf\") pod \"frr-k8s-l2dcx\" (UID: \"859ada1b-1a7b-4032-a974-2ec3571aa069\") " pod="metallb-system/frr-k8s-l2dcx" Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.512690 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: 
\"kubernetes.io/empty-dir/859ada1b-1a7b-4032-a974-2ec3571aa069-metrics\") pod \"frr-k8s-l2dcx\" (UID: \"859ada1b-1a7b-4032-a974-2ec3571aa069\") " pod="metallb-system/frr-k8s-l2dcx" Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.512703 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/05a83b70-ac51-4951-92c6-0f90265f2958-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-vv9kz\" (UID: \"05a83b70-ac51-4951-92c6-0f90265f2958\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-vv9kz" Jan 20 18:44:25 crc kubenswrapper[4773]: E0120 18:44:25.512822 4773 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Jan 20 18:44:25 crc kubenswrapper[4773]: E0120 18:44:25.512867 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/859ada1b-1a7b-4032-a974-2ec3571aa069-metrics-certs podName:859ada1b-1a7b-4032-a974-2ec3571aa069 nodeName:}" failed. No retries permitted until 2026-01-20 18:44:26.012853121 +0000 UTC m=+858.934666145 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/859ada1b-1a7b-4032-a974-2ec3571aa069-metrics-certs") pod "frr-k8s-l2dcx" (UID: "859ada1b-1a7b-4032-a974-2ec3571aa069") : secret "frr-k8s-certs-secret" not found Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.513441 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/859ada1b-1a7b-4032-a974-2ec3571aa069-metrics\") pod \"frr-k8s-l2dcx\" (UID: \"859ada1b-1a7b-4032-a974-2ec3571aa069\") " pod="metallb-system/frr-k8s-l2dcx" Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.513639 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/859ada1b-1a7b-4032-a974-2ec3571aa069-frr-startup\") pod \"frr-k8s-l2dcx\" (UID: \"859ada1b-1a7b-4032-a974-2ec3571aa069\") " pod="metallb-system/frr-k8s-l2dcx" Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.513659 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/859ada1b-1a7b-4032-a974-2ec3571aa069-reloader\") pod \"frr-k8s-l2dcx\" (UID: \"859ada1b-1a7b-4032-a974-2ec3571aa069\") " pod="metallb-system/frr-k8s-l2dcx" Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.513704 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rprmd\" (UniqueName: \"kubernetes.io/projected/ce74c2a6-61b7-4fb4-883f-e86bf4b5c604-kube-api-access-rprmd\") pod \"speaker-7xdr9\" (UID: \"ce74c2a6-61b7-4fb4-883f-e86bf4b5c604\") " pod="metallb-system/speaker-7xdr9" Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.513761 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/859ada1b-1a7b-4032-a974-2ec3571aa069-frr-sockets\") pod \"frr-k8s-l2dcx\" (UID: 
\"859ada1b-1a7b-4032-a974-2ec3571aa069\") " pod="metallb-system/frr-k8s-l2dcx" Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.513948 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/859ada1b-1a7b-4032-a974-2ec3571aa069-frr-sockets\") pod \"frr-k8s-l2dcx\" (UID: \"859ada1b-1a7b-4032-a974-2ec3571aa069\") " pod="metallb-system/frr-k8s-l2dcx" Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.514060 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/859ada1b-1a7b-4032-a974-2ec3571aa069-frr-conf\") pod \"frr-k8s-l2dcx\" (UID: \"859ada1b-1a7b-4032-a974-2ec3571aa069\") " pod="metallb-system/frr-k8s-l2dcx" Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.520328 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/05a83b70-ac51-4951-92c6-0f90265f2958-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-vv9kz\" (UID: \"05a83b70-ac51-4951-92c6-0f90265f2958\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-vv9kz" Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.528874 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kg7mt\" (UniqueName: \"kubernetes.io/projected/05a83b70-ac51-4951-92c6-0f90265f2958-kube-api-access-kg7mt\") pod \"frr-k8s-webhook-server-7df86c4f6c-vv9kz\" (UID: \"05a83b70-ac51-4951-92c6-0f90265f2958\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-vv9kz" Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.534758 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f52c7\" (UniqueName: \"kubernetes.io/projected/859ada1b-1a7b-4032-a974-2ec3571aa069-kube-api-access-f52c7\") pod \"frr-k8s-l2dcx\" (UID: \"859ada1b-1a7b-4032-a974-2ec3571aa069\") " pod="metallb-system/frr-k8s-l2dcx" Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 
18:44:25.614399 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/ce74c2a6-61b7-4fb4-883f-e86bf4b5c604-memberlist\") pod \"speaker-7xdr9\" (UID: \"ce74c2a6-61b7-4fb4-883f-e86bf4b5c604\") " pod="metallb-system/speaker-7xdr9" Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.614464 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/ce74c2a6-61b7-4fb4-883f-e86bf4b5c604-metallb-excludel2\") pod \"speaker-7xdr9\" (UID: \"ce74c2a6-61b7-4fb4-883f-e86bf4b5c604\") " pod="metallb-system/speaker-7xdr9" Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.614519 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ce74c2a6-61b7-4fb4-883f-e86bf4b5c604-metrics-certs\") pod \"speaker-7xdr9\" (UID: \"ce74c2a6-61b7-4fb4-883f-e86bf4b5c604\") " pod="metallb-system/speaker-7xdr9" Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.614554 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e284563a-e5f1-4c86-8100-863f86a7f7dc-metrics-certs\") pod \"controller-6968d8fdc4-2nprh\" (UID: \"e284563a-e5f1-4c86-8100-863f86a7f7dc\") " pod="metallb-system/controller-6968d8fdc4-2nprh" Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.614595 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e284563a-e5f1-4c86-8100-863f86a7f7dc-cert\") pod \"controller-6968d8fdc4-2nprh\" (UID: \"e284563a-e5f1-4c86-8100-863f86a7f7dc\") " pod="metallb-system/controller-6968d8fdc4-2nprh" Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.614632 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rprmd\" 
(UniqueName: \"kubernetes.io/projected/ce74c2a6-61b7-4fb4-883f-e86bf4b5c604-kube-api-access-rprmd\") pod \"speaker-7xdr9\" (UID: \"ce74c2a6-61b7-4fb4-883f-e86bf4b5c604\") " pod="metallb-system/speaker-7xdr9" Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.614668 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wdbt\" (UniqueName: \"kubernetes.io/projected/e284563a-e5f1-4c86-8100-863f86a7f7dc-kube-api-access-9wdbt\") pod \"controller-6968d8fdc4-2nprh\" (UID: \"e284563a-e5f1-4c86-8100-863f86a7f7dc\") " pod="metallb-system/controller-6968d8fdc4-2nprh" Jan 20 18:44:25 crc kubenswrapper[4773]: E0120 18:44:25.614688 4773 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Jan 20 18:44:25 crc kubenswrapper[4773]: E0120 18:44:25.614767 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ce74c2a6-61b7-4fb4-883f-e86bf4b5c604-metrics-certs podName:ce74c2a6-61b7-4fb4-883f-e86bf4b5c604 nodeName:}" failed. No retries permitted until 2026-01-20 18:44:26.114747308 +0000 UTC m=+859.036560332 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ce74c2a6-61b7-4fb4-883f-e86bf4b5c604-metrics-certs") pod "speaker-7xdr9" (UID: "ce74c2a6-61b7-4fb4-883f-e86bf4b5c604") : secret "speaker-certs-secret" not found Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.615206 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/ce74c2a6-61b7-4fb4-883f-e86bf4b5c604-metallb-excludel2\") pod \"speaker-7xdr9\" (UID: \"ce74c2a6-61b7-4fb4-883f-e86bf4b5c604\") " pod="metallb-system/speaker-7xdr9" Jan 20 18:44:25 crc kubenswrapper[4773]: E0120 18:44:25.615833 4773 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Jan 20 18:44:25 crc kubenswrapper[4773]: E0120 18:44:25.616014 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ce74c2a6-61b7-4fb4-883f-e86bf4b5c604-memberlist podName:ce74c2a6-61b7-4fb4-883f-e86bf4b5c604 nodeName:}" failed. No retries permitted until 2026-01-20 18:44:26.115992198 +0000 UTC m=+859.037805222 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/ce74c2a6-61b7-4fb4-883f-e86bf4b5c604-memberlist") pod "speaker-7xdr9" (UID: "ce74c2a6-61b7-4fb4-883f-e86bf4b5c604") : secret "metallb-memberlist" not found Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.622092 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-vv9kz" Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.637651 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rprmd\" (UniqueName: \"kubernetes.io/projected/ce74c2a6-61b7-4fb4-883f-e86bf4b5c604-kube-api-access-rprmd\") pod \"speaker-7xdr9\" (UID: \"ce74c2a6-61b7-4fb4-883f-e86bf4b5c604\") " pod="metallb-system/speaker-7xdr9" Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.715814 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e284563a-e5f1-4c86-8100-863f86a7f7dc-metrics-certs\") pod \"controller-6968d8fdc4-2nprh\" (UID: \"e284563a-e5f1-4c86-8100-863f86a7f7dc\") " pod="metallb-system/controller-6968d8fdc4-2nprh" Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.716111 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e284563a-e5f1-4c86-8100-863f86a7f7dc-cert\") pod \"controller-6968d8fdc4-2nprh\" (UID: \"e284563a-e5f1-4c86-8100-863f86a7f7dc\") " pod="metallb-system/controller-6968d8fdc4-2nprh" Jan 20 18:44:25 crc kubenswrapper[4773]: E0120 18:44:25.716166 4773 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Jan 20 18:44:25 crc kubenswrapper[4773]: E0120 18:44:25.716254 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e284563a-e5f1-4c86-8100-863f86a7f7dc-metrics-certs podName:e284563a-e5f1-4c86-8100-863f86a7f7dc nodeName:}" failed. No retries permitted until 2026-01-20 18:44:26.216229926 +0000 UTC m=+859.138043020 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e284563a-e5f1-4c86-8100-863f86a7f7dc-metrics-certs") pod "controller-6968d8fdc4-2nprh" (UID: "e284563a-e5f1-4c86-8100-863f86a7f7dc") : secret "controller-certs-secret" not found Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.716172 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9wdbt\" (UniqueName: \"kubernetes.io/projected/e284563a-e5f1-4c86-8100-863f86a7f7dc-kube-api-access-9wdbt\") pod \"controller-6968d8fdc4-2nprh\" (UID: \"e284563a-e5f1-4c86-8100-863f86a7f7dc\") " pod="metallb-system/controller-6968d8fdc4-2nprh" Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.718459 4773 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.730610 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e284563a-e5f1-4c86-8100-863f86a7f7dc-cert\") pod \"controller-6968d8fdc4-2nprh\" (UID: \"e284563a-e5f1-4c86-8100-863f86a7f7dc\") " pod="metallb-system/controller-6968d8fdc4-2nprh" Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.733101 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wdbt\" (UniqueName: \"kubernetes.io/projected/e284563a-e5f1-4c86-8100-863f86a7f7dc-kube-api-access-9wdbt\") pod \"controller-6968d8fdc4-2nprh\" (UID: \"e284563a-e5f1-4c86-8100-863f86a7f7dc\") " pod="metallb-system/controller-6968d8fdc4-2nprh" Jan 20 18:44:25 crc kubenswrapper[4773]: I0120 18:44:25.849581 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7df86c4f6c-vv9kz"] Jan 20 18:44:26 crc kubenswrapper[4773]: I0120 18:44:26.018696 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/859ada1b-1a7b-4032-a974-2ec3571aa069-metrics-certs\") pod \"frr-k8s-l2dcx\" (UID: \"859ada1b-1a7b-4032-a974-2ec3571aa069\") " pod="metallb-system/frr-k8s-l2dcx" Jan 20 18:44:26 crc kubenswrapper[4773]: I0120 18:44:26.022580 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/859ada1b-1a7b-4032-a974-2ec3571aa069-metrics-certs\") pod \"frr-k8s-l2dcx\" (UID: \"859ada1b-1a7b-4032-a974-2ec3571aa069\") " pod="metallb-system/frr-k8s-l2dcx" Jan 20 18:44:26 crc kubenswrapper[4773]: I0120 18:44:26.119844 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ce74c2a6-61b7-4fb4-883f-e86bf4b5c604-metrics-certs\") pod \"speaker-7xdr9\" (UID: \"ce74c2a6-61b7-4fb4-883f-e86bf4b5c604\") " pod="metallb-system/speaker-7xdr9" Jan 20 18:44:26 crc kubenswrapper[4773]: I0120 18:44:26.119987 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/ce74c2a6-61b7-4fb4-883f-e86bf4b5c604-memberlist\") pod \"speaker-7xdr9\" (UID: \"ce74c2a6-61b7-4fb4-883f-e86bf4b5c604\") " pod="metallb-system/speaker-7xdr9" Jan 20 18:44:26 crc kubenswrapper[4773]: E0120 18:44:26.120139 4773 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Jan 20 18:44:26 crc kubenswrapper[4773]: E0120 18:44:26.120214 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ce74c2a6-61b7-4fb4-883f-e86bf4b5c604-memberlist podName:ce74c2a6-61b7-4fb4-883f-e86bf4b5c604 nodeName:}" failed. No retries permitted until 2026-01-20 18:44:27.12019657 +0000 UTC m=+860.042009594 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/ce74c2a6-61b7-4fb4-883f-e86bf4b5c604-memberlist") pod "speaker-7xdr9" (UID: "ce74c2a6-61b7-4fb4-883f-e86bf4b5c604") : secret "metallb-memberlist" not found Jan 20 18:44:26 crc kubenswrapper[4773]: I0120 18:44:26.124663 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ce74c2a6-61b7-4fb4-883f-e86bf4b5c604-metrics-certs\") pod \"speaker-7xdr9\" (UID: \"ce74c2a6-61b7-4fb4-883f-e86bf4b5c604\") " pod="metallb-system/speaker-7xdr9" Jan 20 18:44:26 crc kubenswrapper[4773]: I0120 18:44:26.220942 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e284563a-e5f1-4c86-8100-863f86a7f7dc-metrics-certs\") pod \"controller-6968d8fdc4-2nprh\" (UID: \"e284563a-e5f1-4c86-8100-863f86a7f7dc\") " pod="metallb-system/controller-6968d8fdc4-2nprh" Jan 20 18:44:26 crc kubenswrapper[4773]: I0120 18:44:26.224498 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e284563a-e5f1-4c86-8100-863f86a7f7dc-metrics-certs\") pod \"controller-6968d8fdc4-2nprh\" (UID: \"e284563a-e5f1-4c86-8100-863f86a7f7dc\") " pod="metallb-system/controller-6968d8fdc4-2nprh" Jan 20 18:44:26 crc kubenswrapper[4773]: I0120 18:44:26.234106 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-l2dcx" Jan 20 18:44:26 crc kubenswrapper[4773]: I0120 18:44:26.290159 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-vv9kz" event={"ID":"05a83b70-ac51-4951-92c6-0f90265f2958","Type":"ContainerStarted","Data":"0e5b53f31f8de0b2dcb77f43d60fa89eba95be443948366d9e2b07bcb018e307"} Jan 20 18:44:26 crc kubenswrapper[4773]: I0120 18:44:26.342812 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-6968d8fdc4-2nprh" Jan 20 18:44:26 crc kubenswrapper[4773]: I0120 18:44:26.524796 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-6968d8fdc4-2nprh"] Jan 20 18:44:27 crc kubenswrapper[4773]: I0120 18:44:27.131132 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/ce74c2a6-61b7-4fb4-883f-e86bf4b5c604-memberlist\") pod \"speaker-7xdr9\" (UID: \"ce74c2a6-61b7-4fb4-883f-e86bf4b5c604\") " pod="metallb-system/speaker-7xdr9" Jan 20 18:44:27 crc kubenswrapper[4773]: I0120 18:44:27.137528 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/ce74c2a6-61b7-4fb4-883f-e86bf4b5c604-memberlist\") pod \"speaker-7xdr9\" (UID: \"ce74c2a6-61b7-4fb4-883f-e86bf4b5c604\") " pod="metallb-system/speaker-7xdr9" Jan 20 18:44:27 crc kubenswrapper[4773]: I0120 18:44:27.212528 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-7xdr9" Jan 20 18:44:27 crc kubenswrapper[4773]: W0120 18:44:27.231088 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce74c2a6_61b7_4fb4_883f_e86bf4b5c604.slice/crio-91e3d6b2d1fc92df071051589f6eaa9bf273cf1c579b4b10e6809db05f044932 WatchSource:0}: Error finding container 91e3d6b2d1fc92df071051589f6eaa9bf273cf1c579b4b10e6809db05f044932: Status 404 returned error can't find the container with id 91e3d6b2d1fc92df071051589f6eaa9bf273cf1c579b4b10e6809db05f044932 Jan 20 18:44:27 crc kubenswrapper[4773]: I0120 18:44:27.298654 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6968d8fdc4-2nprh" event={"ID":"e284563a-e5f1-4c86-8100-863f86a7f7dc","Type":"ContainerStarted","Data":"ddbbe016a23000596d2f746ae14a3dcee4dd29fa065d50df40ee7dfece2f95b5"} Jan 20 18:44:27 crc kubenswrapper[4773]: I0120 18:44:27.298701 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6968d8fdc4-2nprh" event={"ID":"e284563a-e5f1-4c86-8100-863f86a7f7dc","Type":"ContainerStarted","Data":"ebbbd6518ebee81cf3355bc6794ea6a10bfa5776db0858b9d7313db90b4ba0fd"} Jan 20 18:44:27 crc kubenswrapper[4773]: I0120 18:44:27.298717 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6968d8fdc4-2nprh" event={"ID":"e284563a-e5f1-4c86-8100-863f86a7f7dc","Type":"ContainerStarted","Data":"ee8f2a077f8dcbd844971c220ccb97d56367860ed80d8a7f150ee271db5b7ce1"} Jan 20 18:44:27 crc kubenswrapper[4773]: I0120 18:44:27.298754 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-6968d8fdc4-2nprh" Jan 20 18:44:27 crc kubenswrapper[4773]: I0120 18:44:27.301403 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-7xdr9" 
event={"ID":"ce74c2a6-61b7-4fb4-883f-e86bf4b5c604","Type":"ContainerStarted","Data":"91e3d6b2d1fc92df071051589f6eaa9bf273cf1c579b4b10e6809db05f044932"} Jan 20 18:44:27 crc kubenswrapper[4773]: I0120 18:44:27.303877 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-l2dcx" event={"ID":"859ada1b-1a7b-4032-a974-2ec3571aa069","Type":"ContainerStarted","Data":"3341d93471f0c1295966cff7dd30eb227592a98bdc54cfa3029f751141d436b5"} Jan 20 18:44:27 crc kubenswrapper[4773]: I0120 18:44:27.319142 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-6968d8fdc4-2nprh" podStartSLOduration=2.319121045 podStartE2EDuration="2.319121045s" podCreationTimestamp="2026-01-20 18:44:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:44:27.315505519 +0000 UTC m=+860.237318563" watchObservedRunningTime="2026-01-20 18:44:27.319121045 +0000 UTC m=+860.240934079" Jan 20 18:44:28 crc kubenswrapper[4773]: I0120 18:44:28.317653 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-7xdr9" event={"ID":"ce74c2a6-61b7-4fb4-883f-e86bf4b5c604","Type":"ContainerStarted","Data":"ce00d9899cdea324d159fab7c0f43030f425057fdb6604d480c2d38284d8ca51"} Jan 20 18:44:28 crc kubenswrapper[4773]: I0120 18:44:28.318042 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-7xdr9" event={"ID":"ce74c2a6-61b7-4fb4-883f-e86bf4b5c604","Type":"ContainerStarted","Data":"af9d107c6c96c1737482efa3fee7c1f55427c31284f49869482d20bae615152e"} Jan 20 18:44:29 crc kubenswrapper[4773]: I0120 18:44:29.324188 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-7xdr9" Jan 20 18:44:36 crc kubenswrapper[4773]: I0120 18:44:36.346328 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-6968d8fdc4-2nprh" Jan 20 
18:44:36 crc kubenswrapper[4773]: I0120 18:44:36.367196 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-7xdr9" podStartSLOduration=11.367167011 podStartE2EDuration="11.367167011s" podCreationTimestamp="2026-01-20 18:44:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:44:28.333229885 +0000 UTC m=+861.255042919" watchObservedRunningTime="2026-01-20 18:44:36.367167011 +0000 UTC m=+869.288980055" Jan 20 18:44:36 crc kubenswrapper[4773]: I0120 18:44:36.370768 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-vv9kz" event={"ID":"05a83b70-ac51-4951-92c6-0f90265f2958","Type":"ContainerStarted","Data":"f23c5b5346cff941af97bdb2e657e059edf290c29fb99a43ba3a4ef0127ad738"} Jan 20 18:44:36 crc kubenswrapper[4773]: I0120 18:44:36.370816 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-vv9kz" Jan 20 18:44:36 crc kubenswrapper[4773]: I0120 18:44:36.372607 4773 generic.go:334] "Generic (PLEG): container finished" podID="859ada1b-1a7b-4032-a974-2ec3571aa069" containerID="24be42e4f07a05c76c2370acb5fc7583453c39d07e696c13462480c42ac89128" exitCode=0 Jan 20 18:44:36 crc kubenswrapper[4773]: I0120 18:44:36.372839 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-l2dcx" event={"ID":"859ada1b-1a7b-4032-a974-2ec3571aa069","Type":"ContainerDied","Data":"24be42e4f07a05c76c2370acb5fc7583453c39d07e696c13462480c42ac89128"} Jan 20 18:44:36 crc kubenswrapper[4773]: I0120 18:44:36.392186 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-vv9kz" podStartSLOduration=1.8882776940000001 podStartE2EDuration="11.392165343s" podCreationTimestamp="2026-01-20 18:44:25 +0000 UTC" firstStartedPulling="2026-01-20 
18:44:25.85926128 +0000 UTC m=+858.781074304" lastFinishedPulling="2026-01-20 18:44:35.363148929 +0000 UTC m=+868.284961953" observedRunningTime="2026-01-20 18:44:36.389994472 +0000 UTC m=+869.311807496" watchObservedRunningTime="2026-01-20 18:44:36.392165343 +0000 UTC m=+869.313978367" Jan 20 18:44:37 crc kubenswrapper[4773]: I0120 18:44:37.216672 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-7xdr9" Jan 20 18:44:37 crc kubenswrapper[4773]: I0120 18:44:37.379850 4773 generic.go:334] "Generic (PLEG): container finished" podID="859ada1b-1a7b-4032-a974-2ec3571aa069" containerID="10a5291e038952eedd7f5950b9c7d6d2287f60283c0b83278e7b1ab35d31df53" exitCode=0 Jan 20 18:44:37 crc kubenswrapper[4773]: I0120 18:44:37.379895 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-l2dcx" event={"ID":"859ada1b-1a7b-4032-a974-2ec3571aa069","Type":"ContainerDied","Data":"10a5291e038952eedd7f5950b9c7d6d2287f60283c0b83278e7b1ab35d31df53"} Jan 20 18:44:38 crc kubenswrapper[4773]: I0120 18:44:38.387302 4773 generic.go:334] "Generic (PLEG): container finished" podID="859ada1b-1a7b-4032-a974-2ec3571aa069" containerID="fe4e1fe6ec01f7f1441ba1d107351c60710ba3dc737abceb56e8889409723eda" exitCode=0 Jan 20 18:44:38 crc kubenswrapper[4773]: I0120 18:44:38.387379 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-l2dcx" event={"ID":"859ada1b-1a7b-4032-a974-2ec3571aa069","Type":"ContainerDied","Data":"fe4e1fe6ec01f7f1441ba1d107351c60710ba3dc737abceb56e8889409723eda"} Jan 20 18:44:39 crc kubenswrapper[4773]: I0120 18:44:39.397495 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-l2dcx" event={"ID":"859ada1b-1a7b-4032-a974-2ec3571aa069","Type":"ContainerStarted","Data":"3d0f0ec31622fd84028344d2f5896e32bbf84545cc61e5fdb0a52e6ec1da01f7"} Jan 20 18:44:39 crc kubenswrapper[4773]: I0120 18:44:39.845504 4773 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/openstack-operator-index-nbdm9"] Jan 20 18:44:39 crc kubenswrapper[4773]: I0120 18:44:39.846368 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-nbdm9" Jan 20 18:44:39 crc kubenswrapper[4773]: I0120 18:44:39.848547 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-q6ctx" Jan 20 18:44:39 crc kubenswrapper[4773]: I0120 18:44:39.849713 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Jan 20 18:44:39 crc kubenswrapper[4773]: I0120 18:44:39.849943 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Jan 20 18:44:39 crc kubenswrapper[4773]: I0120 18:44:39.863759 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-nbdm9"] Jan 20 18:44:40 crc kubenswrapper[4773]: I0120 18:44:40.037044 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n88pc\" (UniqueName: \"kubernetes.io/projected/b31a7f2a-24a2-429d-b654-9d87755f5812-kube-api-access-n88pc\") pod \"openstack-operator-index-nbdm9\" (UID: \"b31a7f2a-24a2-429d-b654-9d87755f5812\") " pod="openstack-operators/openstack-operator-index-nbdm9" Jan 20 18:44:40 crc kubenswrapper[4773]: I0120 18:44:40.138683 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n88pc\" (UniqueName: \"kubernetes.io/projected/b31a7f2a-24a2-429d-b654-9d87755f5812-kube-api-access-n88pc\") pod \"openstack-operator-index-nbdm9\" (UID: \"b31a7f2a-24a2-429d-b654-9d87755f5812\") " pod="openstack-operators/openstack-operator-index-nbdm9" Jan 20 18:44:40 crc kubenswrapper[4773]: I0120 18:44:40.157348 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n88pc\" 
(UniqueName: \"kubernetes.io/projected/b31a7f2a-24a2-429d-b654-9d87755f5812-kube-api-access-n88pc\") pod \"openstack-operator-index-nbdm9\" (UID: \"b31a7f2a-24a2-429d-b654-9d87755f5812\") " pod="openstack-operators/openstack-operator-index-nbdm9" Jan 20 18:44:40 crc kubenswrapper[4773]: I0120 18:44:40.164906 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-nbdm9" Jan 20 18:44:40 crc kubenswrapper[4773]: I0120 18:44:40.638261 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-nbdm9"] Jan 20 18:44:40 crc kubenswrapper[4773]: W0120 18:44:40.646278 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb31a7f2a_24a2_429d_b654_9d87755f5812.slice/crio-18c5ee89fccc2fd7dc8e71dabdfb8767df873b1952c66954c8bcf1b9e9b0ac27 WatchSource:0}: Error finding container 18c5ee89fccc2fd7dc8e71dabdfb8767df873b1952c66954c8bcf1b9e9b0ac27: Status 404 returned error can't find the container with id 18c5ee89fccc2fd7dc8e71dabdfb8767df873b1952c66954c8bcf1b9e9b0ac27 Jan 20 18:44:41 crc kubenswrapper[4773]: I0120 18:44:41.424664 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-l2dcx" event={"ID":"859ada1b-1a7b-4032-a974-2ec3571aa069","Type":"ContainerStarted","Data":"9ca0d3a8e2cbd40a33d56516cee7c0779c2441cfa13711a674197f4c23b87316"} Jan 20 18:44:41 crc kubenswrapper[4773]: I0120 18:44:41.425027 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-l2dcx" event={"ID":"859ada1b-1a7b-4032-a974-2ec3571aa069","Type":"ContainerStarted","Data":"ba81d11f156fbd8d5277da65419d02dbe9369356c1857746cca5f15949b7c342"} Jan 20 18:44:41 crc kubenswrapper[4773]: I0120 18:44:41.425044 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-l2dcx" 
event={"ID":"859ada1b-1a7b-4032-a974-2ec3571aa069","Type":"ContainerStarted","Data":"c89abc383aa3f4fcf81697886b45bfc9a36fd36a3d93cf2314a2a27367653cd4"} Jan 20 18:44:41 crc kubenswrapper[4773]: I0120 18:44:41.425054 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-l2dcx" event={"ID":"859ada1b-1a7b-4032-a974-2ec3571aa069","Type":"ContainerStarted","Data":"202e627e34c3563d7cda61eb9aaab331de1087b86e0de9d10c0eb56feb80b0e6"} Jan 20 18:44:41 crc kubenswrapper[4773]: I0120 18:44:41.426603 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-nbdm9" event={"ID":"b31a7f2a-24a2-429d-b654-9d87755f5812","Type":"ContainerStarted","Data":"18c5ee89fccc2fd7dc8e71dabdfb8767df873b1952c66954c8bcf1b9e9b0ac27"} Jan 20 18:44:42 crc kubenswrapper[4773]: I0120 18:44:42.436263 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-l2dcx" event={"ID":"859ada1b-1a7b-4032-a974-2ec3571aa069","Type":"ContainerStarted","Data":"5080ec2ca4f81b6d3591ef2e22910c427a22346c4648de2af8f453c4e2553dd4"} Jan 20 18:44:42 crc kubenswrapper[4773]: I0120 18:44:42.437228 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-l2dcx" Jan 20 18:44:42 crc kubenswrapper[4773]: I0120 18:44:42.458699 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-l2dcx" podStartSLOduration=8.530480219 podStartE2EDuration="17.458683091s" podCreationTimestamp="2026-01-20 18:44:25 +0000 UTC" firstStartedPulling="2026-01-20 18:44:26.406128624 +0000 UTC m=+859.327941648" lastFinishedPulling="2026-01-20 18:44:35.334331496 +0000 UTC m=+868.256144520" observedRunningTime="2026-01-20 18:44:42.458171799 +0000 UTC m=+875.379984843" watchObservedRunningTime="2026-01-20 18:44:42.458683091 +0000 UTC m=+875.380496115" Jan 20 18:44:42 crc kubenswrapper[4773]: I0120 18:44:42.618800 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack-operators/openstack-operator-index-nbdm9"] Jan 20 18:44:43 crc kubenswrapper[4773]: I0120 18:44:43.226145 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-zjdsq"] Jan 20 18:44:43 crc kubenswrapper[4773]: I0120 18:44:43.227342 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-zjdsq" Jan 20 18:44:43 crc kubenswrapper[4773]: I0120 18:44:43.240351 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-zjdsq"] Jan 20 18:44:43 crc kubenswrapper[4773]: I0120 18:44:43.377012 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqndz\" (UniqueName: \"kubernetes.io/projected/ba4d7dc5-ceca-4e4b-81af-9368937b7462-kube-api-access-vqndz\") pod \"openstack-operator-index-zjdsq\" (UID: \"ba4d7dc5-ceca-4e4b-81af-9368937b7462\") " pod="openstack-operators/openstack-operator-index-zjdsq" Jan 20 18:44:43 crc kubenswrapper[4773]: I0120 18:44:43.442889 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-nbdm9" event={"ID":"b31a7f2a-24a2-429d-b654-9d87755f5812","Type":"ContainerStarted","Data":"af25e1bdfd2a3a3e7e25147c0f46a25257600d4bd84c4af8c85fd53fda9dcb4e"} Jan 20 18:44:43 crc kubenswrapper[4773]: I0120 18:44:43.443227 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-nbdm9" podUID="b31a7f2a-24a2-429d-b654-9d87755f5812" containerName="registry-server" containerID="cri-o://af25e1bdfd2a3a3e7e25147c0f46a25257600d4bd84c4af8c85fd53fda9dcb4e" gracePeriod=2 Jan 20 18:44:43 crc kubenswrapper[4773]: I0120 18:44:43.462459 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-nbdm9" podStartSLOduration=2.492377816 
podStartE2EDuration="4.462426925s" podCreationTimestamp="2026-01-20 18:44:39 +0000 UTC" firstStartedPulling="2026-01-20 18:44:40.648345361 +0000 UTC m=+873.570158385" lastFinishedPulling="2026-01-20 18:44:42.61839447 +0000 UTC m=+875.540207494" observedRunningTime="2026-01-20 18:44:43.459442925 +0000 UTC m=+876.381256029" watchObservedRunningTime="2026-01-20 18:44:43.462426925 +0000 UTC m=+876.384240039" Jan 20 18:44:43 crc kubenswrapper[4773]: I0120 18:44:43.478979 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqndz\" (UniqueName: \"kubernetes.io/projected/ba4d7dc5-ceca-4e4b-81af-9368937b7462-kube-api-access-vqndz\") pod \"openstack-operator-index-zjdsq\" (UID: \"ba4d7dc5-ceca-4e4b-81af-9368937b7462\") " pod="openstack-operators/openstack-operator-index-zjdsq" Jan 20 18:44:43 crc kubenswrapper[4773]: I0120 18:44:43.497925 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqndz\" (UniqueName: \"kubernetes.io/projected/ba4d7dc5-ceca-4e4b-81af-9368937b7462-kube-api-access-vqndz\") pod \"openstack-operator-index-zjdsq\" (UID: \"ba4d7dc5-ceca-4e4b-81af-9368937b7462\") " pod="openstack-operators/openstack-operator-index-zjdsq" Jan 20 18:44:43 crc kubenswrapper[4773]: I0120 18:44:43.549690 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-zjdsq" Jan 20 18:44:43 crc kubenswrapper[4773]: I0120 18:44:43.929658 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-zjdsq"] Jan 20 18:44:43 crc kubenswrapper[4773]: W0120 18:44:43.940819 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba4d7dc5_ceca_4e4b_81af_9368937b7462.slice/crio-b2d273ab06fee20d1fbccf64948b032ef44dbe3b631a1f381f32cf538021e95e WatchSource:0}: Error finding container b2d273ab06fee20d1fbccf64948b032ef44dbe3b631a1f381f32cf538021e95e: Status 404 returned error can't find the container with id b2d273ab06fee20d1fbccf64948b032ef44dbe3b631a1f381f32cf538021e95e Jan 20 18:44:44 crc kubenswrapper[4773]: I0120 18:44:44.286783 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-nbdm9" Jan 20 18:44:44 crc kubenswrapper[4773]: I0120 18:44:44.392139 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n88pc\" (UniqueName: \"kubernetes.io/projected/b31a7f2a-24a2-429d-b654-9d87755f5812-kube-api-access-n88pc\") pod \"b31a7f2a-24a2-429d-b654-9d87755f5812\" (UID: \"b31a7f2a-24a2-429d-b654-9d87755f5812\") " Jan 20 18:44:44 crc kubenswrapper[4773]: I0120 18:44:44.397996 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b31a7f2a-24a2-429d-b654-9d87755f5812-kube-api-access-n88pc" (OuterVolumeSpecName: "kube-api-access-n88pc") pod "b31a7f2a-24a2-429d-b654-9d87755f5812" (UID: "b31a7f2a-24a2-429d-b654-9d87755f5812"). InnerVolumeSpecName "kube-api-access-n88pc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:44:44 crc kubenswrapper[4773]: I0120 18:44:44.449032 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-zjdsq" event={"ID":"ba4d7dc5-ceca-4e4b-81af-9368937b7462","Type":"ContainerStarted","Data":"31cb2d99855fb582c69bb88857d6f3426e1654fa86bcd739551bbc96d160eeb5"} Jan 20 18:44:44 crc kubenswrapper[4773]: I0120 18:44:44.449073 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-zjdsq" event={"ID":"ba4d7dc5-ceca-4e4b-81af-9368937b7462","Type":"ContainerStarted","Data":"b2d273ab06fee20d1fbccf64948b032ef44dbe3b631a1f381f32cf538021e95e"} Jan 20 18:44:44 crc kubenswrapper[4773]: I0120 18:44:44.450570 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-nbdm9" Jan 20 18:44:44 crc kubenswrapper[4773]: I0120 18:44:44.450550 4773 generic.go:334] "Generic (PLEG): container finished" podID="b31a7f2a-24a2-429d-b654-9d87755f5812" containerID="af25e1bdfd2a3a3e7e25147c0f46a25257600d4bd84c4af8c85fd53fda9dcb4e" exitCode=0 Jan 20 18:44:44 crc kubenswrapper[4773]: I0120 18:44:44.450626 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-nbdm9" event={"ID":"b31a7f2a-24a2-429d-b654-9d87755f5812","Type":"ContainerDied","Data":"af25e1bdfd2a3a3e7e25147c0f46a25257600d4bd84c4af8c85fd53fda9dcb4e"} Jan 20 18:44:44 crc kubenswrapper[4773]: I0120 18:44:44.450652 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-nbdm9" event={"ID":"b31a7f2a-24a2-429d-b654-9d87755f5812","Type":"ContainerDied","Data":"18c5ee89fccc2fd7dc8e71dabdfb8767df873b1952c66954c8bcf1b9e9b0ac27"} Jan 20 18:44:44 crc kubenswrapper[4773]: I0120 18:44:44.450669 4773 scope.go:117] "RemoveContainer" containerID="af25e1bdfd2a3a3e7e25147c0f46a25257600d4bd84c4af8c85fd53fda9dcb4e" Jan 
20 18:44:44 crc kubenswrapper[4773]: I0120 18:44:44.469609 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-zjdsq" podStartSLOduration=1.421358425 podStartE2EDuration="1.46959172s" podCreationTimestamp="2026-01-20 18:44:43 +0000 UTC" firstStartedPulling="2026-01-20 18:44:43.944707997 +0000 UTC m=+876.866521041" lastFinishedPulling="2026-01-20 18:44:43.992941292 +0000 UTC m=+876.914754336" observedRunningTime="2026-01-20 18:44:44.46411443 +0000 UTC m=+877.385927464" watchObservedRunningTime="2026-01-20 18:44:44.46959172 +0000 UTC m=+877.391404744" Jan 20 18:44:44 crc kubenswrapper[4773]: I0120 18:44:44.475590 4773 scope.go:117] "RemoveContainer" containerID="af25e1bdfd2a3a3e7e25147c0f46a25257600d4bd84c4af8c85fd53fda9dcb4e" Jan 20 18:44:44 crc kubenswrapper[4773]: E0120 18:44:44.476054 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af25e1bdfd2a3a3e7e25147c0f46a25257600d4bd84c4af8c85fd53fda9dcb4e\": container with ID starting with af25e1bdfd2a3a3e7e25147c0f46a25257600d4bd84c4af8c85fd53fda9dcb4e not found: ID does not exist" containerID="af25e1bdfd2a3a3e7e25147c0f46a25257600d4bd84c4af8c85fd53fda9dcb4e" Jan 20 18:44:44 crc kubenswrapper[4773]: I0120 18:44:44.476202 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af25e1bdfd2a3a3e7e25147c0f46a25257600d4bd84c4af8c85fd53fda9dcb4e"} err="failed to get container status \"af25e1bdfd2a3a3e7e25147c0f46a25257600d4bd84c4af8c85fd53fda9dcb4e\": rpc error: code = NotFound desc = could not find container \"af25e1bdfd2a3a3e7e25147c0f46a25257600d4bd84c4af8c85fd53fda9dcb4e\": container with ID starting with af25e1bdfd2a3a3e7e25147c0f46a25257600d4bd84c4af8c85fd53fda9dcb4e not found: ID does not exist" Jan 20 18:44:44 crc kubenswrapper[4773]: I0120 18:44:44.480815 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack-operators/openstack-operator-index-nbdm9"] Jan 20 18:44:44 crc kubenswrapper[4773]: I0120 18:44:44.484722 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-nbdm9"] Jan 20 18:44:44 crc kubenswrapper[4773]: I0120 18:44:44.494012 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n88pc\" (UniqueName: \"kubernetes.io/projected/b31a7f2a-24a2-429d-b654-9d87755f5812-kube-api-access-n88pc\") on node \"crc\" DevicePath \"\"" Jan 20 18:44:45 crc kubenswrapper[4773]: I0120 18:44:45.458030 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b31a7f2a-24a2-429d-b654-9d87755f5812" path="/var/lib/kubelet/pods/b31a7f2a-24a2-429d-b654-9d87755f5812/volumes" Jan 20 18:44:45 crc kubenswrapper[4773]: I0120 18:44:45.631441 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-vv9kz" Jan 20 18:44:46 crc kubenswrapper[4773]: I0120 18:44:46.234370 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-l2dcx" Jan 20 18:44:46 crc kubenswrapper[4773]: I0120 18:44:46.269733 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-l2dcx" Jan 20 18:44:53 crc kubenswrapper[4773]: I0120 18:44:53.551304 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-zjdsq" Jan 20 18:44:53 crc kubenswrapper[4773]: I0120 18:44:53.551665 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-zjdsq" Jan 20 18:44:53 crc kubenswrapper[4773]: I0120 18:44:53.575917 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-zjdsq" Jan 20 18:44:54 crc kubenswrapper[4773]: I0120 18:44:54.543994 4773 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-zjdsq" Jan 20 18:44:56 crc kubenswrapper[4773]: I0120 18:44:56.237609 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-l2dcx" Jan 20 18:45:00 crc kubenswrapper[4773]: I0120 18:45:00.161661 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29482245-g5jp8"] Jan 20 18:45:00 crc kubenswrapper[4773]: E0120 18:45:00.162214 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b31a7f2a-24a2-429d-b654-9d87755f5812" containerName="registry-server" Jan 20 18:45:00 crc kubenswrapper[4773]: I0120 18:45:00.162230 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="b31a7f2a-24a2-429d-b654-9d87755f5812" containerName="registry-server" Jan 20 18:45:00 crc kubenswrapper[4773]: I0120 18:45:00.162365 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="b31a7f2a-24a2-429d-b654-9d87755f5812" containerName="registry-server" Jan 20 18:45:00 crc kubenswrapper[4773]: I0120 18:45:00.162761 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29482245-g5jp8" Jan 20 18:45:00 crc kubenswrapper[4773]: I0120 18:45:00.165114 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 20 18:45:00 crc kubenswrapper[4773]: I0120 18:45:00.165213 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 20 18:45:00 crc kubenswrapper[4773]: I0120 18:45:00.170098 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29482245-g5jp8"] Jan 20 18:45:00 crc kubenswrapper[4773]: I0120 18:45:00.301963 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzl9g\" (UniqueName: \"kubernetes.io/projected/a10b40f1-a7af-4ef6-ac5d-104e09a494d9-kube-api-access-wzl9g\") pod \"collect-profiles-29482245-g5jp8\" (UID: \"a10b40f1-a7af-4ef6-ac5d-104e09a494d9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482245-g5jp8" Jan 20 18:45:00 crc kubenswrapper[4773]: I0120 18:45:00.302014 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a10b40f1-a7af-4ef6-ac5d-104e09a494d9-secret-volume\") pod \"collect-profiles-29482245-g5jp8\" (UID: \"a10b40f1-a7af-4ef6-ac5d-104e09a494d9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482245-g5jp8" Jan 20 18:45:00 crc kubenswrapper[4773]: I0120 18:45:00.302037 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a10b40f1-a7af-4ef6-ac5d-104e09a494d9-config-volume\") pod \"collect-profiles-29482245-g5jp8\" (UID: \"a10b40f1-a7af-4ef6-ac5d-104e09a494d9\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29482245-g5jp8" Jan 20 18:45:00 crc kubenswrapper[4773]: I0120 18:45:00.403005 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wzl9g\" (UniqueName: \"kubernetes.io/projected/a10b40f1-a7af-4ef6-ac5d-104e09a494d9-kube-api-access-wzl9g\") pod \"collect-profiles-29482245-g5jp8\" (UID: \"a10b40f1-a7af-4ef6-ac5d-104e09a494d9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482245-g5jp8" Jan 20 18:45:00 crc kubenswrapper[4773]: I0120 18:45:00.403050 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a10b40f1-a7af-4ef6-ac5d-104e09a494d9-secret-volume\") pod \"collect-profiles-29482245-g5jp8\" (UID: \"a10b40f1-a7af-4ef6-ac5d-104e09a494d9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482245-g5jp8" Jan 20 18:45:00 crc kubenswrapper[4773]: I0120 18:45:00.403076 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a10b40f1-a7af-4ef6-ac5d-104e09a494d9-config-volume\") pod \"collect-profiles-29482245-g5jp8\" (UID: \"a10b40f1-a7af-4ef6-ac5d-104e09a494d9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482245-g5jp8" Jan 20 18:45:00 crc kubenswrapper[4773]: I0120 18:45:00.403847 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a10b40f1-a7af-4ef6-ac5d-104e09a494d9-config-volume\") pod \"collect-profiles-29482245-g5jp8\" (UID: \"a10b40f1-a7af-4ef6-ac5d-104e09a494d9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482245-g5jp8" Jan 20 18:45:00 crc kubenswrapper[4773]: I0120 18:45:00.409040 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/a10b40f1-a7af-4ef6-ac5d-104e09a494d9-secret-volume\") pod \"collect-profiles-29482245-g5jp8\" (UID: \"a10b40f1-a7af-4ef6-ac5d-104e09a494d9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482245-g5jp8" Jan 20 18:45:00 crc kubenswrapper[4773]: I0120 18:45:00.419000 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzl9g\" (UniqueName: \"kubernetes.io/projected/a10b40f1-a7af-4ef6-ac5d-104e09a494d9-kube-api-access-wzl9g\") pod \"collect-profiles-29482245-g5jp8\" (UID: \"a10b40f1-a7af-4ef6-ac5d-104e09a494d9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482245-g5jp8" Jan 20 18:45:00 crc kubenswrapper[4773]: I0120 18:45:00.477585 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29482245-g5jp8" Jan 20 18:45:00 crc kubenswrapper[4773]: I0120 18:45:00.881417 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29482245-g5jp8"] Jan 20 18:45:01 crc kubenswrapper[4773]: I0120 18:45:01.098079 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/1fa7137562a22aefab0f17b7f11474d7c5f30ed4d2dba8dd42e74a795drlmtp"] Jan 20 18:45:01 crc kubenswrapper[4773]: I0120 18:45:01.099792 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/1fa7137562a22aefab0f17b7f11474d7c5f30ed4d2dba8dd42e74a795drlmtp" Jan 20 18:45:01 crc kubenswrapper[4773]: I0120 18:45:01.104095 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-hhr9t" Jan 20 18:45:01 crc kubenswrapper[4773]: I0120 18:45:01.143337 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/1fa7137562a22aefab0f17b7f11474d7c5f30ed4d2dba8dd42e74a795drlmtp"] Jan 20 18:45:01 crc kubenswrapper[4773]: I0120 18:45:01.219879 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7af94832-1f61-43d7-9c56-bee4b2893499-util\") pod \"1fa7137562a22aefab0f17b7f11474d7c5f30ed4d2dba8dd42e74a795drlmtp\" (UID: \"7af94832-1f61-43d7-9c56-bee4b2893499\") " pod="openstack-operators/1fa7137562a22aefab0f17b7f11474d7c5f30ed4d2dba8dd42e74a795drlmtp" Jan 20 18:45:01 crc kubenswrapper[4773]: I0120 18:45:01.220488 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7af94832-1f61-43d7-9c56-bee4b2893499-bundle\") pod \"1fa7137562a22aefab0f17b7f11474d7c5f30ed4d2dba8dd42e74a795drlmtp\" (UID: \"7af94832-1f61-43d7-9c56-bee4b2893499\") " pod="openstack-operators/1fa7137562a22aefab0f17b7f11474d7c5f30ed4d2dba8dd42e74a795drlmtp" Jan 20 18:45:01 crc kubenswrapper[4773]: I0120 18:45:01.220533 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tkrv\" (UniqueName: \"kubernetes.io/projected/7af94832-1f61-43d7-9c56-bee4b2893499-kube-api-access-7tkrv\") pod \"1fa7137562a22aefab0f17b7f11474d7c5f30ed4d2dba8dd42e74a795drlmtp\" (UID: \"7af94832-1f61-43d7-9c56-bee4b2893499\") " pod="openstack-operators/1fa7137562a22aefab0f17b7f11474d7c5f30ed4d2dba8dd42e74a795drlmtp" Jan 20 18:45:01 crc kubenswrapper[4773]: I0120 
18:45:01.322570 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7af94832-1f61-43d7-9c56-bee4b2893499-util\") pod \"1fa7137562a22aefab0f17b7f11474d7c5f30ed4d2dba8dd42e74a795drlmtp\" (UID: \"7af94832-1f61-43d7-9c56-bee4b2893499\") " pod="openstack-operators/1fa7137562a22aefab0f17b7f11474d7c5f30ed4d2dba8dd42e74a795drlmtp" Jan 20 18:45:01 crc kubenswrapper[4773]: I0120 18:45:01.322669 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7af94832-1f61-43d7-9c56-bee4b2893499-bundle\") pod \"1fa7137562a22aefab0f17b7f11474d7c5f30ed4d2dba8dd42e74a795drlmtp\" (UID: \"7af94832-1f61-43d7-9c56-bee4b2893499\") " pod="openstack-operators/1fa7137562a22aefab0f17b7f11474d7c5f30ed4d2dba8dd42e74a795drlmtp" Jan 20 18:45:01 crc kubenswrapper[4773]: I0120 18:45:01.322720 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tkrv\" (UniqueName: \"kubernetes.io/projected/7af94832-1f61-43d7-9c56-bee4b2893499-kube-api-access-7tkrv\") pod \"1fa7137562a22aefab0f17b7f11474d7c5f30ed4d2dba8dd42e74a795drlmtp\" (UID: \"7af94832-1f61-43d7-9c56-bee4b2893499\") " pod="openstack-operators/1fa7137562a22aefab0f17b7f11474d7c5f30ed4d2dba8dd42e74a795drlmtp" Jan 20 18:45:01 crc kubenswrapper[4773]: I0120 18:45:01.323646 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7af94832-1f61-43d7-9c56-bee4b2893499-util\") pod \"1fa7137562a22aefab0f17b7f11474d7c5f30ed4d2dba8dd42e74a795drlmtp\" (UID: \"7af94832-1f61-43d7-9c56-bee4b2893499\") " pod="openstack-operators/1fa7137562a22aefab0f17b7f11474d7c5f30ed4d2dba8dd42e74a795drlmtp" Jan 20 18:45:01 crc kubenswrapper[4773]: I0120 18:45:01.323752 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/7af94832-1f61-43d7-9c56-bee4b2893499-bundle\") pod \"1fa7137562a22aefab0f17b7f11474d7c5f30ed4d2dba8dd42e74a795drlmtp\" (UID: \"7af94832-1f61-43d7-9c56-bee4b2893499\") " pod="openstack-operators/1fa7137562a22aefab0f17b7f11474d7c5f30ed4d2dba8dd42e74a795drlmtp" Jan 20 18:45:01 crc kubenswrapper[4773]: I0120 18:45:01.343395 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tkrv\" (UniqueName: \"kubernetes.io/projected/7af94832-1f61-43d7-9c56-bee4b2893499-kube-api-access-7tkrv\") pod \"1fa7137562a22aefab0f17b7f11474d7c5f30ed4d2dba8dd42e74a795drlmtp\" (UID: \"7af94832-1f61-43d7-9c56-bee4b2893499\") " pod="openstack-operators/1fa7137562a22aefab0f17b7f11474d7c5f30ed4d2dba8dd42e74a795drlmtp" Jan 20 18:45:01 crc kubenswrapper[4773]: I0120 18:45:01.415454 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/1fa7137562a22aefab0f17b7f11474d7c5f30ed4d2dba8dd42e74a795drlmtp" Jan 20 18:45:01 crc kubenswrapper[4773]: I0120 18:45:01.570671 4773 generic.go:334] "Generic (PLEG): container finished" podID="a10b40f1-a7af-4ef6-ac5d-104e09a494d9" containerID="5831be469a4fe2b76e1bccd6344f54cbf800b2124dcc48460a5c9ae662bb240a" exitCode=0 Jan 20 18:45:01 crc kubenswrapper[4773]: I0120 18:45:01.570717 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29482245-g5jp8" event={"ID":"a10b40f1-a7af-4ef6-ac5d-104e09a494d9","Type":"ContainerDied","Data":"5831be469a4fe2b76e1bccd6344f54cbf800b2124dcc48460a5c9ae662bb240a"} Jan 20 18:45:01 crc kubenswrapper[4773]: I0120 18:45:01.570777 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29482245-g5jp8" event={"ID":"a10b40f1-a7af-4ef6-ac5d-104e09a494d9","Type":"ContainerStarted","Data":"756c29baa988fe6135249d0d3ce51053c12085bbd50e5ebf33b9880861d8bf8c"} Jan 20 18:45:01 crc kubenswrapper[4773]: I0120 18:45:01.641787 4773 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/1fa7137562a22aefab0f17b7f11474d7c5f30ed4d2dba8dd42e74a795drlmtp"] Jan 20 18:45:01 crc kubenswrapper[4773]: W0120 18:45:01.647170 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7af94832_1f61_43d7_9c56_bee4b2893499.slice/crio-35992aac50bddc625f77e223078d241078c98a712d55a801c259fab1cdbba9a0 WatchSource:0}: Error finding container 35992aac50bddc625f77e223078d241078c98a712d55a801c259fab1cdbba9a0: Status 404 returned error can't find the container with id 35992aac50bddc625f77e223078d241078c98a712d55a801c259fab1cdbba9a0 Jan 20 18:45:02 crc kubenswrapper[4773]: I0120 18:45:02.581824 4773 generic.go:334] "Generic (PLEG): container finished" podID="7af94832-1f61-43d7-9c56-bee4b2893499" containerID="80a792e8b4d2fb6110c2b6e18c2dfba91dd588be2f16ec01190c2c2dd2b34518" exitCode=0 Jan 20 18:45:02 crc kubenswrapper[4773]: I0120 18:45:02.581944 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/1fa7137562a22aefab0f17b7f11474d7c5f30ed4d2dba8dd42e74a795drlmtp" event={"ID":"7af94832-1f61-43d7-9c56-bee4b2893499","Type":"ContainerDied","Data":"80a792e8b4d2fb6110c2b6e18c2dfba91dd588be2f16ec01190c2c2dd2b34518"} Jan 20 18:45:02 crc kubenswrapper[4773]: I0120 18:45:02.582281 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/1fa7137562a22aefab0f17b7f11474d7c5f30ed4d2dba8dd42e74a795drlmtp" event={"ID":"7af94832-1f61-43d7-9c56-bee4b2893499","Type":"ContainerStarted","Data":"35992aac50bddc625f77e223078d241078c98a712d55a801c259fab1cdbba9a0"} Jan 20 18:45:02 crc kubenswrapper[4773]: I0120 18:45:02.816857 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29482245-g5jp8" Jan 20 18:45:02 crc kubenswrapper[4773]: I0120 18:45:02.949920 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wzl9g\" (UniqueName: \"kubernetes.io/projected/a10b40f1-a7af-4ef6-ac5d-104e09a494d9-kube-api-access-wzl9g\") pod \"a10b40f1-a7af-4ef6-ac5d-104e09a494d9\" (UID: \"a10b40f1-a7af-4ef6-ac5d-104e09a494d9\") " Jan 20 18:45:02 crc kubenswrapper[4773]: I0120 18:45:02.950016 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a10b40f1-a7af-4ef6-ac5d-104e09a494d9-secret-volume\") pod \"a10b40f1-a7af-4ef6-ac5d-104e09a494d9\" (UID: \"a10b40f1-a7af-4ef6-ac5d-104e09a494d9\") " Jan 20 18:45:02 crc kubenswrapper[4773]: I0120 18:45:02.950113 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a10b40f1-a7af-4ef6-ac5d-104e09a494d9-config-volume\") pod \"a10b40f1-a7af-4ef6-ac5d-104e09a494d9\" (UID: \"a10b40f1-a7af-4ef6-ac5d-104e09a494d9\") " Jan 20 18:45:02 crc kubenswrapper[4773]: I0120 18:45:02.951023 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a10b40f1-a7af-4ef6-ac5d-104e09a494d9-config-volume" (OuterVolumeSpecName: "config-volume") pod "a10b40f1-a7af-4ef6-ac5d-104e09a494d9" (UID: "a10b40f1-a7af-4ef6-ac5d-104e09a494d9"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:45:02 crc kubenswrapper[4773]: I0120 18:45:02.955175 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a10b40f1-a7af-4ef6-ac5d-104e09a494d9-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "a10b40f1-a7af-4ef6-ac5d-104e09a494d9" (UID: "a10b40f1-a7af-4ef6-ac5d-104e09a494d9"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:45:02 crc kubenswrapper[4773]: I0120 18:45:02.955859 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a10b40f1-a7af-4ef6-ac5d-104e09a494d9-kube-api-access-wzl9g" (OuterVolumeSpecName: "kube-api-access-wzl9g") pod "a10b40f1-a7af-4ef6-ac5d-104e09a494d9" (UID: "a10b40f1-a7af-4ef6-ac5d-104e09a494d9"). InnerVolumeSpecName "kube-api-access-wzl9g". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:45:03 crc kubenswrapper[4773]: I0120 18:45:03.051571 4773 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a10b40f1-a7af-4ef6-ac5d-104e09a494d9-config-volume\") on node \"crc\" DevicePath \"\"" Jan 20 18:45:03 crc kubenswrapper[4773]: I0120 18:45:03.051837 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wzl9g\" (UniqueName: \"kubernetes.io/projected/a10b40f1-a7af-4ef6-ac5d-104e09a494d9-kube-api-access-wzl9g\") on node \"crc\" DevicePath \"\"" Jan 20 18:45:03 crc kubenswrapper[4773]: I0120 18:45:03.051924 4773 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a10b40f1-a7af-4ef6-ac5d-104e09a494d9-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 20 18:45:03 crc kubenswrapper[4773]: I0120 18:45:03.590505 4773 generic.go:334] "Generic (PLEG): container finished" podID="7af94832-1f61-43d7-9c56-bee4b2893499" containerID="d6c65610baa679afefd44d4caeaf591a2ae640a1333da42d63ebea862cecba29" exitCode=0 Jan 20 18:45:03 crc kubenswrapper[4773]: I0120 18:45:03.590582 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/1fa7137562a22aefab0f17b7f11474d7c5f30ed4d2dba8dd42e74a795drlmtp" event={"ID":"7af94832-1f61-43d7-9c56-bee4b2893499","Type":"ContainerDied","Data":"d6c65610baa679afefd44d4caeaf591a2ae640a1333da42d63ebea862cecba29"} Jan 20 18:45:03 crc kubenswrapper[4773]: I0120 
18:45:03.592067 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29482245-g5jp8" event={"ID":"a10b40f1-a7af-4ef6-ac5d-104e09a494d9","Type":"ContainerDied","Data":"756c29baa988fe6135249d0d3ce51053c12085bbd50e5ebf33b9880861d8bf8c"} Jan 20 18:45:03 crc kubenswrapper[4773]: I0120 18:45:03.592351 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="756c29baa988fe6135249d0d3ce51053c12085bbd50e5ebf33b9880861d8bf8c" Jan 20 18:45:03 crc kubenswrapper[4773]: I0120 18:45:03.592137 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29482245-g5jp8" Jan 20 18:45:04 crc kubenswrapper[4773]: I0120 18:45:04.600392 4773 generic.go:334] "Generic (PLEG): container finished" podID="7af94832-1f61-43d7-9c56-bee4b2893499" containerID="9eb398dc71f25470e2af328a2029d77747e5efc30044ab80307ded8d3dd0cc4d" exitCode=0 Jan 20 18:45:04 crc kubenswrapper[4773]: I0120 18:45:04.600441 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/1fa7137562a22aefab0f17b7f11474d7c5f30ed4d2dba8dd42e74a795drlmtp" event={"ID":"7af94832-1f61-43d7-9c56-bee4b2893499","Type":"ContainerDied","Data":"9eb398dc71f25470e2af328a2029d77747e5efc30044ab80307ded8d3dd0cc4d"} Jan 20 18:45:05 crc kubenswrapper[4773]: I0120 18:45:05.851613 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/1fa7137562a22aefab0f17b7f11474d7c5f30ed4d2dba8dd42e74a795drlmtp" Jan 20 18:45:05 crc kubenswrapper[4773]: I0120 18:45:05.992373 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7tkrv\" (UniqueName: \"kubernetes.io/projected/7af94832-1f61-43d7-9c56-bee4b2893499-kube-api-access-7tkrv\") pod \"7af94832-1f61-43d7-9c56-bee4b2893499\" (UID: \"7af94832-1f61-43d7-9c56-bee4b2893499\") " Jan 20 18:45:05 crc kubenswrapper[4773]: I0120 18:45:05.992457 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7af94832-1f61-43d7-9c56-bee4b2893499-bundle\") pod \"7af94832-1f61-43d7-9c56-bee4b2893499\" (UID: \"7af94832-1f61-43d7-9c56-bee4b2893499\") " Jan 20 18:45:05 crc kubenswrapper[4773]: I0120 18:45:05.992495 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7af94832-1f61-43d7-9c56-bee4b2893499-util\") pod \"7af94832-1f61-43d7-9c56-bee4b2893499\" (UID: \"7af94832-1f61-43d7-9c56-bee4b2893499\") " Jan 20 18:45:05 crc kubenswrapper[4773]: I0120 18:45:05.993345 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7af94832-1f61-43d7-9c56-bee4b2893499-bundle" (OuterVolumeSpecName: "bundle") pod "7af94832-1f61-43d7-9c56-bee4b2893499" (UID: "7af94832-1f61-43d7-9c56-bee4b2893499"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:45:05 crc kubenswrapper[4773]: I0120 18:45:05.998917 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7af94832-1f61-43d7-9c56-bee4b2893499-kube-api-access-7tkrv" (OuterVolumeSpecName: "kube-api-access-7tkrv") pod "7af94832-1f61-43d7-9c56-bee4b2893499" (UID: "7af94832-1f61-43d7-9c56-bee4b2893499"). InnerVolumeSpecName "kube-api-access-7tkrv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:45:06 crc kubenswrapper[4773]: I0120 18:45:06.006295 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7af94832-1f61-43d7-9c56-bee4b2893499-util" (OuterVolumeSpecName: "util") pod "7af94832-1f61-43d7-9c56-bee4b2893499" (UID: "7af94832-1f61-43d7-9c56-bee4b2893499"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:45:06 crc kubenswrapper[4773]: I0120 18:45:06.094140 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7tkrv\" (UniqueName: \"kubernetes.io/projected/7af94832-1f61-43d7-9c56-bee4b2893499-kube-api-access-7tkrv\") on node \"crc\" DevicePath \"\"" Jan 20 18:45:06 crc kubenswrapper[4773]: I0120 18:45:06.094186 4773 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7af94832-1f61-43d7-9c56-bee4b2893499-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 18:45:06 crc kubenswrapper[4773]: I0120 18:45:06.094195 4773 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7af94832-1f61-43d7-9c56-bee4b2893499-util\") on node \"crc\" DevicePath \"\"" Jan 20 18:45:06 crc kubenswrapper[4773]: I0120 18:45:06.613233 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/1fa7137562a22aefab0f17b7f11474d7c5f30ed4d2dba8dd42e74a795drlmtp" event={"ID":"7af94832-1f61-43d7-9c56-bee4b2893499","Type":"ContainerDied","Data":"35992aac50bddc625f77e223078d241078c98a712d55a801c259fab1cdbba9a0"} Jan 20 18:45:06 crc kubenswrapper[4773]: I0120 18:45:06.613487 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="35992aac50bddc625f77e223078d241078c98a712d55a801c259fab1cdbba9a0" Jan 20 18:45:06 crc kubenswrapper[4773]: I0120 18:45:06.613488 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/1fa7137562a22aefab0f17b7f11474d7c5f30ed4d2dba8dd42e74a795drlmtp" Jan 20 18:45:13 crc kubenswrapper[4773]: I0120 18:45:13.670441 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-5df999bcf5-pztzb"] Jan 20 18:45:13 crc kubenswrapper[4773]: E0120 18:45:13.671498 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7af94832-1f61-43d7-9c56-bee4b2893499" containerName="extract" Jan 20 18:45:13 crc kubenswrapper[4773]: I0120 18:45:13.671513 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="7af94832-1f61-43d7-9c56-bee4b2893499" containerName="extract" Jan 20 18:45:13 crc kubenswrapper[4773]: E0120 18:45:13.671527 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7af94832-1f61-43d7-9c56-bee4b2893499" containerName="pull" Jan 20 18:45:13 crc kubenswrapper[4773]: I0120 18:45:13.671535 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="7af94832-1f61-43d7-9c56-bee4b2893499" containerName="pull" Jan 20 18:45:13 crc kubenswrapper[4773]: E0120 18:45:13.671555 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7af94832-1f61-43d7-9c56-bee4b2893499" containerName="util" Jan 20 18:45:13 crc kubenswrapper[4773]: I0120 18:45:13.671562 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="7af94832-1f61-43d7-9c56-bee4b2893499" containerName="util" Jan 20 18:45:13 crc kubenswrapper[4773]: E0120 18:45:13.671579 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a10b40f1-a7af-4ef6-ac5d-104e09a494d9" containerName="collect-profiles" Jan 20 18:45:13 crc kubenswrapper[4773]: I0120 18:45:13.671587 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="a10b40f1-a7af-4ef6-ac5d-104e09a494d9" containerName="collect-profiles" Jan 20 18:45:13 crc kubenswrapper[4773]: I0120 18:45:13.671903 4773 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="7af94832-1f61-43d7-9c56-bee4b2893499" containerName="extract" Jan 20 18:45:13 crc kubenswrapper[4773]: I0120 18:45:13.671917 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="a10b40f1-a7af-4ef6-ac5d-104e09a494d9" containerName="collect-profiles" Jan 20 18:45:13 crc kubenswrapper[4773]: I0120 18:45:13.672455 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-5df999bcf5-pztzb" Jan 20 18:45:13 crc kubenswrapper[4773]: I0120 18:45:13.675210 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-kcdp2" Jan 20 18:45:13 crc kubenswrapper[4773]: I0120 18:45:13.697238 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-5df999bcf5-pztzb"] Jan 20 18:45:13 crc kubenswrapper[4773]: I0120 18:45:13.700223 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-77g4q\" (UniqueName: \"kubernetes.io/projected/e2d598ad-b9fa-4874-8669-688e18171e82-kube-api-access-77g4q\") pod \"openstack-operator-controller-init-5df999bcf5-pztzb\" (UID: \"e2d598ad-b9fa-4874-8669-688e18171e82\") " pod="openstack-operators/openstack-operator-controller-init-5df999bcf5-pztzb" Jan 20 18:45:13 crc kubenswrapper[4773]: I0120 18:45:13.801778 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-77g4q\" (UniqueName: \"kubernetes.io/projected/e2d598ad-b9fa-4874-8669-688e18171e82-kube-api-access-77g4q\") pod \"openstack-operator-controller-init-5df999bcf5-pztzb\" (UID: \"e2d598ad-b9fa-4874-8669-688e18171e82\") " pod="openstack-operators/openstack-operator-controller-init-5df999bcf5-pztzb" Jan 20 18:45:13 crc kubenswrapper[4773]: I0120 18:45:13.819350 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-77g4q\" 
(UniqueName: \"kubernetes.io/projected/e2d598ad-b9fa-4874-8669-688e18171e82-kube-api-access-77g4q\") pod \"openstack-operator-controller-init-5df999bcf5-pztzb\" (UID: \"e2d598ad-b9fa-4874-8669-688e18171e82\") " pod="openstack-operators/openstack-operator-controller-init-5df999bcf5-pztzb" Jan 20 18:45:13 crc kubenswrapper[4773]: I0120 18:45:13.990706 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-5df999bcf5-pztzb" Jan 20 18:45:14 crc kubenswrapper[4773]: I0120 18:45:14.428507 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-5df999bcf5-pztzb"] Jan 20 18:45:14 crc kubenswrapper[4773]: I0120 18:45:14.668877 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-5df999bcf5-pztzb" event={"ID":"e2d598ad-b9fa-4874-8669-688e18171e82","Type":"ContainerStarted","Data":"c42e2500d599fdfe3654d8d789dd7a2fedce4d30bedbec7ab2fd053c0a49e225"} Jan 20 18:45:19 crc kubenswrapper[4773]: I0120 18:45:19.721230 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-5df999bcf5-pztzb" event={"ID":"e2d598ad-b9fa-4874-8669-688e18171e82","Type":"ContainerStarted","Data":"4c1017e21a7625c060953cf413a3f1b3e7cfa23d20bd2908dd0e6748c717ff7c"} Jan 20 18:45:19 crc kubenswrapper[4773]: I0120 18:45:19.721800 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-5df999bcf5-pztzb" Jan 20 18:45:19 crc kubenswrapper[4773]: I0120 18:45:19.749377 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-5df999bcf5-pztzb" podStartSLOduration=2.404513313 podStartE2EDuration="6.749359924s" podCreationTimestamp="2026-01-20 18:45:13 +0000 UTC" firstStartedPulling="2026-01-20 18:45:14.436480666 +0000 
UTC m=+907.358293690" lastFinishedPulling="2026-01-20 18:45:18.781327277 +0000 UTC m=+911.703140301" observedRunningTime="2026-01-20 18:45:19.743210178 +0000 UTC m=+912.665023202" watchObservedRunningTime="2026-01-20 18:45:19.749359924 +0000 UTC m=+912.671172948" Jan 20 18:45:23 crc kubenswrapper[4773]: I0120 18:45:23.993850 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-5df999bcf5-pztzb" Jan 20 18:45:25 crc kubenswrapper[4773]: I0120 18:45:25.753786 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-tz7cr"] Jan 20 18:45:25 crc kubenswrapper[4773]: I0120 18:45:25.755203 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tz7cr" Jan 20 18:45:25 crc kubenswrapper[4773]: I0120 18:45:25.767533 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tz7cr"] Jan 20 18:45:25 crc kubenswrapper[4773]: I0120 18:45:25.863879 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0bf6917d-4c23-4e7c-8969-822309492cfb-utilities\") pod \"community-operators-tz7cr\" (UID: \"0bf6917d-4c23-4e7c-8969-822309492cfb\") " pod="openshift-marketplace/community-operators-tz7cr" Jan 20 18:45:25 crc kubenswrapper[4773]: I0120 18:45:25.863949 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0bf6917d-4c23-4e7c-8969-822309492cfb-catalog-content\") pod \"community-operators-tz7cr\" (UID: \"0bf6917d-4c23-4e7c-8969-822309492cfb\") " pod="openshift-marketplace/community-operators-tz7cr" Jan 20 18:45:25 crc kubenswrapper[4773]: I0120 18:45:25.864156 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-njzb4\" (UniqueName: \"kubernetes.io/projected/0bf6917d-4c23-4e7c-8969-822309492cfb-kube-api-access-njzb4\") pod \"community-operators-tz7cr\" (UID: \"0bf6917d-4c23-4e7c-8969-822309492cfb\") " pod="openshift-marketplace/community-operators-tz7cr" Jan 20 18:45:25 crc kubenswrapper[4773]: I0120 18:45:25.965267 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0bf6917d-4c23-4e7c-8969-822309492cfb-utilities\") pod \"community-operators-tz7cr\" (UID: \"0bf6917d-4c23-4e7c-8969-822309492cfb\") " pod="openshift-marketplace/community-operators-tz7cr" Jan 20 18:45:25 crc kubenswrapper[4773]: I0120 18:45:25.965322 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0bf6917d-4c23-4e7c-8969-822309492cfb-catalog-content\") pod \"community-operators-tz7cr\" (UID: \"0bf6917d-4c23-4e7c-8969-822309492cfb\") " pod="openshift-marketplace/community-operators-tz7cr" Jan 20 18:45:25 crc kubenswrapper[4773]: I0120 18:45:25.965386 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-njzb4\" (UniqueName: \"kubernetes.io/projected/0bf6917d-4c23-4e7c-8969-822309492cfb-kube-api-access-njzb4\") pod \"community-operators-tz7cr\" (UID: \"0bf6917d-4c23-4e7c-8969-822309492cfb\") " pod="openshift-marketplace/community-operators-tz7cr" Jan 20 18:45:25 crc kubenswrapper[4773]: I0120 18:45:25.965838 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0bf6917d-4c23-4e7c-8969-822309492cfb-utilities\") pod \"community-operators-tz7cr\" (UID: \"0bf6917d-4c23-4e7c-8969-822309492cfb\") " pod="openshift-marketplace/community-operators-tz7cr" Jan 20 18:45:25 crc kubenswrapper[4773]: I0120 18:45:25.966002 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/0bf6917d-4c23-4e7c-8969-822309492cfb-catalog-content\") pod \"community-operators-tz7cr\" (UID: \"0bf6917d-4c23-4e7c-8969-822309492cfb\") " pod="openshift-marketplace/community-operators-tz7cr" Jan 20 18:45:25 crc kubenswrapper[4773]: I0120 18:45:25.987596 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-njzb4\" (UniqueName: \"kubernetes.io/projected/0bf6917d-4c23-4e7c-8969-822309492cfb-kube-api-access-njzb4\") pod \"community-operators-tz7cr\" (UID: \"0bf6917d-4c23-4e7c-8969-822309492cfb\") " pod="openshift-marketplace/community-operators-tz7cr" Jan 20 18:45:26 crc kubenswrapper[4773]: I0120 18:45:26.072387 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tz7cr" Jan 20 18:45:26 crc kubenswrapper[4773]: I0120 18:45:26.308846 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-tz7cr"] Jan 20 18:45:26 crc kubenswrapper[4773]: I0120 18:45:26.772241 4773 generic.go:334] "Generic (PLEG): container finished" podID="0bf6917d-4c23-4e7c-8969-822309492cfb" containerID="4ddca14adbce5be072f9b92b87a5667402e713114719422e02ba35474806682c" exitCode=0 Jan 20 18:45:26 crc kubenswrapper[4773]: I0120 18:45:26.772346 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tz7cr" event={"ID":"0bf6917d-4c23-4e7c-8969-822309492cfb","Type":"ContainerDied","Data":"4ddca14adbce5be072f9b92b87a5667402e713114719422e02ba35474806682c"} Jan 20 18:45:26 crc kubenswrapper[4773]: I0120 18:45:26.772411 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tz7cr" event={"ID":"0bf6917d-4c23-4e7c-8969-822309492cfb","Type":"ContainerStarted","Data":"e2e74be584b696f3c79864fc6b26b1d3bf64e57b5e05f544cf9d23da32d16275"} Jan 20 18:45:28 crc kubenswrapper[4773]: I0120 18:45:28.169801 4773 patch_prober.go:28] 
interesting pod/machine-config-daemon-sq4x7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 18:45:28 crc kubenswrapper[4773]: I0120 18:45:28.170288 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 18:45:29 crc kubenswrapper[4773]: I0120 18:45:29.795958 4773 generic.go:334] "Generic (PLEG): container finished" podID="0bf6917d-4c23-4e7c-8969-822309492cfb" containerID="dd8478b39d7704adb175f07bf58714724b8d70c8cd53cfb3b9705b66192ec7bf" exitCode=0 Jan 20 18:45:29 crc kubenswrapper[4773]: I0120 18:45:29.796011 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tz7cr" event={"ID":"0bf6917d-4c23-4e7c-8969-822309492cfb","Type":"ContainerDied","Data":"dd8478b39d7704adb175f07bf58714724b8d70c8cd53cfb3b9705b66192ec7bf"} Jan 20 18:45:30 crc kubenswrapper[4773]: I0120 18:45:30.804480 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tz7cr" event={"ID":"0bf6917d-4c23-4e7c-8969-822309492cfb","Type":"ContainerStarted","Data":"e9c98af5c773e820be7a8444773310c692cdbae44f9ce415f11acd85b44f1bd9"} Jan 20 18:45:30 crc kubenswrapper[4773]: I0120 18:45:30.821639 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-tz7cr" podStartSLOduration=2.15320677 podStartE2EDuration="5.821621924s" podCreationTimestamp="2026-01-20 18:45:25 +0000 UTC" firstStartedPulling="2026-01-20 18:45:26.774547956 +0000 UTC m=+919.696360980" lastFinishedPulling="2026-01-20 18:45:30.4429631 
+0000 UTC m=+923.364776134" observedRunningTime="2026-01-20 18:45:30.81981886 +0000 UTC m=+923.741631904" watchObservedRunningTime="2026-01-20 18:45:30.821621924 +0000 UTC m=+923.743434948" Jan 20 18:45:36 crc kubenswrapper[4773]: I0120 18:45:36.074352 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-tz7cr" Jan 20 18:45:36 crc kubenswrapper[4773]: I0120 18:45:36.074870 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-tz7cr" Jan 20 18:45:36 crc kubenswrapper[4773]: I0120 18:45:36.111358 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-tz7cr" Jan 20 18:45:36 crc kubenswrapper[4773]: I0120 18:45:36.893598 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-tz7cr" Jan 20 18:45:38 crc kubenswrapper[4773]: I0120 18:45:38.349030 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tz7cr"] Jan 20 18:45:38 crc kubenswrapper[4773]: I0120 18:45:38.860993 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-tz7cr" podUID="0bf6917d-4c23-4e7c-8969-822309492cfb" containerName="registry-server" containerID="cri-o://e9c98af5c773e820be7a8444773310c692cdbae44f9ce415f11acd85b44f1bd9" gracePeriod=2 Jan 20 18:45:39 crc kubenswrapper[4773]: I0120 18:45:39.833972 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-tz7cr"
Jan 20 18:45:39 crc kubenswrapper[4773]: I0120 18:45:39.871329 4773 generic.go:334] "Generic (PLEG): container finished" podID="0bf6917d-4c23-4e7c-8969-822309492cfb" containerID="e9c98af5c773e820be7a8444773310c692cdbae44f9ce415f11acd85b44f1bd9" exitCode=0
Jan 20 18:45:39 crc kubenswrapper[4773]: I0120 18:45:39.871408 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-tz7cr"
Jan 20 18:45:39 crc kubenswrapper[4773]: I0120 18:45:39.871409 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tz7cr" event={"ID":"0bf6917d-4c23-4e7c-8969-822309492cfb","Type":"ContainerDied","Data":"e9c98af5c773e820be7a8444773310c692cdbae44f9ce415f11acd85b44f1bd9"}
Jan 20 18:45:39 crc kubenswrapper[4773]: I0120 18:45:39.871558 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-tz7cr" event={"ID":"0bf6917d-4c23-4e7c-8969-822309492cfb","Type":"ContainerDied","Data":"e2e74be584b696f3c79864fc6b26b1d3bf64e57b5e05f544cf9d23da32d16275"}
Jan 20 18:45:39 crc kubenswrapper[4773]: I0120 18:45:39.871580 4773 scope.go:117] "RemoveContainer" containerID="e9c98af5c773e820be7a8444773310c692cdbae44f9ce415f11acd85b44f1bd9"
Jan 20 18:45:39 crc kubenswrapper[4773]: I0120 18:45:39.898665 4773 scope.go:117] "RemoveContainer" containerID="dd8478b39d7704adb175f07bf58714724b8d70c8cd53cfb3b9705b66192ec7bf"
Jan 20 18:45:39 crc kubenswrapper[4773]: I0120 18:45:39.923510 4773 scope.go:117] "RemoveContainer" containerID="4ddca14adbce5be072f9b92b87a5667402e713114719422e02ba35474806682c"
Jan 20 18:45:39 crc kubenswrapper[4773]: I0120 18:45:39.942844 4773 scope.go:117] "RemoveContainer" containerID="e9c98af5c773e820be7a8444773310c692cdbae44f9ce415f11acd85b44f1bd9"
Jan 20 18:45:39 crc kubenswrapper[4773]: E0120 18:45:39.943748 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9c98af5c773e820be7a8444773310c692cdbae44f9ce415f11acd85b44f1bd9\": container with ID starting with e9c98af5c773e820be7a8444773310c692cdbae44f9ce415f11acd85b44f1bd9 not found: ID does not exist" containerID="e9c98af5c773e820be7a8444773310c692cdbae44f9ce415f11acd85b44f1bd9"
Jan 20 18:45:39 crc kubenswrapper[4773]: I0120 18:45:39.943787 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9c98af5c773e820be7a8444773310c692cdbae44f9ce415f11acd85b44f1bd9"} err="failed to get container status \"e9c98af5c773e820be7a8444773310c692cdbae44f9ce415f11acd85b44f1bd9\": rpc error: code = NotFound desc = could not find container \"e9c98af5c773e820be7a8444773310c692cdbae44f9ce415f11acd85b44f1bd9\": container with ID starting with e9c98af5c773e820be7a8444773310c692cdbae44f9ce415f11acd85b44f1bd9 not found: ID does not exist"
Jan 20 18:45:39 crc kubenswrapper[4773]: I0120 18:45:39.943814 4773 scope.go:117] "RemoveContainer" containerID="dd8478b39d7704adb175f07bf58714724b8d70c8cd53cfb3b9705b66192ec7bf"
Jan 20 18:45:39 crc kubenswrapper[4773]: E0120 18:45:39.944163 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd8478b39d7704adb175f07bf58714724b8d70c8cd53cfb3b9705b66192ec7bf\": container with ID starting with dd8478b39d7704adb175f07bf58714724b8d70c8cd53cfb3b9705b66192ec7bf not found: ID does not exist" containerID="dd8478b39d7704adb175f07bf58714724b8d70c8cd53cfb3b9705b66192ec7bf"
Jan 20 18:45:39 crc kubenswrapper[4773]: I0120 18:45:39.944192 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd8478b39d7704adb175f07bf58714724b8d70c8cd53cfb3b9705b66192ec7bf"} err="failed to get container status \"dd8478b39d7704adb175f07bf58714724b8d70c8cd53cfb3b9705b66192ec7bf\": rpc error: code = NotFound desc = could not find container \"dd8478b39d7704adb175f07bf58714724b8d70c8cd53cfb3b9705b66192ec7bf\": container with ID starting with dd8478b39d7704adb175f07bf58714724b8d70c8cd53cfb3b9705b66192ec7bf not found: ID does not exist"
Jan 20 18:45:39 crc kubenswrapper[4773]: I0120 18:45:39.944210 4773 scope.go:117] "RemoveContainer" containerID="4ddca14adbce5be072f9b92b87a5667402e713114719422e02ba35474806682c"
Jan 20 18:45:39 crc kubenswrapper[4773]: E0120 18:45:39.944513 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4ddca14adbce5be072f9b92b87a5667402e713114719422e02ba35474806682c\": container with ID starting with 4ddca14adbce5be072f9b92b87a5667402e713114719422e02ba35474806682c not found: ID does not exist" containerID="4ddca14adbce5be072f9b92b87a5667402e713114719422e02ba35474806682c"
Jan 20 18:45:39 crc kubenswrapper[4773]: I0120 18:45:39.944542 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4ddca14adbce5be072f9b92b87a5667402e713114719422e02ba35474806682c"} err="failed to get container status \"4ddca14adbce5be072f9b92b87a5667402e713114719422e02ba35474806682c\": rpc error: code = NotFound desc = could not find container \"4ddca14adbce5be072f9b92b87a5667402e713114719422e02ba35474806682c\": container with ID starting with 4ddca14adbce5be072f9b92b87a5667402e713114719422e02ba35474806682c not found: ID does not exist"
Jan 20 18:45:39 crc kubenswrapper[4773]: I0120 18:45:39.966024 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0bf6917d-4c23-4e7c-8969-822309492cfb-utilities\") pod \"0bf6917d-4c23-4e7c-8969-822309492cfb\" (UID: \"0bf6917d-4c23-4e7c-8969-822309492cfb\") "
Jan 20 18:45:39 crc kubenswrapper[4773]: I0120 18:45:39.966116 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-njzb4\" (UniqueName: \"kubernetes.io/projected/0bf6917d-4c23-4e7c-8969-822309492cfb-kube-api-access-njzb4\") pod \"0bf6917d-4c23-4e7c-8969-822309492cfb\" (UID: \"0bf6917d-4c23-4e7c-8969-822309492cfb\") "
Jan 20 18:45:39 crc kubenswrapper[4773]: I0120 18:45:39.966161 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0bf6917d-4c23-4e7c-8969-822309492cfb-catalog-content\") pod \"0bf6917d-4c23-4e7c-8969-822309492cfb\" (UID: \"0bf6917d-4c23-4e7c-8969-822309492cfb\") "
Jan 20 18:45:39 crc kubenswrapper[4773]: I0120 18:45:39.966981 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0bf6917d-4c23-4e7c-8969-822309492cfb-utilities" (OuterVolumeSpecName: "utilities") pod "0bf6917d-4c23-4e7c-8969-822309492cfb" (UID: "0bf6917d-4c23-4e7c-8969-822309492cfb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 20 18:45:39 crc kubenswrapper[4773]: I0120 18:45:39.973429 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0bf6917d-4c23-4e7c-8969-822309492cfb-kube-api-access-njzb4" (OuterVolumeSpecName: "kube-api-access-njzb4") pod "0bf6917d-4c23-4e7c-8969-822309492cfb" (UID: "0bf6917d-4c23-4e7c-8969-822309492cfb"). InnerVolumeSpecName "kube-api-access-njzb4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 20 18:45:40 crc kubenswrapper[4773]: I0120 18:45:40.025250 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0bf6917d-4c23-4e7c-8969-822309492cfb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0bf6917d-4c23-4e7c-8969-822309492cfb" (UID: "0bf6917d-4c23-4e7c-8969-822309492cfb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 20 18:45:40 crc kubenswrapper[4773]: I0120 18:45:40.068576 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0bf6917d-4c23-4e7c-8969-822309492cfb-utilities\") on node \"crc\" DevicePath \"\""
Jan 20 18:45:40 crc kubenswrapper[4773]: I0120 18:45:40.068622 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-njzb4\" (UniqueName: \"kubernetes.io/projected/0bf6917d-4c23-4e7c-8969-822309492cfb-kube-api-access-njzb4\") on node \"crc\" DevicePath \"\""
Jan 20 18:45:40 crc kubenswrapper[4773]: I0120 18:45:40.068637 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0bf6917d-4c23-4e7c-8969-822309492cfb-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 20 18:45:40 crc kubenswrapper[4773]: I0120 18:45:40.210828 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-tz7cr"]
Jan 20 18:45:40 crc kubenswrapper[4773]: I0120 18:45:40.216195 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-tz7cr"]
Jan 20 18:45:41 crc kubenswrapper[4773]: I0120 18:45:41.456524 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0bf6917d-4c23-4e7c-8969-822309492cfb" path="/var/lib/kubelet/pods/0bf6917d-4c23-4e7c-8969-822309492cfb/volumes"
Jan 20 18:45:58 crc kubenswrapper[4773]: I0120 18:45:58.170304 4773 patch_prober.go:28] interesting pod/machine-config-daemon-sq4x7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 20 18:45:58 crc kubenswrapper[4773]: I0120 18:45:58.170903 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.700165 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-9b68f5989-xmljc"]
Jan 20 18:46:01 crc kubenswrapper[4773]: E0120 18:46:01.700808 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bf6917d-4c23-4e7c-8969-822309492cfb" containerName="registry-server"
Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.700829 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bf6917d-4c23-4e7c-8969-822309492cfb" containerName="registry-server"
Jan 20 18:46:01 crc kubenswrapper[4773]: E0120 18:46:01.700861 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bf6917d-4c23-4e7c-8969-822309492cfb" containerName="extract-content"
Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.700871 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bf6917d-4c23-4e7c-8969-822309492cfb" containerName="extract-content"
Jan 20 18:46:01 crc kubenswrapper[4773]: E0120 18:46:01.700897 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0bf6917d-4c23-4e7c-8969-822309492cfb" containerName="extract-utilities"
Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.700908 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bf6917d-4c23-4e7c-8969-822309492cfb" containerName="extract-utilities"
Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.701101 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="0bf6917d-4c23-4e7c-8969-822309492cfb" containerName="registry-server"
Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.701601 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-9b68f5989-xmljc"
Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.703396 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-45r8r"
Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.706894 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7ddb5c749-hhxlp"]
Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.707995 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7ddb5c749-hhxlp"
Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.711551 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-2j2hv"
Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.719474 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-9b68f5989-xmljc"]
Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.724651 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7ddb5c749-hhxlp"]
Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.733196 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-9f958b845-v4q7f"]
Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.734288 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-9f958b845-v4q7f"
Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.737331 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-5khm4"
Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.749628 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-9f958b845-v4q7f"]
Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.754142 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqbnd\" (UniqueName: \"kubernetes.io/projected/df2d6d5b-b964-4672-903f-563b7792ee43-kube-api-access-bqbnd\") pod \"barbican-operator-controller-manager-7ddb5c749-hhxlp\" (UID: \"df2d6d5b-b964-4672-903f-563b7792ee43\") " pod="openstack-operators/barbican-operator-controller-manager-7ddb5c749-hhxlp"
Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.754209 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mf6fs\" (UniqueName: \"kubernetes.io/projected/48aacb32-c120-4f36-898b-60f5d01c5510-kube-api-access-mf6fs\") pod \"cinder-operator-controller-manager-9b68f5989-xmljc\" (UID: \"48aacb32-c120-4f36-898b-60f5d01c5510\") " pod="openstack-operators/cinder-operator-controller-manager-9b68f5989-xmljc"
Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.754254 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vrnm\" (UniqueName: \"kubernetes.io/projected/ac02d392-7ff9-42e1-ad6f-47ab9f04a9a7-kube-api-access-6vrnm\") pod \"designate-operator-controller-manager-9f958b845-v4q7f\" (UID: \"ac02d392-7ff9-42e1-ad6f-47ab9f04a9a7\") " pod="openstack-operators/designate-operator-controller-manager-9f958b845-v4q7f"
Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.768125 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-594c8c9d5d-vjfdq"]
Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.768850 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-vjfdq"
Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.771708 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-r8n8d"
Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.781738 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-c6994669c-4kk2r"]
Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.782827 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-c6994669c-4kk2r"
Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.786783 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-qn28m"
Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.796299 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-594c8c9d5d-vjfdq"]
Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.799304 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-c6994669c-4kk2r"]
Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.819619 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-77d5c5b54f-blxqv"]
Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.830479 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-blxqv"
Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.839524 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-nx6wx"
Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.855387 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vrnm\" (UniqueName: \"kubernetes.io/projected/ac02d392-7ff9-42e1-ad6f-47ab9f04a9a7-kube-api-access-6vrnm\") pod \"designate-operator-controller-manager-9f958b845-v4q7f\" (UID: \"ac02d392-7ff9-42e1-ad6f-47ab9f04a9a7\") " pod="openstack-operators/designate-operator-controller-manager-9f958b845-v4q7f"
Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.855446 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqbnd\" (UniqueName: \"kubernetes.io/projected/df2d6d5b-b964-4672-903f-563b7792ee43-kube-api-access-bqbnd\") pod \"barbican-operator-controller-manager-7ddb5c749-hhxlp\" (UID: \"df2d6d5b-b964-4672-903f-563b7792ee43\") " pod="openstack-operators/barbican-operator-controller-manager-7ddb5c749-hhxlp"
Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.855503 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mf6fs\" (UniqueName: \"kubernetes.io/projected/48aacb32-c120-4f36-898b-60f5d01c5510-kube-api-access-mf6fs\") pod \"cinder-operator-controller-manager-9b68f5989-xmljc\" (UID: \"48aacb32-c120-4f36-898b-60f5d01c5510\") " pod="openstack-operators/cinder-operator-controller-manager-9b68f5989-xmljc"
Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.862011 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-77d5c5b54f-blxqv"]
Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.873972 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-77c48c7859-hqsb9"]
Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.874775 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-77c48c7859-hqsb9"
Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.890783 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mf6fs\" (UniqueName: \"kubernetes.io/projected/48aacb32-c120-4f36-898b-60f5d01c5510-kube-api-access-mf6fs\") pod \"cinder-operator-controller-manager-9b68f5989-xmljc\" (UID: \"48aacb32-c120-4f36-898b-60f5d01c5510\") " pod="openstack-operators/cinder-operator-controller-manager-9b68f5989-xmljc"
Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.892674 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-hr4gt"
Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.892839 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert"
Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.892971 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-78757b4889-2nhdr"]
Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.894891 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-78757b4889-2nhdr"
Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.906651 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqbnd\" (UniqueName: \"kubernetes.io/projected/df2d6d5b-b964-4672-903f-563b7792ee43-kube-api-access-bqbnd\") pod \"barbican-operator-controller-manager-7ddb5c749-hhxlp\" (UID: \"df2d6d5b-b964-4672-903f-563b7792ee43\") " pod="openstack-operators/barbican-operator-controller-manager-7ddb5c749-hhxlp"
Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.908373 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vrnm\" (UniqueName: \"kubernetes.io/projected/ac02d392-7ff9-42e1-ad6f-47ab9f04a9a7-kube-api-access-6vrnm\") pod \"designate-operator-controller-manager-9f958b845-v4q7f\" (UID: \"ac02d392-7ff9-42e1-ad6f-47ab9f04a9a7\") " pod="openstack-operators/designate-operator-controller-manager-9f958b845-v4q7f"
Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.913622 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-767fdc4f47-8tsjs"]
Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.917877 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-767fdc4f47-8tsjs"
Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.923920 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-zb5cc"
Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.924713 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-gvm5n"
Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.958729 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnx2r\" (UniqueName: \"kubernetes.io/projected/d1051db2-8914-422b-a126-5cd8ee078767-kube-api-access-tnx2r\") pod \"horizon-operator-controller-manager-77d5c5b54f-blxqv\" (UID: \"d1051db2-8914-422b-a126-5cd8ee078767\") " pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-blxqv"
Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.958787 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7cd8\" (UniqueName: \"kubernetes.io/projected/951d4f5c-5d89-41c6-be8a-9828b05ce182-kube-api-access-k7cd8\") pod \"ironic-operator-controller-manager-78757b4889-2nhdr\" (UID: \"951d4f5c-5d89-41c6-be8a-9828b05ce182\") " pod="openstack-operators/ironic-operator-controller-manager-78757b4889-2nhdr"
Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.958819 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/437cadd4-5809-4b9e-afa2-05832cd6c303-cert\") pod \"infra-operator-controller-manager-77c48c7859-hqsb9\" (UID: \"437cadd4-5809-4b9e-afa2-05832cd6c303\") " pod="openstack-operators/infra-operator-controller-manager-77c48c7859-hqsb9"
Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.958850 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tfgwd\" (UniqueName: \"kubernetes.io/projected/437cadd4-5809-4b9e-afa2-05832cd6c303-kube-api-access-tfgwd\") pod \"infra-operator-controller-manager-77c48c7859-hqsb9\" (UID: \"437cadd4-5809-4b9e-afa2-05832cd6c303\") " pod="openstack-operators/infra-operator-controller-manager-77c48c7859-hqsb9"
Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.958897 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbzzl\" (UniqueName: \"kubernetes.io/projected/b773ecb8-3505-44ad-a28f-bd4054263888-kube-api-access-pbzzl\") pod \"keystone-operator-controller-manager-767fdc4f47-8tsjs\" (UID: \"b773ecb8-3505-44ad-a28f-bd4054263888\") " pod="openstack-operators/keystone-operator-controller-manager-767fdc4f47-8tsjs"
Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.958941 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htkjc\" (UniqueName: \"kubernetes.io/projected/a570d5a5-53f4-444f-a14d-92ea24f27e2e-kube-api-access-htkjc\") pod \"heat-operator-controller-manager-594c8c9d5d-vjfdq\" (UID: \"a570d5a5-53f4-444f-a14d-92ea24f27e2e\") " pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-vjfdq"
Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.958986 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdzrg\" (UniqueName: \"kubernetes.io/projected/4604c39e-62d8-4420-b2bc-54d44f4ebcd0-kube-api-access-fdzrg\") pod \"glance-operator-controller-manager-c6994669c-4kk2r\" (UID: \"4604c39e-62d8-4420-b2bc-54d44f4ebcd0\") " pod="openstack-operators/glance-operator-controller-manager-c6994669c-4kk2r"
Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.975026 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-77c48c7859-hqsb9"]
Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.979297 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-767fdc4f47-8tsjs"]
Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.987270 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-78757b4889-2nhdr"]
Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.989622 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-864f6b75bf-mqjmm"]
Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.990755 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-864f6b75bf-mqjmm"
Jan 20 18:46:01 crc kubenswrapper[4773]: I0120 18:46:01.994981 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-864f6b75bf-mqjmm"]
Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.009399 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-j66lb"
Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.022750 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-9b68f5989-xmljc"
Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.037460 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-7ddb5c749-hhxlp"
Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.052239 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-9f958b845-v4q7f"
Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.052740 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-c87fff755-s7scg"]
Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.053998 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-s7scg"
Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.061032 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnx2r\" (UniqueName: \"kubernetes.io/projected/d1051db2-8914-422b-a126-5cd8ee078767-kube-api-access-tnx2r\") pod \"horizon-operator-controller-manager-77d5c5b54f-blxqv\" (UID: \"d1051db2-8914-422b-a126-5cd8ee078767\") " pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-blxqv"
Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.061097 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7cd8\" (UniqueName: \"kubernetes.io/projected/951d4f5c-5d89-41c6-be8a-9828b05ce182-kube-api-access-k7cd8\") pod \"ironic-operator-controller-manager-78757b4889-2nhdr\" (UID: \"951d4f5c-5d89-41c6-be8a-9828b05ce182\") " pod="openstack-operators/ironic-operator-controller-manager-78757b4889-2nhdr"
Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.061125 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/437cadd4-5809-4b9e-afa2-05832cd6c303-cert\") pod \"infra-operator-controller-manager-77c48c7859-hqsb9\" (UID: \"437cadd4-5809-4b9e-afa2-05832cd6c303\") " pod="openstack-operators/infra-operator-controller-manager-77c48c7859-hqsb9"
Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.061145 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tfgwd\" (UniqueName: \"kubernetes.io/projected/437cadd4-5809-4b9e-afa2-05832cd6c303-kube-api-access-tfgwd\") pod \"infra-operator-controller-manager-77c48c7859-hqsb9\" (UID: \"437cadd4-5809-4b9e-afa2-05832cd6c303\") " pod="openstack-operators/infra-operator-controller-manager-77c48c7859-hqsb9"
Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.061188 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbzzl\" (UniqueName: \"kubernetes.io/projected/b773ecb8-3505-44ad-a28f-bd4054263888-kube-api-access-pbzzl\") pod \"keystone-operator-controller-manager-767fdc4f47-8tsjs\" (UID: \"b773ecb8-3505-44ad-a28f-bd4054263888\") " pod="openstack-operators/keystone-operator-controller-manager-767fdc4f47-8tsjs"
Jan 20 18:46:02 crc kubenswrapper[4773]: E0120 18:46:02.061862 4773 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Jan 20 18:46:02 crc kubenswrapper[4773]: E0120 18:46:02.061912 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/437cadd4-5809-4b9e-afa2-05832cd6c303-cert podName:437cadd4-5809-4b9e-afa2-05832cd6c303 nodeName:}" failed. No retries permitted until 2026-01-20 18:46:02.56189654 +0000 UTC m=+955.483709564 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/437cadd4-5809-4b9e-afa2-05832cd6c303-cert") pod "infra-operator-controller-manager-77c48c7859-hqsb9" (UID: "437cadd4-5809-4b9e-afa2-05832cd6c303") : secret "infra-operator-webhook-server-cert" not found
Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.062025 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htkjc\" (UniqueName: \"kubernetes.io/projected/a570d5a5-53f4-444f-a14d-92ea24f27e2e-kube-api-access-htkjc\") pod \"heat-operator-controller-manager-594c8c9d5d-vjfdq\" (UID: \"a570d5a5-53f4-444f-a14d-92ea24f27e2e\") " pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-vjfdq"
Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.062114 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdzrg\" (UniqueName: \"kubernetes.io/projected/4604c39e-62d8-4420-b2bc-54d44f4ebcd0-kube-api-access-fdzrg\") pod \"glance-operator-controller-manager-c6994669c-4kk2r\" (UID: \"4604c39e-62d8-4420-b2bc-54d44f4ebcd0\") " pod="openstack-operators/glance-operator-controller-manager-c6994669c-4kk2r"
Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.068271 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-dbxds"
Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.083112 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-cb4666565-hfwzv"]
Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.084060 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-cb4666565-hfwzv"
Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.085527 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbzzl\" (UniqueName: \"kubernetes.io/projected/b773ecb8-3505-44ad-a28f-bd4054263888-kube-api-access-pbzzl\") pod \"keystone-operator-controller-manager-767fdc4f47-8tsjs\" (UID: \"b773ecb8-3505-44ad-a28f-bd4054263888\") " pod="openstack-operators/keystone-operator-controller-manager-767fdc4f47-8tsjs"
Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.089779 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-plc9j"
Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.095725 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htkjc\" (UniqueName: \"kubernetes.io/projected/a570d5a5-53f4-444f-a14d-92ea24f27e2e-kube-api-access-htkjc\") pod \"heat-operator-controller-manager-594c8c9d5d-vjfdq\" (UID: \"a570d5a5-53f4-444f-a14d-92ea24f27e2e\") " pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-vjfdq"
Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.097555 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdzrg\" (UniqueName: \"kubernetes.io/projected/4604c39e-62d8-4420-b2bc-54d44f4ebcd0-kube-api-access-fdzrg\") pod \"glance-operator-controller-manager-c6994669c-4kk2r\" (UID: \"4604c39e-62d8-4420-b2bc-54d44f4ebcd0\") " pod="openstack-operators/glance-operator-controller-manager-c6994669c-4kk2r"
Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.100118 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7cd8\" (UniqueName: \"kubernetes.io/projected/951d4f5c-5d89-41c6-be8a-9828b05ce182-kube-api-access-k7cd8\") pod \"ironic-operator-controller-manager-78757b4889-2nhdr\" (UID: \"951d4f5c-5d89-41c6-be8a-9828b05ce182\") " pod="openstack-operators/ironic-operator-controller-manager-78757b4889-2nhdr"
Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.109125 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-c87fff755-s7scg"]
Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.118009 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-c6994669c-4kk2r"
Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.126014 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnx2r\" (UniqueName: \"kubernetes.io/projected/d1051db2-8914-422b-a126-5cd8ee078767-kube-api-access-tnx2r\") pod \"horizon-operator-controller-manager-77d5c5b54f-blxqv\" (UID: \"d1051db2-8914-422b-a126-5cd8ee078767\") " pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-blxqv"
Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.126098 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-cb4666565-hfwzv"]
Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.148202 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tfgwd\" (UniqueName: \"kubernetes.io/projected/437cadd4-5809-4b9e-afa2-05832cd6c303-kube-api-access-tfgwd\") pod \"infra-operator-controller-manager-77c48c7859-hqsb9\" (UID: \"437cadd4-5809-4b9e-afa2-05832cd6c303\") " pod="openstack-operators/infra-operator-controller-manager-77c48c7859-hqsb9"
Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.159879 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-blxqv"
Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.169497 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncbpd\" (UniqueName: \"kubernetes.io/projected/ed6d3389-b374-42a6-8101-1d34df737170-kube-api-access-ncbpd\") pod \"manila-operator-controller-manager-864f6b75bf-mqjmm\" (UID: \"ed6d3389-b374-42a6-8101-1d34df737170\") " pod="openstack-operators/manila-operator-controller-manager-864f6b75bf-mqjmm"
Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.169603 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cm7vp\" (UniqueName: \"kubernetes.io/projected/8f795216-0196-4a5a-bfdf-20dee1543b43-kube-api-access-cm7vp\") pod \"mariadb-operator-controller-manager-c87fff755-s7scg\" (UID: \"8f795216-0196-4a5a-bfdf-20dee1543b43\") " pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-s7scg"
Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.192111 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-sslnl"]
Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.195250 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-sslnl"
Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.203130 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-h7nkq"
Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.207068 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-65849867d6-prhbl"]
Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.208213 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-65849867d6-prhbl"
Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.229846 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-85qwb"
Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.244596 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-sslnl"]
Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.248495 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-65849867d6-prhbl"]
Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.271463 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9nm8\" (UniqueName: \"kubernetes.io/projected/ff53e5c0-255a-43c5-a27c-ce9dc3145999-kube-api-access-n9nm8\") pod \"octavia-operator-controller-manager-7fc9b76cf6-sslnl\" (UID: \"ff53e5c0-255a-43c5-a27c-ce9dc3145999\") " pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-sslnl"
Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.271764 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cm7vp\" (UniqueName: \"kubernetes.io/projected/8f795216-0196-4a5a-bfdf-20dee1543b43-kube-api-access-cm7vp\") pod \"mariadb-operator-controller-manager-c87fff755-s7scg\" (UID: \"8f795216-0196-4a5a-bfdf-20dee1543b43\") " pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-s7scg"
Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.271794 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52zcv\" (UniqueName: \"kubernetes.io/projected/b196e443-f058-49c2-b54b-a18656415f5a-kube-api-access-52zcv\") pod \"neutron-operator-controller-manager-cb4666565-hfwzv\" (UID: 
\"b196e443-f058-49c2-b54b-a18656415f5a\") " pod="openstack-operators/neutron-operator-controller-manager-cb4666565-hfwzv" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.271810 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwk42\" (UniqueName: \"kubernetes.io/projected/fb5406b5-d194-441a-a098-7ecdc7831ec1-kube-api-access-zwk42\") pod \"nova-operator-controller-manager-65849867d6-prhbl\" (UID: \"fb5406b5-d194-441a-a098-7ecdc7831ec1\") " pod="openstack-operators/nova-operator-controller-manager-65849867d6-prhbl" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.271833 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ncbpd\" (UniqueName: \"kubernetes.io/projected/ed6d3389-b374-42a6-8101-1d34df737170-kube-api-access-ncbpd\") pod \"manila-operator-controller-manager-864f6b75bf-mqjmm\" (UID: \"ed6d3389-b374-42a6-8101-1d34df737170\") " pod="openstack-operators/manila-operator-controller-manager-864f6b75bf-mqjmm" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.293049 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-78757b4889-2nhdr" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.308542 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cm7vp\" (UniqueName: \"kubernetes.io/projected/8f795216-0196-4a5a-bfdf-20dee1543b43-kube-api-access-cm7vp\") pod \"mariadb-operator-controller-manager-c87fff755-s7scg\" (UID: \"8f795216-0196-4a5a-bfdf-20dee1543b43\") " pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-s7scg" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.315002 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-767fdc4f47-8tsjs" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.329997 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-55db956ddc-6ngwx"] Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.331420 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-6ngwx" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.334958 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ncbpd\" (UniqueName: \"kubernetes.io/projected/ed6d3389-b374-42a6-8101-1d34df737170-kube-api-access-ncbpd\") pod \"manila-operator-controller-manager-864f6b75bf-mqjmm\" (UID: \"ed6d3389-b374-42a6-8101-1d34df737170\") " pod="openstack-operators/manila-operator-controller-manager-864f6b75bf-mqjmm" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.341336 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-tspl4" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.356527 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854m7knp"] Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.357475 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854m7knp" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.364747 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-t49mz" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.364938 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.378309 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9nm8\" (UniqueName: \"kubernetes.io/projected/ff53e5c0-255a-43c5-a27c-ce9dc3145999-kube-api-access-n9nm8\") pod \"octavia-operator-controller-manager-7fc9b76cf6-sslnl\" (UID: \"ff53e5c0-255a-43c5-a27c-ce9dc3145999\") " pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-sslnl" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.378430 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52zcv\" (UniqueName: \"kubernetes.io/projected/b196e443-f058-49c2-b54b-a18656415f5a-kube-api-access-52zcv\") pod \"neutron-operator-controller-manager-cb4666565-hfwzv\" (UID: \"b196e443-f058-49c2-b54b-a18656415f5a\") " pod="openstack-operators/neutron-operator-controller-manager-cb4666565-hfwzv" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.378453 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwk42\" (UniqueName: \"kubernetes.io/projected/fb5406b5-d194-441a-a098-7ecdc7831ec1-kube-api-access-zwk42\") pod \"nova-operator-controller-manager-65849867d6-prhbl\" (UID: \"fb5406b5-d194-441a-a098-7ecdc7831ec1\") " pod="openstack-operators/nova-operator-controller-manager-65849867d6-prhbl" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.382379 4773 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-686df47fcb-26j8t"] Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.383858 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-686df47fcb-26j8t" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.390714 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-25cgj" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.398654 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-vjfdq" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.400887 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-55db956ddc-6ngwx"] Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.406889 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854m7knp"] Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.425089 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-686df47fcb-26j8t"] Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.434090 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9nm8\" (UniqueName: \"kubernetes.io/projected/ff53e5c0-255a-43c5-a27c-ce9dc3145999-kube-api-access-n9nm8\") pod \"octavia-operator-controller-manager-7fc9b76cf6-sslnl\" (UID: \"ff53e5c0-255a-43c5-a27c-ce9dc3145999\") " pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-sslnl" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.434540 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52zcv\" (UniqueName: 
\"kubernetes.io/projected/b196e443-f058-49c2-b54b-a18656415f5a-kube-api-access-52zcv\") pod \"neutron-operator-controller-manager-cb4666565-hfwzv\" (UID: \"b196e443-f058-49c2-b54b-a18656415f5a\") " pod="openstack-operators/neutron-operator-controller-manager-cb4666565-hfwzv" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.435910 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-85dd56d4cc-t8tmg"] Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.436764 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-t8tmg" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.443362 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwk42\" (UniqueName: \"kubernetes.io/projected/fb5406b5-d194-441a-a098-7ecdc7831ec1-kube-api-access-zwk42\") pod \"nova-operator-controller-manager-65849867d6-prhbl\" (UID: \"fb5406b5-d194-441a-a098-7ecdc7831ec1\") " pod="openstack-operators/nova-operator-controller-manager-65849867d6-prhbl" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.446658 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-85dd56d4cc-t8tmg"] Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.462206 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-2thqw"] Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.463528 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-2thqw" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.466864 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-hx58c" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.476475 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-hd9xt" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.482112 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e9f6d4b3-c2cc-4cc6-b279-362e7439974b-cert\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b854m7knp\" (UID: \"e9f6d4b3-c2cc-4cc6-b279-362e7439974b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854m7knp" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.482183 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccx9m\" (UniqueName: \"kubernetes.io/projected/7ed73202-faba-46ba-ae91-8cd9ffbe70a4-kube-api-access-ccx9m\") pod \"telemetry-operator-controller-manager-5f8f495fcf-2thqw\" (UID: \"7ed73202-faba-46ba-ae91-8cd9ffbe70a4\") " pod="openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-2thqw" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.482213 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9f7z\" (UniqueName: \"kubernetes.io/projected/a1b3e0e3-f4c7-4b3d-9ba0-a198be108cb3-kube-api-access-q9f7z\") pod \"ovn-operator-controller-manager-55db956ddc-6ngwx\" (UID: \"a1b3e0e3-f4c7-4b3d-9ba0-a198be108cb3\") " pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-6ngwx" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 
18:46:02.482241 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjqfd\" (UniqueName: \"kubernetes.io/projected/e9f6d4b3-c2cc-4cc6-b279-362e7439974b-kube-api-access-fjqfd\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b854m7knp\" (UID: \"e9f6d4b3-c2cc-4cc6-b279-362e7439974b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854m7knp" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.482256 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bxh5\" (UniqueName: \"kubernetes.io/projected/9e235ee6-33ad-40e3-9b7a-914820315627-kube-api-access-5bxh5\") pod \"swift-operator-controller-manager-85dd56d4cc-t8tmg\" (UID: \"9e235ee6-33ad-40e3-9b7a-914820315627\") " pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-t8tmg" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.482284 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxc76\" (UniqueName: \"kubernetes.io/projected/2601732b-921a-4c55-821b-0fc994c50236-kube-api-access-xxc76\") pod \"placement-operator-controller-manager-686df47fcb-26j8t\" (UID: \"2601732b-921a-4c55-821b-0fc994c50236\") " pod="openstack-operators/placement-operator-controller-manager-686df47fcb-26j8t" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.493447 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-7f4549b895-p2vwt"] Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.494301 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-7f4549b895-p2vwt" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.499591 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-cb4666565-hfwzv" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.500311 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-s7scg" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.505357 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-5qzhx" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.530278 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-sslnl" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.546460 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-65849867d6-prhbl" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.583827 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e9f6d4b3-c2cc-4cc6-b279-362e7439974b-cert\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b854m7knp\" (UID: \"e9f6d4b3-c2cc-4cc6-b279-362e7439974b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854m7knp" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.583878 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/437cadd4-5809-4b9e-afa2-05832cd6c303-cert\") pod \"infra-operator-controller-manager-77c48c7859-hqsb9\" (UID: \"437cadd4-5809-4b9e-afa2-05832cd6c303\") " pod="openstack-operators/infra-operator-controller-manager-77c48c7859-hqsb9" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.583919 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ccx9m\" 
(UniqueName: \"kubernetes.io/projected/7ed73202-faba-46ba-ae91-8cd9ffbe70a4-kube-api-access-ccx9m\") pod \"telemetry-operator-controller-manager-5f8f495fcf-2thqw\" (UID: \"7ed73202-faba-46ba-ae91-8cd9ffbe70a4\") " pod="openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-2thqw" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.583986 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q9f7z\" (UniqueName: \"kubernetes.io/projected/a1b3e0e3-f4c7-4b3d-9ba0-a198be108cb3-kube-api-access-q9f7z\") pod \"ovn-operator-controller-manager-55db956ddc-6ngwx\" (UID: \"a1b3e0e3-f4c7-4b3d-9ba0-a198be108cb3\") " pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-6ngwx" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.584022 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjqfd\" (UniqueName: \"kubernetes.io/projected/e9f6d4b3-c2cc-4cc6-b279-362e7439974b-kube-api-access-fjqfd\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b854m7knp\" (UID: \"e9f6d4b3-c2cc-4cc6-b279-362e7439974b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854m7knp" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.584040 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5bxh5\" (UniqueName: \"kubernetes.io/projected/9e235ee6-33ad-40e3-9b7a-914820315627-kube-api-access-5bxh5\") pod \"swift-operator-controller-manager-85dd56d4cc-t8tmg\" (UID: \"9e235ee6-33ad-40e3-9b7a-914820315627\") " pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-t8tmg" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.584068 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxc76\" (UniqueName: \"kubernetes.io/projected/2601732b-921a-4c55-821b-0fc994c50236-kube-api-access-xxc76\") pod 
\"placement-operator-controller-manager-686df47fcb-26j8t\" (UID: \"2601732b-921a-4c55-821b-0fc994c50236\") " pod="openstack-operators/placement-operator-controller-manager-686df47fcb-26j8t" Jan 20 18:46:02 crc kubenswrapper[4773]: E0120 18:46:02.584923 4773 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 20 18:46:02 crc kubenswrapper[4773]: E0120 18:46:02.584981 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e9f6d4b3-c2cc-4cc6-b279-362e7439974b-cert podName:e9f6d4b3-c2cc-4cc6-b279-362e7439974b nodeName:}" failed. No retries permitted until 2026-01-20 18:46:03.084968149 +0000 UTC m=+956.006781173 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e9f6d4b3-c2cc-4cc6-b279-362e7439974b-cert") pod "openstack-baremetal-operator-controller-manager-6b68b8b854m7knp" (UID: "e9f6d4b3-c2cc-4cc6-b279-362e7439974b") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 20 18:46:02 crc kubenswrapper[4773]: E0120 18:46:02.585119 4773 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 20 18:46:02 crc kubenswrapper[4773]: E0120 18:46:02.585148 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/437cadd4-5809-4b9e-afa2-05832cd6c303-cert podName:437cadd4-5809-4b9e-afa2-05832cd6c303 nodeName:}" failed. No retries permitted until 2026-01-20 18:46:03.585140244 +0000 UTC m=+956.506953268 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/437cadd4-5809-4b9e-afa2-05832cd6c303-cert") pod "infra-operator-controller-manager-77c48c7859-hqsb9" (UID: "437cadd4-5809-4b9e-afa2-05832cd6c303") : secret "infra-operator-webhook-server-cert" not found Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.642719 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-864f6b75bf-mqjmm" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.652265 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bxh5\" (UniqueName: \"kubernetes.io/projected/9e235ee6-33ad-40e3-9b7a-914820315627-kube-api-access-5bxh5\") pod \"swift-operator-controller-manager-85dd56d4cc-t8tmg\" (UID: \"9e235ee6-33ad-40e3-9b7a-914820315627\") " pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-t8tmg" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.657594 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxc76\" (UniqueName: \"kubernetes.io/projected/2601732b-921a-4c55-821b-0fc994c50236-kube-api-access-xxc76\") pod \"placement-operator-controller-manager-686df47fcb-26j8t\" (UID: \"2601732b-921a-4c55-821b-0fc994c50236\") " pod="openstack-operators/placement-operator-controller-manager-686df47fcb-26j8t" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.663578 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9f7z\" (UniqueName: \"kubernetes.io/projected/a1b3e0e3-f4c7-4b3d-9ba0-a198be108cb3-kube-api-access-q9f7z\") pod \"ovn-operator-controller-manager-55db956ddc-6ngwx\" (UID: \"a1b3e0e3-f4c7-4b3d-9ba0-a198be108cb3\") " pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-6ngwx" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.665580 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-fjqfd\" (UniqueName: \"kubernetes.io/projected/e9f6d4b3-c2cc-4cc6-b279-362e7439974b-kube-api-access-fjqfd\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b854m7knp\" (UID: \"e9f6d4b3-c2cc-4cc6-b279-362e7439974b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854m7knp" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.672617 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ccx9m\" (UniqueName: \"kubernetes.io/projected/7ed73202-faba-46ba-ae91-8cd9ffbe70a4-kube-api-access-ccx9m\") pod \"telemetry-operator-controller-manager-5f8f495fcf-2thqw\" (UID: \"7ed73202-faba-46ba-ae91-8cd9ffbe70a4\") " pod="openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-2thqw" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.684735 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-7f4549b895-p2vwt"] Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.685721 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwhn7\" (UniqueName: \"kubernetes.io/projected/cfba823f-e85e-42ae-aa8a-7926cc906b92-kube-api-access-lwhn7\") pod \"test-operator-controller-manager-7f4549b895-p2vwt\" (UID: \"cfba823f-e85e-42ae-aa8a-7926cc906b92\") " pod="openstack-operators/test-operator-controller-manager-7f4549b895-p2vwt" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.706861 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-686df47fcb-26j8t" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.739352 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-2thqw"] Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.786793 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwhn7\" (UniqueName: \"kubernetes.io/projected/cfba823f-e85e-42ae-aa8a-7926cc906b92-kube-api-access-lwhn7\") pod \"test-operator-controller-manager-7f4549b895-p2vwt\" (UID: \"cfba823f-e85e-42ae-aa8a-7926cc906b92\") " pod="openstack-operators/test-operator-controller-manager-7f4549b895-p2vwt" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.789846 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-64cd966744-nhqxg"] Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.791106 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-64cd966744-nhqxg" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.804410 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-rhtgx" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.814400 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-64cd966744-nhqxg"] Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.832741 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwhn7\" (UniqueName: \"kubernetes.io/projected/cfba823f-e85e-42ae-aa8a-7926cc906b92-kube-api-access-lwhn7\") pod \"test-operator-controller-manager-7f4549b895-p2vwt\" (UID: \"cfba823f-e85e-42ae-aa8a-7926cc906b92\") " pod="openstack-operators/test-operator-controller-manager-7f4549b895-p2vwt" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.848164 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-t8tmg" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.893189 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcl7c\" (UniqueName: \"kubernetes.io/projected/7f740208-043d-4d7f-b533-5526833d10c2-kube-api-access-lcl7c\") pod \"watcher-operator-controller-manager-64cd966744-nhqxg\" (UID: \"7f740208-043d-4d7f-b533-5526833d10c2\") " pod="openstack-operators/watcher-operator-controller-manager-64cd966744-nhqxg" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.902483 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-2thqw" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.949568 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-674cd49df-nnf4r"] Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.950524 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-674cd49df-nnf4r" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.953393 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-lx94s" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.954285 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.954406 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.958953 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-7f4549b895-p2vwt" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.960075 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-674cd49df-nnf4r"] Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.961560 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-6ngwx" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.969337 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-r5qgh"] Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.970701 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-r5qgh" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.972619 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-2wj4c" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.973628 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-r5qgh"] Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.994987 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/86d68359-5910-4d1d-8a01-2964f8d26464-metrics-certs\") pod \"openstack-operator-controller-manager-674cd49df-nnf4r\" (UID: \"86d68359-5910-4d1d-8a01-2964f8d26464\") " pod="openstack-operators/openstack-operator-controller-manager-674cd49df-nnf4r" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.995072 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-77d5c5b54f-blxqv"] Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.995094 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26sx8\" (UniqueName: \"kubernetes.io/projected/99558a40-3dbc-4c2b-9aab-a085c7ef5c7c-kube-api-access-26sx8\") pod \"rabbitmq-cluster-operator-manager-668c99d594-r5qgh\" (UID: \"99558a40-3dbc-4c2b-9aab-a085c7ef5c7c\") " 
pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-r5qgh" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.995148 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lcl7c\" (UniqueName: \"kubernetes.io/projected/7f740208-043d-4d7f-b533-5526833d10c2-kube-api-access-lcl7c\") pod \"watcher-operator-controller-manager-64cd966744-nhqxg\" (UID: \"7f740208-043d-4d7f-b533-5526833d10c2\") " pod="openstack-operators/watcher-operator-controller-manager-64cd966744-nhqxg" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.995255 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/86d68359-5910-4d1d-8a01-2964f8d26464-webhook-certs\") pod \"openstack-operator-controller-manager-674cd49df-nnf4r\" (UID: \"86d68359-5910-4d1d-8a01-2964f8d26464\") " pod="openstack-operators/openstack-operator-controller-manager-674cd49df-nnf4r" Jan 20 18:46:02 crc kubenswrapper[4773]: I0120 18:46:02.995827 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgkjx\" (UniqueName: \"kubernetes.io/projected/86d68359-5910-4d1d-8a01-2964f8d26464-kube-api-access-pgkjx\") pod \"openstack-operator-controller-manager-674cd49df-nnf4r\" (UID: \"86d68359-5910-4d1d-8a01-2964f8d26464\") " pod="openstack-operators/openstack-operator-controller-manager-674cd49df-nnf4r" Jan 20 18:46:03 crc kubenswrapper[4773]: I0120 18:46:03.027769 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcl7c\" (UniqueName: \"kubernetes.io/projected/7f740208-043d-4d7f-b533-5526833d10c2-kube-api-access-lcl7c\") pod \"watcher-operator-controller-manager-64cd966744-nhqxg\" (UID: \"7f740208-043d-4d7f-b533-5526833d10c2\") " pod="openstack-operators/watcher-operator-controller-manager-64cd966744-nhqxg" Jan 20 18:46:03 crc kubenswrapper[4773]: I0120 18:46:03.029137 
4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-9b68f5989-xmljc"] Jan 20 18:46:03 crc kubenswrapper[4773]: I0120 18:46:03.097358 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e9f6d4b3-c2cc-4cc6-b279-362e7439974b-cert\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b854m7knp\" (UID: \"e9f6d4b3-c2cc-4cc6-b279-362e7439974b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854m7knp" Jan 20 18:46:03 crc kubenswrapper[4773]: I0120 18:46:03.097433 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/86d68359-5910-4d1d-8a01-2964f8d26464-metrics-certs\") pod \"openstack-operator-controller-manager-674cd49df-nnf4r\" (UID: \"86d68359-5910-4d1d-8a01-2964f8d26464\") " pod="openstack-operators/openstack-operator-controller-manager-674cd49df-nnf4r" Jan 20 18:46:03 crc kubenswrapper[4773]: I0120 18:46:03.097474 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26sx8\" (UniqueName: \"kubernetes.io/projected/99558a40-3dbc-4c2b-9aab-a085c7ef5c7c-kube-api-access-26sx8\") pod \"rabbitmq-cluster-operator-manager-668c99d594-r5qgh\" (UID: \"99558a40-3dbc-4c2b-9aab-a085c7ef5c7c\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-r5qgh" Jan 20 18:46:03 crc kubenswrapper[4773]: I0120 18:46:03.097494 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/86d68359-5910-4d1d-8a01-2964f8d26464-webhook-certs\") pod \"openstack-operator-controller-manager-674cd49df-nnf4r\" (UID: \"86d68359-5910-4d1d-8a01-2964f8d26464\") " pod="openstack-operators/openstack-operator-controller-manager-674cd49df-nnf4r" Jan 20 18:46:03 crc kubenswrapper[4773]: I0120 18:46:03.097532 4773 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgkjx\" (UniqueName: \"kubernetes.io/projected/86d68359-5910-4d1d-8a01-2964f8d26464-kube-api-access-pgkjx\") pod \"openstack-operator-controller-manager-674cd49df-nnf4r\" (UID: \"86d68359-5910-4d1d-8a01-2964f8d26464\") " pod="openstack-operators/openstack-operator-controller-manager-674cd49df-nnf4r" Jan 20 18:46:03 crc kubenswrapper[4773]: E0120 18:46:03.097990 4773 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 20 18:46:03 crc kubenswrapper[4773]: E0120 18:46:03.098035 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e9f6d4b3-c2cc-4cc6-b279-362e7439974b-cert podName:e9f6d4b3-c2cc-4cc6-b279-362e7439974b nodeName:}" failed. No retries permitted until 2026-01-20 18:46:04.098020962 +0000 UTC m=+957.019833986 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e9f6d4b3-c2cc-4cc6-b279-362e7439974b-cert") pod "openstack-baremetal-operator-controller-manager-6b68b8b854m7knp" (UID: "e9f6d4b3-c2cc-4cc6-b279-362e7439974b") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 20 18:46:03 crc kubenswrapper[4773]: E0120 18:46:03.098308 4773 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 20 18:46:03 crc kubenswrapper[4773]: E0120 18:46:03.098339 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86d68359-5910-4d1d-8a01-2964f8d26464-metrics-certs podName:86d68359-5910-4d1d-8a01-2964f8d26464 nodeName:}" failed. No retries permitted until 2026-01-20 18:46:03.598329399 +0000 UTC m=+956.520142423 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/86d68359-5910-4d1d-8a01-2964f8d26464-metrics-certs") pod "openstack-operator-controller-manager-674cd49df-nnf4r" (UID: "86d68359-5910-4d1d-8a01-2964f8d26464") : secret "metrics-server-cert" not found Jan 20 18:46:03 crc kubenswrapper[4773]: E0120 18:46:03.098524 4773 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 20 18:46:03 crc kubenswrapper[4773]: E0120 18:46:03.098551 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86d68359-5910-4d1d-8a01-2964f8d26464-webhook-certs podName:86d68359-5910-4d1d-8a01-2964f8d26464 nodeName:}" failed. No retries permitted until 2026-01-20 18:46:03.598543594 +0000 UTC m=+956.520356618 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/86d68359-5910-4d1d-8a01-2964f8d26464-webhook-certs") pod "openstack-operator-controller-manager-674cd49df-nnf4r" (UID: "86d68359-5910-4d1d-8a01-2964f8d26464") : secret "webhook-server-cert" not found Jan 20 18:46:03 crc kubenswrapper[4773]: I0120 18:46:03.128887 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgkjx\" (UniqueName: \"kubernetes.io/projected/86d68359-5910-4d1d-8a01-2964f8d26464-kube-api-access-pgkjx\") pod \"openstack-operator-controller-manager-674cd49df-nnf4r\" (UID: \"86d68359-5910-4d1d-8a01-2964f8d26464\") " pod="openstack-operators/openstack-operator-controller-manager-674cd49df-nnf4r" Jan 20 18:46:03 crc kubenswrapper[4773]: I0120 18:46:03.136631 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26sx8\" (UniqueName: \"kubernetes.io/projected/99558a40-3dbc-4c2b-9aab-a085c7ef5c7c-kube-api-access-26sx8\") pod \"rabbitmq-cluster-operator-manager-668c99d594-r5qgh\" (UID: \"99558a40-3dbc-4c2b-9aab-a085c7ef5c7c\") " 
pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-r5qgh" Jan 20 18:46:03 crc kubenswrapper[4773]: I0120 18:46:03.158634 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-64cd966744-nhqxg" Jan 20 18:46:03 crc kubenswrapper[4773]: I0120 18:46:03.168163 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-r5qgh" Jan 20 18:46:03 crc kubenswrapper[4773]: I0120 18:46:03.198058 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-7ddb5c749-hhxlp"] Jan 20 18:46:03 crc kubenswrapper[4773]: I0120 18:46:03.213735 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-c6994669c-4kk2r"] Jan 20 18:46:03 crc kubenswrapper[4773]: I0120 18:46:03.385392 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-78757b4889-2nhdr"] Jan 20 18:46:03 crc kubenswrapper[4773]: I0120 18:46:03.407000 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-9f958b845-v4q7f"] Jan 20 18:46:03 crc kubenswrapper[4773]: W0120 18:46:03.419609 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac02d392_7ff9_42e1_ad6f_47ab9f04a9a7.slice/crio-46702d5d2c2107aed9c9b1805c64748f7ed225d753bce50eb0376ab54557e413 WatchSource:0}: Error finding container 46702d5d2c2107aed9c9b1805c64748f7ed225d753bce50eb0376ab54557e413: Status 404 returned error can't find the container with id 46702d5d2c2107aed9c9b1805c64748f7ed225d753bce50eb0376ab54557e413 Jan 20 18:46:03 crc kubenswrapper[4773]: I0120 18:46:03.601649 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/neutron-operator-controller-manager-cb4666565-hfwzv"] Jan 20 18:46:03 crc kubenswrapper[4773]: I0120 18:46:03.608510 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/86d68359-5910-4d1d-8a01-2964f8d26464-metrics-certs\") pod \"openstack-operator-controller-manager-674cd49df-nnf4r\" (UID: \"86d68359-5910-4d1d-8a01-2964f8d26464\") " pod="openstack-operators/openstack-operator-controller-manager-674cd49df-nnf4r" Jan 20 18:46:03 crc kubenswrapper[4773]: I0120 18:46:03.608588 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/86d68359-5910-4d1d-8a01-2964f8d26464-webhook-certs\") pod \"openstack-operator-controller-manager-674cd49df-nnf4r\" (UID: \"86d68359-5910-4d1d-8a01-2964f8d26464\") " pod="openstack-operators/openstack-operator-controller-manager-674cd49df-nnf4r" Jan 20 18:46:03 crc kubenswrapper[4773]: I0120 18:46:03.608714 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/437cadd4-5809-4b9e-afa2-05832cd6c303-cert\") pod \"infra-operator-controller-manager-77c48c7859-hqsb9\" (UID: \"437cadd4-5809-4b9e-afa2-05832cd6c303\") " pod="openstack-operators/infra-operator-controller-manager-77c48c7859-hqsb9" Jan 20 18:46:03 crc kubenswrapper[4773]: E0120 18:46:03.608828 4773 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 20 18:46:03 crc kubenswrapper[4773]: E0120 18:46:03.608897 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86d68359-5910-4d1d-8a01-2964f8d26464-webhook-certs podName:86d68359-5910-4d1d-8a01-2964f8d26464 nodeName:}" failed. No retries permitted until 2026-01-20 18:46:04.608879492 +0000 UTC m=+957.530692516 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/86d68359-5910-4d1d-8a01-2964f8d26464-webhook-certs") pod "openstack-operator-controller-manager-674cd49df-nnf4r" (UID: "86d68359-5910-4d1d-8a01-2964f8d26464") : secret "webhook-server-cert" not found Jan 20 18:46:03 crc kubenswrapper[4773]: E0120 18:46:03.608828 4773 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 20 18:46:03 crc kubenswrapper[4773]: E0120 18:46:03.608828 4773 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 20 18:46:03 crc kubenswrapper[4773]: E0120 18:46:03.609100 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86d68359-5910-4d1d-8a01-2964f8d26464-metrics-certs podName:86d68359-5910-4d1d-8a01-2964f8d26464 nodeName:}" failed. No retries permitted until 2026-01-20 18:46:04.608980665 +0000 UTC m=+957.530793689 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/86d68359-5910-4d1d-8a01-2964f8d26464-metrics-certs") pod "openstack-operator-controller-manager-674cd49df-nnf4r" (UID: "86d68359-5910-4d1d-8a01-2964f8d26464") : secret "metrics-server-cert" not found Jan 20 18:46:03 crc kubenswrapper[4773]: E0120 18:46:03.609126 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/437cadd4-5809-4b9e-afa2-05832cd6c303-cert podName:437cadd4-5809-4b9e-afa2-05832cd6c303 nodeName:}" failed. No retries permitted until 2026-01-20 18:46:05.609118368 +0000 UTC m=+958.530931392 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/437cadd4-5809-4b9e-afa2-05832cd6c303-cert") pod "infra-operator-controller-manager-77c48c7859-hqsb9" (UID: "437cadd4-5809-4b9e-afa2-05832cd6c303") : secret "infra-operator-webhook-server-cert" not found Jan 20 18:46:03 crc kubenswrapper[4773]: I0120 18:46:03.618847 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-767fdc4f47-8tsjs"] Jan 20 18:46:03 crc kubenswrapper[4773]: W0120 18:46:03.619765 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb5406b5_d194_441a_a098_7ecdc7831ec1.slice/crio-bcd7f44e8872eac34a1e41b5926c3dd3179ad0b6c7c2d1118e935283c3861a0a WatchSource:0}: Error finding container bcd7f44e8872eac34a1e41b5926c3dd3179ad0b6c7c2d1118e935283c3861a0a: Status 404 returned error can't find the container with id bcd7f44e8872eac34a1e41b5926c3dd3179ad0b6c7c2d1118e935283c3861a0a Jan 20 18:46:03 crc kubenswrapper[4773]: I0120 18:46:03.636363 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-65849867d6-prhbl"] Jan 20 18:46:03 crc kubenswrapper[4773]: I0120 18:46:03.643588 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-c87fff755-s7scg"] Jan 20 18:46:03 crc kubenswrapper[4773]: I0120 18:46:03.655846 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-594c8c9d5d-vjfdq"] Jan 20 18:46:03 crc kubenswrapper[4773]: I0120 18:46:03.745632 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-7f4549b895-p2vwt"] Jan 20 18:46:03 crc kubenswrapper[4773]: I0120 18:46:03.752041 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-sslnl"] Jan 20 18:46:03 crc kubenswrapper[4773]: I0120 18:46:03.761399 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-55db956ddc-6ngwx"] Jan 20 18:46:03 crc kubenswrapper[4773]: W0120 18:46:03.765996 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff53e5c0_255a_43c5_a27c_ce9dc3145999.slice/crio-c3d63801b583e271bb6e382ff82ccc2c42f1d4614dc1487775248cd0d1b2ae3d WatchSource:0}: Error finding container c3d63801b583e271bb6e382ff82ccc2c42f1d4614dc1487775248cd0d1b2ae3d: Status 404 returned error can't find the container with id c3d63801b583e271bb6e382ff82ccc2c42f1d4614dc1487775248cd0d1b2ae3d Jan 20 18:46:03 crc kubenswrapper[4773]: I0120 18:46:03.773779 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-686df47fcb-26j8t"] Jan 20 18:46:03 crc kubenswrapper[4773]: W0120 18:46:03.780100 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1b3e0e3_f4c7_4b3d_9ba0_a198be108cb3.slice/crio-104afbe6488d5108feb16c2ccb0bd0ed8907ab8098ef87b0b85fdc9939698415 WatchSource:0}: Error finding container 104afbe6488d5108feb16c2ccb0bd0ed8907ab8098ef87b0b85fdc9939698415: Status 404 returned error can't find the container with id 104afbe6488d5108feb16c2ccb0bd0ed8907ab8098ef87b0b85fdc9939698415 Jan 20 18:46:03 crc kubenswrapper[4773]: W0120 18:46:03.782386 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2601732b_921a_4c55_821b_0fc994c50236.slice/crio-4f458285d9b4320f6b17dacc151f965112c35cce37ec4dd0260e8cf1d2fe3365 WatchSource:0}: Error finding container 4f458285d9b4320f6b17dacc151f965112c35cce37ec4dd0260e8cf1d2fe3365: Status 404 returned error 
can't find the container with id 4f458285d9b4320f6b17dacc151f965112c35cce37ec4dd0260e8cf1d2fe3365 Jan 20 18:46:03 crc kubenswrapper[4773]: W0120 18:46:03.782731 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcfba823f_e85e_42ae_aa8a_7926cc906b92.slice/crio-ebcd025cf8f03f28f01ecd45ad4f939dd38f6c622ca7a159c0905d06c636fb07 WatchSource:0}: Error finding container ebcd025cf8f03f28f01ecd45ad4f939dd38f6c622ca7a159c0905d06c636fb07: Status 404 returned error can't find the container with id ebcd025cf8f03f28f01ecd45ad4f939dd38f6c622ca7a159c0905d06c636fb07 Jan 20 18:46:03 crc kubenswrapper[4773]: E0120 18:46:03.785082 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:146961cac3291daf96c1ca2bc7bd52bc94d1f4787a0770e23205c2c9beb0d737,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-xxc76,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-686df47fcb-26j8t_openstack-operators(2601732b-921a-4c55-821b-0fc994c50236): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 20 18:46:03 crc kubenswrapper[4773]: E0120 18:46:03.785176 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:38.102.83.246:5001/openstack-k8s-operators/test-operator:d13a5aac38c8137de82b9d4aecf30e64d0d93ea1,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-lwhn7,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-7f4549b895-p2vwt_openstack-operators(cfba823f-e85e-42ae-aa8a-7926cc906b92): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 20 18:46:03 crc kubenswrapper[4773]: E0120 18:46:03.787023 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-686df47fcb-26j8t" podUID="2601732b-921a-4c55-821b-0fc994c50236" Jan 20 18:46:03 crc kubenswrapper[4773]: E0120 18:46:03.787025 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-7f4549b895-p2vwt" podUID="cfba823f-e85e-42ae-aa8a-7926cc906b92" Jan 20 18:46:04 crc kubenswrapper[4773]: I0120 18:46:04.015770 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-2thqw"] Jan 20 18:46:04 crc kubenswrapper[4773]: I0120 18:46:04.023607 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/manila-operator-controller-manager-864f6b75bf-mqjmm"] Jan 20 18:46:04 crc kubenswrapper[4773]: I0120 18:46:04.024006 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-c6994669c-4kk2r" event={"ID":"4604c39e-62d8-4420-b2bc-54d44f4ebcd0","Type":"ContainerStarted","Data":"41b43fade328e74bc89e73feac7662785f6bfd1c4dba192806242b2b446e4d11"} Jan 20 18:46:04 crc kubenswrapper[4773]: I0120 18:46:04.028480 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-85dd56d4cc-t8tmg"] Jan 20 18:46:04 crc kubenswrapper[4773]: I0120 18:46:04.033044 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-r5qgh"] Jan 20 18:46:04 crc kubenswrapper[4773]: I0120 18:46:04.036278 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-65849867d6-prhbl" event={"ID":"fb5406b5-d194-441a-a098-7ecdc7831ec1","Type":"ContainerStarted","Data":"bcd7f44e8872eac34a1e41b5926c3dd3179ad0b6c7c2d1118e935283c3861a0a"} Jan 20 18:46:04 crc kubenswrapper[4773]: I0120 18:46:04.038145 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-sslnl" event={"ID":"ff53e5c0-255a-43c5-a27c-ce9dc3145999","Type":"ContainerStarted","Data":"c3d63801b583e271bb6e382ff82ccc2c42f1d4614dc1487775248cd0d1b2ae3d"} Jan 20 18:46:04 crc kubenswrapper[4773]: I0120 18:46:04.039383 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-9f958b845-v4q7f" event={"ID":"ac02d392-7ff9-42e1-ad6f-47ab9f04a9a7","Type":"ContainerStarted","Data":"46702d5d2c2107aed9c9b1805c64748f7ed225d753bce50eb0376ab54557e413"} Jan 20 18:46:04 crc kubenswrapper[4773]: I0120 18:46:04.040950 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/watcher-operator-controller-manager-64cd966744-nhqxg"] Jan 20 18:46:04 crc kubenswrapper[4773]: I0120 18:46:04.041135 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-6ngwx" event={"ID":"a1b3e0e3-f4c7-4b3d-9ba0-a198be108cb3","Type":"ContainerStarted","Data":"104afbe6488d5108feb16c2ccb0bd0ed8907ab8098ef87b0b85fdc9939698415"} Jan 20 18:46:04 crc kubenswrapper[4773]: I0120 18:46:04.046057 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-78757b4889-2nhdr" event={"ID":"951d4f5c-5d89-41c6-be8a-9828b05ce182","Type":"ContainerStarted","Data":"278725e5154191a674c911b99a316ab6ebdb507f6308f26d7ea0c5b2ad857d67"} Jan 20 18:46:04 crc kubenswrapper[4773]: I0120 18:46:04.047224 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-686df47fcb-26j8t" event={"ID":"2601732b-921a-4c55-821b-0fc994c50236","Type":"ContainerStarted","Data":"4f458285d9b4320f6b17dacc151f965112c35cce37ec4dd0260e8cf1d2fe3365"} Jan 20 18:46:04 crc kubenswrapper[4773]: E0120 18:46:04.048800 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:146961cac3291daf96c1ca2bc7bd52bc94d1f4787a0770e23205c2c9beb0d737\\\"\"" pod="openstack-operators/placement-operator-controller-manager-686df47fcb-26j8t" podUID="2601732b-921a-4c55-821b-0fc994c50236" Jan 20 18:46:04 crc kubenswrapper[4773]: I0120 18:46:04.049531 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7ddb5c749-hhxlp" event={"ID":"df2d6d5b-b964-4672-903f-563b7792ee43","Type":"ContainerStarted","Data":"95982d054ce701f7f8592be8a6e0c47177f54e1fc0fab835aa6019e69a82e725"} Jan 20 18:46:04 crc kubenswrapper[4773]: I0120 
18:46:04.050957 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-767fdc4f47-8tsjs" event={"ID":"b773ecb8-3505-44ad-a28f-bd4054263888","Type":"ContainerStarted","Data":"dad345a1957fa5908f97e40e2038b2fbcc9e75759c47d727d7b071647d056eee"} Jan 20 18:46:04 crc kubenswrapper[4773]: I0120 18:46:04.060639 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-7f4549b895-p2vwt" event={"ID":"cfba823f-e85e-42ae-aa8a-7926cc906b92","Type":"ContainerStarted","Data":"ebcd025cf8f03f28f01ecd45ad4f939dd38f6c622ca7a159c0905d06c636fb07"} Jan 20 18:46:04 crc kubenswrapper[4773]: I0120 18:46:04.062964 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-blxqv" event={"ID":"d1051db2-8914-422b-a126-5cd8ee078767","Type":"ContainerStarted","Data":"3509bb73a7c494730ebd45c76ce96c5cb00819d68e87a248d553b4ffa82d261d"} Jan 20 18:46:04 crc kubenswrapper[4773]: E0120 18:46:04.063231 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.246:5001/openstack-k8s-operators/test-operator:d13a5aac38c8137de82b9d4aecf30e64d0d93ea1\\\"\"" pod="openstack-operators/test-operator-controller-manager-7f4549b895-p2vwt" podUID="cfba823f-e85e-42ae-aa8a-7926cc906b92" Jan 20 18:46:04 crc kubenswrapper[4773]: W0120 18:46:04.065956 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e235ee6_33ad_40e3_9b7a_914820315627.slice/crio-3082f2e0d2c209abb673fe835dc31b65e5a0e80ce5f2eeec3767b3b3b4b096e9 WatchSource:0}: Error finding container 3082f2e0d2c209abb673fe835dc31b65e5a0e80ce5f2eeec3767b3b3b4b096e9: Status 404 returned error can't find the container with id 3082f2e0d2c209abb673fe835dc31b65e5a0e80ce5f2eeec3767b3b3b4b096e9 Jan 20 
18:46:04 crc kubenswrapper[4773]: E0120 18:46:04.075001 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:d687150a46d97eb382dcd8305a2a611943af74771debe1fa9cc13a21e51c69ad,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-lcl7c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-64cd966744-nhqxg_openstack-operators(7f740208-043d-4d7f-b533-5526833d10c2): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 20 18:46:04 crc kubenswrapper[4773]: E0120 18:46:04.076222 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-64cd966744-nhqxg" podUID="7f740208-043d-4d7f-b533-5526833d10c2" Jan 20 18:46:04 crc kubenswrapper[4773]: W0120 18:46:04.076536 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99558a40_3dbc_4c2b_9aab_a085c7ef5c7c.slice/crio-6649e1c3ca064e1f96e4ae85f7c9f5405fee594d3fd1510aeee3a55f8f6ccfc1 WatchSource:0}: Error finding container 6649e1c3ca064e1f96e4ae85f7c9f5405fee594d3fd1510aeee3a55f8f6ccfc1: Status 404 returned error can't find the container with id 6649e1c3ca064e1f96e4ae85f7c9f5405fee594d3fd1510aeee3a55f8f6ccfc1 Jan 20 18:46:04 crc kubenswrapper[4773]: I0120 18:46:04.078236 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-vjfdq" event={"ID":"a570d5a5-53f4-444f-a14d-92ea24f27e2e","Type":"ContainerStarted","Data":"7f64273351d7647a4d68a3135f9b84cd07b269a4bf2a98ff08b8aceabdb27798"} Jan 20 18:46:04 crc kubenswrapper[4773]: E0120 18:46:04.083326 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-26sx8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-r5qgh_openstack-operators(99558a40-3dbc-4c2b-9aab-a085c7ef5c7c): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 20 18:46:04 crc kubenswrapper[4773]: E0120 18:46:04.083502 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:9404536bf7cb7c3818e1a0f92b53e4d7c02fe7942324f32894106f02f8fc7e92,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-5bxh5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-85dd56d4cc-t8tmg_openstack-operators(9e235ee6-33ad-40e3-9b7a-914820315627): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Jan 20 18:46:04 crc kubenswrapper[4773]: E0120 18:46:04.092634 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-t8tmg" podUID="9e235ee6-33ad-40e3-9b7a-914820315627" Jan 20 18:46:04 crc 
kubenswrapper[4773]: E0120 18:46:04.092789 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-r5qgh" podUID="99558a40-3dbc-4c2b-9aab-a085c7ef5c7c" Jan 20 18:46:04 crc kubenswrapper[4773]: I0120 18:46:04.096892 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-s7scg" event={"ID":"8f795216-0196-4a5a-bfdf-20dee1543b43","Type":"ContainerStarted","Data":"14c16a8d957ce49a27a61756659da008941d74137df88b9548b27ef4c11d5fc5"} Jan 20 18:46:04 crc kubenswrapper[4773]: I0120 18:46:04.099129 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-cb4666565-hfwzv" event={"ID":"b196e443-f058-49c2-b54b-a18656415f5a","Type":"ContainerStarted","Data":"25121ab30e93fb5f1c8efdf266ec2c9d6784f7d622e1b4727383e3f887f544e6"} Jan 20 18:46:04 crc kubenswrapper[4773]: I0120 18:46:04.102061 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-9b68f5989-xmljc" event={"ID":"48aacb32-c120-4f36-898b-60f5d01c5510","Type":"ContainerStarted","Data":"d29312ff4c8c6d40a8552f80ee25b874460ee3d1a8f31baf3a1233e9c9197fef"} Jan 20 18:46:04 crc kubenswrapper[4773]: I0120 18:46:04.124624 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e9f6d4b3-c2cc-4cc6-b279-362e7439974b-cert\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b854m7knp\" (UID: \"e9f6d4b3-c2cc-4cc6-b279-362e7439974b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854m7knp" Jan 20 18:46:04 crc kubenswrapper[4773]: E0120 18:46:04.124741 4773 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret 
"openstack-baremetal-operator-webhook-server-cert" not found Jan 20 18:46:04 crc kubenswrapper[4773]: E0120 18:46:04.124804 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e9f6d4b3-c2cc-4cc6-b279-362e7439974b-cert podName:e9f6d4b3-c2cc-4cc6-b279-362e7439974b nodeName:}" failed. No retries permitted until 2026-01-20 18:46:06.124784272 +0000 UTC m=+959.046597296 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e9f6d4b3-c2cc-4cc6-b279-362e7439974b-cert") pod "openstack-baremetal-operator-controller-manager-6b68b8b854m7knp" (UID: "e9f6d4b3-c2cc-4cc6-b279-362e7439974b") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 20 18:46:04 crc kubenswrapper[4773]: I0120 18:46:04.630878 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/86d68359-5910-4d1d-8a01-2964f8d26464-webhook-certs\") pod \"openstack-operator-controller-manager-674cd49df-nnf4r\" (UID: \"86d68359-5910-4d1d-8a01-2964f8d26464\") " pod="openstack-operators/openstack-operator-controller-manager-674cd49df-nnf4r" Jan 20 18:46:04 crc kubenswrapper[4773]: I0120 18:46:04.631124 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/86d68359-5910-4d1d-8a01-2964f8d26464-metrics-certs\") pod \"openstack-operator-controller-manager-674cd49df-nnf4r\" (UID: \"86d68359-5910-4d1d-8a01-2964f8d26464\") " pod="openstack-operators/openstack-operator-controller-manager-674cd49df-nnf4r" Jan 20 18:46:04 crc kubenswrapper[4773]: E0120 18:46:04.631282 4773 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 20 18:46:04 crc kubenswrapper[4773]: E0120 18:46:04.631347 4773 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/86d68359-5910-4d1d-8a01-2964f8d26464-metrics-certs podName:86d68359-5910-4d1d-8a01-2964f8d26464 nodeName:}" failed. No retries permitted until 2026-01-20 18:46:06.63132926 +0000 UTC m=+959.553142284 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/86d68359-5910-4d1d-8a01-2964f8d26464-metrics-certs") pod "openstack-operator-controller-manager-674cd49df-nnf4r" (UID: "86d68359-5910-4d1d-8a01-2964f8d26464") : secret "metrics-server-cert" not found Jan 20 18:46:04 crc kubenswrapper[4773]: E0120 18:46:04.631352 4773 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 20 18:46:04 crc kubenswrapper[4773]: E0120 18:46:04.631427 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86d68359-5910-4d1d-8a01-2964f8d26464-webhook-certs podName:86d68359-5910-4d1d-8a01-2964f8d26464 nodeName:}" failed. No retries permitted until 2026-01-20 18:46:06.631409932 +0000 UTC m=+959.553222956 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/86d68359-5910-4d1d-8a01-2964f8d26464-webhook-certs") pod "openstack-operator-controller-manager-674cd49df-nnf4r" (UID: "86d68359-5910-4d1d-8a01-2964f8d26464") : secret "webhook-server-cert" not found Jan 20 18:46:05 crc kubenswrapper[4773]: I0120 18:46:05.115692 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-t8tmg" event={"ID":"9e235ee6-33ad-40e3-9b7a-914820315627","Type":"ContainerStarted","Data":"3082f2e0d2c209abb673fe835dc31b65e5a0e80ce5f2eeec3767b3b3b4b096e9"} Jan 20 18:46:05 crc kubenswrapper[4773]: E0120 18:46:05.120772 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:9404536bf7cb7c3818e1a0f92b53e4d7c02fe7942324f32894106f02f8fc7e92\\\"\"" pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-t8tmg" podUID="9e235ee6-33ad-40e3-9b7a-914820315627" Jan 20 18:46:05 crc kubenswrapper[4773]: I0120 18:46:05.127103 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-2thqw" event={"ID":"7ed73202-faba-46ba-ae91-8cd9ffbe70a4","Type":"ContainerStarted","Data":"55ab43b01eb56ce4f10f20a1d0e5fd00bdbc0616311b8ee1b864cdbd5b1a70ce"} Jan 20 18:46:05 crc kubenswrapper[4773]: I0120 18:46:05.128389 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-864f6b75bf-mqjmm" event={"ID":"ed6d3389-b374-42a6-8101-1d34df737170","Type":"ContainerStarted","Data":"babb387495ae3fc73c7f583c1ad3f91c9a44dbbe25711303c9c4b55d12e8c204"} Jan 20 18:46:05 crc kubenswrapper[4773]: I0120 18:46:05.130679 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-r5qgh" event={"ID":"99558a40-3dbc-4c2b-9aab-a085c7ef5c7c","Type":"ContainerStarted","Data":"6649e1c3ca064e1f96e4ae85f7c9f5405fee594d3fd1510aeee3a55f8f6ccfc1"} Jan 20 18:46:05 crc kubenswrapper[4773]: E0120 18:46:05.134460 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-r5qgh" podUID="99558a40-3dbc-4c2b-9aab-a085c7ef5c7c" Jan 20 18:46:05 crc kubenswrapper[4773]: I0120 18:46:05.139800 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-64cd966744-nhqxg" event={"ID":"7f740208-043d-4d7f-b533-5526833d10c2","Type":"ContainerStarted","Data":"ff84ca1c2d3d9a4637ef8c35d6498288cfbb0db6a6e92e7d20a80a7ac3039598"} Jan 20 18:46:05 crc kubenswrapper[4773]: E0120 18:46:05.141163 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:146961cac3291daf96c1ca2bc7bd52bc94d1f4787a0770e23205c2c9beb0d737\\\"\"" pod="openstack-operators/placement-operator-controller-manager-686df47fcb-26j8t" podUID="2601732b-921a-4c55-821b-0fc994c50236" Jan 20 18:46:05 crc kubenswrapper[4773]: E0120 18:46:05.142211 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d687150a46d97eb382dcd8305a2a611943af74771debe1fa9cc13a21e51c69ad\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-64cd966744-nhqxg" 
podUID="7f740208-043d-4d7f-b533-5526833d10c2" Jan 20 18:46:05 crc kubenswrapper[4773]: E0120 18:46:05.155945 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"38.102.83.246:5001/openstack-k8s-operators/test-operator:d13a5aac38c8137de82b9d4aecf30e64d0d93ea1\\\"\"" pod="openstack-operators/test-operator-controller-manager-7f4549b895-p2vwt" podUID="cfba823f-e85e-42ae-aa8a-7926cc906b92" Jan 20 18:46:05 crc kubenswrapper[4773]: I0120 18:46:05.647946 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/437cadd4-5809-4b9e-afa2-05832cd6c303-cert\") pod \"infra-operator-controller-manager-77c48c7859-hqsb9\" (UID: \"437cadd4-5809-4b9e-afa2-05832cd6c303\") " pod="openstack-operators/infra-operator-controller-manager-77c48c7859-hqsb9" Jan 20 18:46:05 crc kubenswrapper[4773]: E0120 18:46:05.649112 4773 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 20 18:46:05 crc kubenswrapper[4773]: E0120 18:46:05.649168 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/437cadd4-5809-4b9e-afa2-05832cd6c303-cert podName:437cadd4-5809-4b9e-afa2-05832cd6c303 nodeName:}" failed. No retries permitted until 2026-01-20 18:46:09.649151988 +0000 UTC m=+962.570965022 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/437cadd4-5809-4b9e-afa2-05832cd6c303-cert") pod "infra-operator-controller-manager-77c48c7859-hqsb9" (UID: "437cadd4-5809-4b9e-afa2-05832cd6c303") : secret "infra-operator-webhook-server-cert" not found Jan 20 18:46:06 crc kubenswrapper[4773]: E0120 18:46:06.149823 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:9404536bf7cb7c3818e1a0f92b53e4d7c02fe7942324f32894106f02f8fc7e92\\\"\"" pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-t8tmg" podUID="9e235ee6-33ad-40e3-9b7a-914820315627" Jan 20 18:46:06 crc kubenswrapper[4773]: E0120 18:46:06.151367 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-r5qgh" podUID="99558a40-3dbc-4c2b-9aab-a085c7ef5c7c" Jan 20 18:46:06 crc kubenswrapper[4773]: E0120 18:46:06.151534 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d687150a46d97eb382dcd8305a2a611943af74771debe1fa9cc13a21e51c69ad\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-64cd966744-nhqxg" podUID="7f740208-043d-4d7f-b533-5526833d10c2" Jan 20 18:46:06 crc kubenswrapper[4773]: I0120 18:46:06.155604 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e9f6d4b3-c2cc-4cc6-b279-362e7439974b-cert\") pod 
\"openstack-baremetal-operator-controller-manager-6b68b8b854m7knp\" (UID: \"e9f6d4b3-c2cc-4cc6-b279-362e7439974b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854m7knp" Jan 20 18:46:06 crc kubenswrapper[4773]: E0120 18:46:06.155718 4773 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 20 18:46:06 crc kubenswrapper[4773]: E0120 18:46:06.156134 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e9f6d4b3-c2cc-4cc6-b279-362e7439974b-cert podName:e9f6d4b3-c2cc-4cc6-b279-362e7439974b nodeName:}" failed. No retries permitted until 2026-01-20 18:46:10.156118345 +0000 UTC m=+963.077931389 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e9f6d4b3-c2cc-4cc6-b279-362e7439974b-cert") pod "openstack-baremetal-operator-controller-manager-6b68b8b854m7knp" (UID: "e9f6d4b3-c2cc-4cc6-b279-362e7439974b") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 20 18:46:06 crc kubenswrapper[4773]: I0120 18:46:06.673237 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/86d68359-5910-4d1d-8a01-2964f8d26464-metrics-certs\") pod \"openstack-operator-controller-manager-674cd49df-nnf4r\" (UID: \"86d68359-5910-4d1d-8a01-2964f8d26464\") " pod="openstack-operators/openstack-operator-controller-manager-674cd49df-nnf4r" Jan 20 18:46:06 crc kubenswrapper[4773]: I0120 18:46:06.673616 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/86d68359-5910-4d1d-8a01-2964f8d26464-webhook-certs\") pod \"openstack-operator-controller-manager-674cd49df-nnf4r\" (UID: \"86d68359-5910-4d1d-8a01-2964f8d26464\") " 
pod="openstack-operators/openstack-operator-controller-manager-674cd49df-nnf4r" Jan 20 18:46:06 crc kubenswrapper[4773]: E0120 18:46:06.674189 4773 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 20 18:46:06 crc kubenswrapper[4773]: E0120 18:46:06.674331 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86d68359-5910-4d1d-8a01-2964f8d26464-metrics-certs podName:86d68359-5910-4d1d-8a01-2964f8d26464 nodeName:}" failed. No retries permitted until 2026-01-20 18:46:10.674313109 +0000 UTC m=+963.596126133 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/86d68359-5910-4d1d-8a01-2964f8d26464-metrics-certs") pod "openstack-operator-controller-manager-674cd49df-nnf4r" (UID: "86d68359-5910-4d1d-8a01-2964f8d26464") : secret "metrics-server-cert" not found Jan 20 18:46:06 crc kubenswrapper[4773]: E0120 18:46:06.674427 4773 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 20 18:46:06 crc kubenswrapper[4773]: E0120 18:46:06.674496 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86d68359-5910-4d1d-8a01-2964f8d26464-webhook-certs podName:86d68359-5910-4d1d-8a01-2964f8d26464 nodeName:}" failed. No retries permitted until 2026-01-20 18:46:10.674481583 +0000 UTC m=+963.596294607 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/86d68359-5910-4d1d-8a01-2964f8d26464-webhook-certs") pod "openstack-operator-controller-manager-674cd49df-nnf4r" (UID: "86d68359-5910-4d1d-8a01-2964f8d26464") : secret "webhook-server-cert" not found Jan 20 18:46:09 crc kubenswrapper[4773]: I0120 18:46:09.721440 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/437cadd4-5809-4b9e-afa2-05832cd6c303-cert\") pod \"infra-operator-controller-manager-77c48c7859-hqsb9\" (UID: \"437cadd4-5809-4b9e-afa2-05832cd6c303\") " pod="openstack-operators/infra-operator-controller-manager-77c48c7859-hqsb9" Jan 20 18:46:09 crc kubenswrapper[4773]: E0120 18:46:09.721637 4773 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Jan 20 18:46:09 crc kubenswrapper[4773]: E0120 18:46:09.722123 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/437cadd4-5809-4b9e-afa2-05832cd6c303-cert podName:437cadd4-5809-4b9e-afa2-05832cd6c303 nodeName:}" failed. No retries permitted until 2026-01-20 18:46:17.722100409 +0000 UTC m=+970.643913433 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/437cadd4-5809-4b9e-afa2-05832cd6c303-cert") pod "infra-operator-controller-manager-77c48c7859-hqsb9" (UID: "437cadd4-5809-4b9e-afa2-05832cd6c303") : secret "infra-operator-webhook-server-cert" not found Jan 20 18:46:10 crc kubenswrapper[4773]: I0120 18:46:10.228364 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e9f6d4b3-c2cc-4cc6-b279-362e7439974b-cert\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b854m7knp\" (UID: \"e9f6d4b3-c2cc-4cc6-b279-362e7439974b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854m7knp" Jan 20 18:46:10 crc kubenswrapper[4773]: E0120 18:46:10.228516 4773 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 20 18:46:10 crc kubenswrapper[4773]: E0120 18:46:10.228564 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e9f6d4b3-c2cc-4cc6-b279-362e7439974b-cert podName:e9f6d4b3-c2cc-4cc6-b279-362e7439974b nodeName:}" failed. No retries permitted until 2026-01-20 18:46:18.228549604 +0000 UTC m=+971.150362628 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e9f6d4b3-c2cc-4cc6-b279-362e7439974b-cert") pod "openstack-baremetal-operator-controller-manager-6b68b8b854m7knp" (UID: "e9f6d4b3-c2cc-4cc6-b279-362e7439974b") : secret "openstack-baremetal-operator-webhook-server-cert" not found Jan 20 18:46:10 crc kubenswrapper[4773]: I0120 18:46:10.736214 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/86d68359-5910-4d1d-8a01-2964f8d26464-webhook-certs\") pod \"openstack-operator-controller-manager-674cd49df-nnf4r\" (UID: \"86d68359-5910-4d1d-8a01-2964f8d26464\") " pod="openstack-operators/openstack-operator-controller-manager-674cd49df-nnf4r" Jan 20 18:46:10 crc kubenswrapper[4773]: I0120 18:46:10.736349 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/86d68359-5910-4d1d-8a01-2964f8d26464-metrics-certs\") pod \"openstack-operator-controller-manager-674cd49df-nnf4r\" (UID: \"86d68359-5910-4d1d-8a01-2964f8d26464\") " pod="openstack-operators/openstack-operator-controller-manager-674cd49df-nnf4r" Jan 20 18:46:10 crc kubenswrapper[4773]: E0120 18:46:10.736387 4773 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Jan 20 18:46:10 crc kubenswrapper[4773]: E0120 18:46:10.736464 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86d68359-5910-4d1d-8a01-2964f8d26464-webhook-certs podName:86d68359-5910-4d1d-8a01-2964f8d26464 nodeName:}" failed. No retries permitted until 2026-01-20 18:46:18.736443793 +0000 UTC m=+971.658256867 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/86d68359-5910-4d1d-8a01-2964f8d26464-webhook-certs") pod "openstack-operator-controller-manager-674cd49df-nnf4r" (UID: "86d68359-5910-4d1d-8a01-2964f8d26464") : secret "webhook-server-cert" not found Jan 20 18:46:10 crc kubenswrapper[4773]: E0120 18:46:10.736477 4773 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Jan 20 18:46:10 crc kubenswrapper[4773]: E0120 18:46:10.736521 4773 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86d68359-5910-4d1d-8a01-2964f8d26464-metrics-certs podName:86d68359-5910-4d1d-8a01-2964f8d26464 nodeName:}" failed. No retries permitted until 2026-01-20 18:46:18.736508195 +0000 UTC m=+971.658321219 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/86d68359-5910-4d1d-8a01-2964f8d26464-metrics-certs") pod "openstack-operator-controller-manager-674cd49df-nnf4r" (UID: "86d68359-5910-4d1d-8a01-2964f8d26464") : secret "metrics-server-cert" not found Jan 20 18:46:17 crc kubenswrapper[4773]: I0120 18:46:17.738859 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/437cadd4-5809-4b9e-afa2-05832cd6c303-cert\") pod \"infra-operator-controller-manager-77c48c7859-hqsb9\" (UID: \"437cadd4-5809-4b9e-afa2-05832cd6c303\") " pod="openstack-operators/infra-operator-controller-manager-77c48c7859-hqsb9" Jan 20 18:46:17 crc kubenswrapper[4773]: I0120 18:46:17.746112 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/437cadd4-5809-4b9e-afa2-05832cd6c303-cert\") pod \"infra-operator-controller-manager-77c48c7859-hqsb9\" (UID: \"437cadd4-5809-4b9e-afa2-05832cd6c303\") " pod="openstack-operators/infra-operator-controller-manager-77c48c7859-hqsb9" Jan 20 18:46:17 crc 
kubenswrapper[4773]: I0120 18:46:17.875415 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-hr4gt" Jan 20 18:46:17 crc kubenswrapper[4773]: I0120 18:46:17.883991 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-77c48c7859-hqsb9" Jan 20 18:46:18 crc kubenswrapper[4773]: I0120 18:46:18.246623 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e9f6d4b3-c2cc-4cc6-b279-362e7439974b-cert\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b854m7knp\" (UID: \"e9f6d4b3-c2cc-4cc6-b279-362e7439974b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854m7knp" Jan 20 18:46:18 crc kubenswrapper[4773]: I0120 18:46:18.253143 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e9f6d4b3-c2cc-4cc6-b279-362e7439974b-cert\") pod \"openstack-baremetal-operator-controller-manager-6b68b8b854m7knp\" (UID: \"e9f6d4b3-c2cc-4cc6-b279-362e7439974b\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854m7knp" Jan 20 18:46:18 crc kubenswrapper[4773]: I0120 18:46:18.288718 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-t49mz" Jan 20 18:46:18 crc kubenswrapper[4773]: I0120 18:46:18.297599 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854m7knp" Jan 20 18:46:18 crc kubenswrapper[4773]: I0120 18:46:18.753573 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/86d68359-5910-4d1d-8a01-2964f8d26464-metrics-certs\") pod \"openstack-operator-controller-manager-674cd49df-nnf4r\" (UID: \"86d68359-5910-4d1d-8a01-2964f8d26464\") " pod="openstack-operators/openstack-operator-controller-manager-674cd49df-nnf4r" Jan 20 18:46:18 crc kubenswrapper[4773]: I0120 18:46:18.754649 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/86d68359-5910-4d1d-8a01-2964f8d26464-webhook-certs\") pod \"openstack-operator-controller-manager-674cd49df-nnf4r\" (UID: \"86d68359-5910-4d1d-8a01-2964f8d26464\") " pod="openstack-operators/openstack-operator-controller-manager-674cd49df-nnf4r" Jan 20 18:46:18 crc kubenswrapper[4773]: I0120 18:46:18.760093 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/86d68359-5910-4d1d-8a01-2964f8d26464-webhook-certs\") pod \"openstack-operator-controller-manager-674cd49df-nnf4r\" (UID: \"86d68359-5910-4d1d-8a01-2964f8d26464\") " pod="openstack-operators/openstack-operator-controller-manager-674cd49df-nnf4r" Jan 20 18:46:18 crc kubenswrapper[4773]: I0120 18:46:18.760662 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/86d68359-5910-4d1d-8a01-2964f8d26464-metrics-certs\") pod \"openstack-operator-controller-manager-674cd49df-nnf4r\" (UID: \"86d68359-5910-4d1d-8a01-2964f8d26464\") " pod="openstack-operators/openstack-operator-controller-manager-674cd49df-nnf4r" Jan 20 18:46:19 crc kubenswrapper[4773]: I0120 18:46:19.053966 4773 reflector.go:368] Caches populated for *v1.Secret from 
object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-lx94s" Jan 20 18:46:19 crc kubenswrapper[4773]: I0120 18:46:19.062986 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-674cd49df-nnf4r" Jan 20 18:46:21 crc kubenswrapper[4773]: E0120 18:46:21.812423 4773 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/telemetry-operator@sha256:2e89109f5db66abf1afd15ef59bda35a53db40c5e59e020579ac5aa0acea1843" Jan 20 18:46:21 crc kubenswrapper[4773]: E0120 18:46:21.812912 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:2e89109f5db66abf1afd15ef59bda35a53db40c5e59e020579ac5aa0acea1843,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ccx9m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-5f8f495fcf-2thqw_openstack-operators(7ed73202-faba-46ba-ae91-8cd9ffbe70a4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 20 18:46:21 crc kubenswrapper[4773]: E0120 18:46:21.814127 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-2thqw" podUID="7ed73202-faba-46ba-ae91-8cd9ffbe70a4" Jan 20 18:46:22 crc kubenswrapper[4773]: E0120 18:46:22.267198 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:2e89109f5db66abf1afd15ef59bda35a53db40c5e59e020579ac5aa0acea1843\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-2thqw" podUID="7ed73202-faba-46ba-ae91-8cd9ffbe70a4" Jan 20 18:46:22 crc kubenswrapper[4773]: E0120 18:46:22.310798 4773 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/mariadb-operator@sha256:ff0b6c27e2d96afccd73fbbb5b5297a3f60c7f4f1dfd2a877152466697018d71" Jan 20 18:46:22 crc kubenswrapper[4773]: E0120 18:46:22.311047 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:ff0b6c27e2d96afccd73fbbb5b5297a3f60c7f4f1dfd2a877152466697018d71,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-cm7vp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-c87fff755-s7scg_openstack-operators(8f795216-0196-4a5a-bfdf-20dee1543b43): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 20 18:46:22 crc kubenswrapper[4773]: E0120 18:46:22.312688 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-s7scg" podUID="8f795216-0196-4a5a-bfdf-20dee1543b43" Jan 20 18:46:22 crc kubenswrapper[4773]: E0120 18:46:22.997900 4773 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/openstack-k8s-operators/cinder-operator@sha256:ddb59f1a8e3fd0d641405e371e33b3d8c913af08e40e84f390e7e06f0a7f3488" Jan 20 18:46:22 crc kubenswrapper[4773]: E0120 18:46:22.998094 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/cinder-operator@sha256:ddb59f1a8e3fd0d641405e371e33b3d8c913af08e40e84f390e7e06f0a7f3488,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mf6fs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-operator-controller-manager-9b68f5989-xmljc_openstack-operators(48aacb32-c120-4f36-898b-60f5d01c5510): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 20 18:46:22 crc kubenswrapper[4773]: E0120 18:46:22.999239 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/cinder-operator-controller-manager-9b68f5989-xmljc" podUID="48aacb32-c120-4f36-898b-60f5d01c5510" Jan 20 18:46:23 crc kubenswrapper[4773]: E0120 18:46:23.273867 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:ff0b6c27e2d96afccd73fbbb5b5297a3f60c7f4f1dfd2a877152466697018d71\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-s7scg" podUID="8f795216-0196-4a5a-bfdf-20dee1543b43" Jan 20 18:46:23 crc kubenswrapper[4773]: E0120 18:46:23.273922 4773 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/cinder-operator@sha256:ddb59f1a8e3fd0d641405e371e33b3d8c913af08e40e84f390e7e06f0a7f3488\\\"\"" pod="openstack-operators/cinder-operator-controller-manager-9b68f5989-xmljc" podUID="48aacb32-c120-4f36-898b-60f5d01c5510" Jan 20 18:46:23 crc kubenswrapper[4773]: E0120 18:46:23.616671 4773 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ironic-operator@sha256:56c5f8b78445b3dbfc0d5afd9312906f6bef4dccf67302b0e4e5ca20bd263525" Jan 20 18:46:23 crc kubenswrapper[4773]: E0120 18:46:23.616869 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ironic-operator@sha256:56c5f8b78445b3dbfc0d5afd9312906f6bef4dccf67302b0e4e5ca20bd263525,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-k7cd8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ironic-operator-controller-manager-78757b4889-2nhdr_openstack-operators(951d4f5c-5d89-41c6-be8a-9828b05ce182): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 20 18:46:23 crc kubenswrapper[4773]: E0120 18:46:23.618265 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ironic-operator-controller-manager-78757b4889-2nhdr" podUID="951d4f5c-5d89-41c6-be8a-9828b05ce182" Jan 20 18:46:24 crc kubenswrapper[4773]: E0120 18:46:24.277922 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/ironic-operator@sha256:56c5f8b78445b3dbfc0d5afd9312906f6bef4dccf67302b0e4e5ca20bd263525\\\"\"" pod="openstack-operators/ironic-operator-controller-manager-78757b4889-2nhdr" podUID="951d4f5c-5d89-41c6-be8a-9828b05ce182" Jan 20 18:46:24 crc kubenswrapper[4773]: E0120 18:46:24.289213 4773 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/barbican-operator@sha256:f0634d8cf7c2c2919ca248a6883ce43d6ae4ac59252c987a5cfe17643fe7d38a" Jan 20 18:46:24 crc kubenswrapper[4773]: E0120 18:46:24.289369 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/barbican-operator@sha256:f0634d8cf7c2c2919ca248a6883ce43d6ae4ac59252c987a5cfe17643fe7d38a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bqbnd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-operator-controller-manager-7ddb5c749-hhxlp_openstack-operators(df2d6d5b-b964-4672-903f-563b7792ee43): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 20 18:46:24 crc kubenswrapper[4773]: E0120 18:46:24.290522 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/barbican-operator-controller-manager-7ddb5c749-hhxlp" podUID="df2d6d5b-b964-4672-903f-563b7792ee43" Jan 20 18:46:24 crc kubenswrapper[4773]: E0120 18:46:24.882165 4773 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" 
image="quay.io/openstack-k8s-operators/keystone-operator@sha256:393d7567eef4fd05af625389f5a7384c6bb75108b21b06183f1f5e33aac5417e" Jan 20 18:46:24 crc kubenswrapper[4773]: E0120 18:46:24.882318 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:393d7567eef4fd05af625389f5a7384c6bb75108b21b06183f1f5e33aac5417e,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-pbzzl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-767fdc4f47-8tsjs_openstack-operators(b773ecb8-3505-44ad-a28f-bd4054263888): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 20 18:46:24 crc kubenswrapper[4773]: E0120 18:46:24.884056 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-767fdc4f47-8tsjs" podUID="b773ecb8-3505-44ad-a28f-bd4054263888" Jan 20 18:46:25 crc kubenswrapper[4773]: E0120 18:46:25.283344 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/barbican-operator@sha256:f0634d8cf7c2c2919ca248a6883ce43d6ae4ac59252c987a5cfe17643fe7d38a\\\"\"" pod="openstack-operators/barbican-operator-controller-manager-7ddb5c749-hhxlp" podUID="df2d6d5b-b964-4672-903f-563b7792ee43" Jan 20 18:46:25 crc kubenswrapper[4773]: E0120 18:46:25.283390 4773 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:393d7567eef4fd05af625389f5a7384c6bb75108b21b06183f1f5e33aac5417e\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-767fdc4f47-8tsjs" podUID="b773ecb8-3505-44ad-a28f-bd4054263888" Jan 20 18:46:25 crc kubenswrapper[4773]: E0120 18:46:25.725496 4773 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:6defa56fc6a5bfbd5b27d28ff7b1c7bc89b24b2ef956e2a6d97b2726f668a231" Jan 20 18:46:25 crc kubenswrapper[4773]: E0120 18:46:25.725735 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:6defa56fc6a5bfbd5b27d28ff7b1c7bc89b24b2ef956e2a6d97b2726f668a231,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zwk42,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-65849867d6-prhbl_openstack-operators(fb5406b5-d194-441a-a098-7ecdc7831ec1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 20 18:46:25 crc kubenswrapper[4773]: E0120 18:46:25.727003 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-65849867d6-prhbl" podUID="fb5406b5-d194-441a-a098-7ecdc7831ec1" Jan 20 18:46:26 crc kubenswrapper[4773]: E0120 18:46:26.289985 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/nova-operator@sha256:6defa56fc6a5bfbd5b27d28ff7b1c7bc89b24b2ef956e2a6d97b2726f668a231\\\"\"" pod="openstack-operators/nova-operator-controller-manager-65849867d6-prhbl" podUID="fb5406b5-d194-441a-a098-7ecdc7831ec1" Jan 20 18:46:28 crc kubenswrapper[4773]: I0120 18:46:28.171041 4773 patch_prober.go:28] interesting pod/machine-config-daemon-sq4x7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 18:46:28 crc kubenswrapper[4773]: I0120 18:46:28.171111 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 18:46:28 crc kubenswrapper[4773]: I0120 18:46:28.171156 4773 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" Jan 20 18:46:28 crc kubenswrapper[4773]: I0120 18:46:28.297946 4773 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"89086664f3aacadd154bb5a0e821ec93e674c41f0d2d3c8a5f423e5e3f0c2f57"} pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 20 18:46:28 crc kubenswrapper[4773]: I0120 18:46:28.298018 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" 
containerID="cri-o://89086664f3aacadd154bb5a0e821ec93e674c41f0d2d3c8a5f423e5e3f0c2f57" gracePeriod=600 Jan 20 18:46:29 crc kubenswrapper[4773]: I0120 18:46:29.310087 4773 generic.go:334] "Generic (PLEG): container finished" podID="1ddd934f-f012-4083-b5e6-b99711071621" containerID="89086664f3aacadd154bb5a0e821ec93e674c41f0d2d3c8a5f423e5e3f0c2f57" exitCode=0 Jan 20 18:46:29 crc kubenswrapper[4773]: I0120 18:46:29.310157 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" event={"ID":"1ddd934f-f012-4083-b5e6-b99711071621","Type":"ContainerDied","Data":"89086664f3aacadd154bb5a0e821ec93e674c41f0d2d3c8a5f423e5e3f0c2f57"} Jan 20 18:46:29 crc kubenswrapper[4773]: I0120 18:46:29.310476 4773 scope.go:117] "RemoveContainer" containerID="714571a77485b95d4127b785d445e091e7d21c1d67336a6816b862641584bfce" Jan 20 18:46:30 crc kubenswrapper[4773]: I0120 18:46:30.079047 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-77c48c7859-hqsb9"] Jan 20 18:46:30 crc kubenswrapper[4773]: I0120 18:46:30.088368 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854m7knp"] Jan 20 18:46:30 crc kubenswrapper[4773]: I0120 18:46:30.190159 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-674cd49df-nnf4r"] Jan 20 18:46:30 crc kubenswrapper[4773]: W0120 18:46:30.224869 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86d68359_5910_4d1d_8a01_2964f8d26464.slice/crio-548b7a8a9c30cda256ec466c3d46a9f337247290ba3b043e891cbcde50b18d1a WatchSource:0}: Error finding container 548b7a8a9c30cda256ec466c3d46a9f337247290ba3b043e891cbcde50b18d1a: Status 404 returned error can't find the container with id 
548b7a8a9c30cda256ec466c3d46a9f337247290ba3b043e891cbcde50b18d1a Jan 20 18:46:30 crc kubenswrapper[4773]: I0120 18:46:30.324484 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854m7knp" event={"ID":"e9f6d4b3-c2cc-4cc6-b279-362e7439974b","Type":"ContainerStarted","Data":"d56e8b5b597fef7841cadf1d13a05a4c67e6fed5e6ed84c9a6df6b41f14f9026"} Jan 20 18:46:30 crc kubenswrapper[4773]: I0120 18:46:30.329922 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-864f6b75bf-mqjmm" event={"ID":"ed6d3389-b374-42a6-8101-1d34df737170","Type":"ContainerStarted","Data":"225ab666fcfe64aff32960e04fa9499c060c2a0be739be8c6d0c67a558ef1133"} Jan 20 18:46:30 crc kubenswrapper[4773]: I0120 18:46:30.330174 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-864f6b75bf-mqjmm" Jan 20 18:46:30 crc kubenswrapper[4773]: I0120 18:46:30.334022 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-9f958b845-v4q7f" event={"ID":"ac02d392-7ff9-42e1-ad6f-47ab9f04a9a7","Type":"ContainerStarted","Data":"f6f41745989aefdfb6be4b9256e04a5decc4695f79328abd6ac55c693ed8a6ae"} Jan 20 18:46:30 crc kubenswrapper[4773]: I0120 18:46:30.334106 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-9f958b845-v4q7f" Jan 20 18:46:30 crc kubenswrapper[4773]: I0120 18:46:30.336415 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-c6994669c-4kk2r" event={"ID":"4604c39e-62d8-4420-b2bc-54d44f4ebcd0","Type":"ContainerStarted","Data":"5be1ce2b89dd09127c727aace2e25dc991ca2928376f4a720ec9b132776ab527"} Jan 20 18:46:30 crc kubenswrapper[4773]: I0120 18:46:30.336524 4773 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-c6994669c-4kk2r" Jan 20 18:46:30 crc kubenswrapper[4773]: I0120 18:46:30.337882 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-77c48c7859-hqsb9" event={"ID":"437cadd4-5809-4b9e-afa2-05832cd6c303","Type":"ContainerStarted","Data":"a2783b95c2c86a5c744bdbb311ab64640765bd836f494b66f358c11cf764eaba"} Jan 20 18:46:30 crc kubenswrapper[4773]: I0120 18:46:30.342756 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-674cd49df-nnf4r" event={"ID":"86d68359-5910-4d1d-8a01-2964f8d26464","Type":"ContainerStarted","Data":"548b7a8a9c30cda256ec466c3d46a9f337247290ba3b043e891cbcde50b18d1a"} Jan 20 18:46:30 crc kubenswrapper[4773]: I0120 18:46:30.346429 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-blxqv" event={"ID":"d1051db2-8914-422b-a126-5cd8ee078767","Type":"ContainerStarted","Data":"5fecf636083a07b12876b80526124ebde6af250ad5c582b13271ec117ae015f2"} Jan 20 18:46:30 crc kubenswrapper[4773]: I0120 18:46:30.346575 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-blxqv" Jan 20 18:46:30 crc kubenswrapper[4773]: I0120 18:46:30.347893 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-cb4666565-hfwzv" event={"ID":"b196e443-f058-49c2-b54b-a18656415f5a","Type":"ContainerStarted","Data":"0b620f20649ee67c6a6e1be9ee57511052c83baef2aab5eed463c75243ed6480"} Jan 20 18:46:30 crc kubenswrapper[4773]: I0120 18:46:30.348056 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-cb4666565-hfwzv" Jan 20 18:46:30 crc kubenswrapper[4773]: I0120 18:46:30.348994 4773 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-vjfdq" event={"ID":"a570d5a5-53f4-444f-a14d-92ea24f27e2e","Type":"ContainerStarted","Data":"1da4e4f867251675e510db96e5840f214b1723ced50a5ab49b04febaaca85c8a"} Jan 20 18:46:30 crc kubenswrapper[4773]: I0120 18:46:30.349115 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-vjfdq" Jan 20 18:46:30 crc kubenswrapper[4773]: I0120 18:46:30.352117 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-sslnl" event={"ID":"ff53e5c0-255a-43c5-a27c-ce9dc3145999","Type":"ContainerStarted","Data":"6080865f8e2aa467bd0c9359aa69b2a87efb198c309601fcd851d09717e11418"} Jan 20 18:46:30 crc kubenswrapper[4773]: I0120 18:46:30.357893 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-864f6b75bf-mqjmm" podStartSLOduration=6.719818994 podStartE2EDuration="29.357873754s" podCreationTimestamp="2026-01-20 18:46:01 +0000 UTC" firstStartedPulling="2026-01-20 18:46:04.063763855 +0000 UTC m=+956.985576929" lastFinishedPulling="2026-01-20 18:46:26.701818665 +0000 UTC m=+979.623631689" observedRunningTime="2026-01-20 18:46:30.354871532 +0000 UTC m=+983.276684556" watchObservedRunningTime="2026-01-20 18:46:30.357873754 +0000 UTC m=+983.279686778" Jan 20 18:46:30 crc kubenswrapper[4773]: I0120 18:46:30.416257 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-c6994669c-4kk2r" podStartSLOduration=6.9836091190000005 podStartE2EDuration="29.416240741s" podCreationTimestamp="2026-01-20 18:46:01 +0000 UTC" firstStartedPulling="2026-01-20 18:46:03.276008415 +0000 UTC m=+956.197821439" lastFinishedPulling="2026-01-20 18:46:25.708640037 +0000 UTC m=+978.630453061" 
observedRunningTime="2026-01-20 18:46:30.382667902 +0000 UTC m=+983.304480926" watchObservedRunningTime="2026-01-20 18:46:30.416240741 +0000 UTC m=+983.338053755" Jan 20 18:46:30 crc kubenswrapper[4773]: I0120 18:46:30.489076 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-cb4666565-hfwzv" podStartSLOduration=7.399644886 podStartE2EDuration="29.489054336s" podCreationTimestamp="2026-01-20 18:46:01 +0000 UTC" firstStartedPulling="2026-01-20 18:46:03.619217157 +0000 UTC m=+956.541030181" lastFinishedPulling="2026-01-20 18:46:25.708626607 +0000 UTC m=+978.630439631" observedRunningTime="2026-01-20 18:46:30.487480139 +0000 UTC m=+983.409293163" watchObservedRunningTime="2026-01-20 18:46:30.489054336 +0000 UTC m=+983.410867360" Jan 20 18:46:30 crc kubenswrapper[4773]: I0120 18:46:30.493032 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-9f958b845-v4q7f" podStartSLOduration=6.216964895 podStartE2EDuration="29.493019932s" podCreationTimestamp="2026-01-20 18:46:01 +0000 UTC" firstStartedPulling="2026-01-20 18:46:03.424863976 +0000 UTC m=+956.346677000" lastFinishedPulling="2026-01-20 18:46:26.700919023 +0000 UTC m=+979.622732037" observedRunningTime="2026-01-20 18:46:30.415841242 +0000 UTC m=+983.337654266" watchObservedRunningTime="2026-01-20 18:46:30.493019932 +0000 UTC m=+983.414832956" Jan 20 18:46:30 crc kubenswrapper[4773]: I0120 18:46:30.521549 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-blxqv" podStartSLOduration=7.672889035 podStartE2EDuration="29.521529129s" podCreationTimestamp="2026-01-20 18:46:01 +0000 UTC" firstStartedPulling="2026-01-20 18:46:03.056131328 +0000 UTC m=+955.977944352" lastFinishedPulling="2026-01-20 18:46:24.904771422 +0000 UTC m=+977.826584446" 
observedRunningTime="2026-01-20 18:46:30.518042675 +0000 UTC m=+983.439855729" watchObservedRunningTime="2026-01-20 18:46:30.521529129 +0000 UTC m=+983.443342153" Jan 20 18:46:30 crc kubenswrapper[4773]: I0120 18:46:30.556981 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-vjfdq" podStartSLOduration=6.988478684 podStartE2EDuration="29.556926722s" podCreationTimestamp="2026-01-20 18:46:01 +0000 UTC" firstStartedPulling="2026-01-20 18:46:03.652499487 +0000 UTC m=+956.574312511" lastFinishedPulling="2026-01-20 18:46:26.220947525 +0000 UTC m=+979.142760549" observedRunningTime="2026-01-20 18:46:30.552450244 +0000 UTC m=+983.474263278" watchObservedRunningTime="2026-01-20 18:46:30.556926722 +0000 UTC m=+983.478739756" Jan 20 18:46:30 crc kubenswrapper[4773]: I0120 18:46:30.580525 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-sslnl" podStartSLOduration=6.656496206 podStartE2EDuration="29.5804974s" podCreationTimestamp="2026-01-20 18:46:01 +0000 UTC" firstStartedPulling="2026-01-20 18:46:03.77696193 +0000 UTC m=+956.698774954" lastFinishedPulling="2026-01-20 18:46:26.700963124 +0000 UTC m=+979.622776148" observedRunningTime="2026-01-20 18:46:30.575790447 +0000 UTC m=+983.497603481" watchObservedRunningTime="2026-01-20 18:46:30.5804974 +0000 UTC m=+983.502310424" Jan 20 18:46:31 crc kubenswrapper[4773]: I0120 18:46:31.360999 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-7f4549b895-p2vwt" event={"ID":"cfba823f-e85e-42ae-aa8a-7926cc906b92","Type":"ContainerStarted","Data":"bc13a0347bc52acd4845ee6cc4777bc12ce0bb7f8eadf3b1a0a2a82d784dc290"} Jan 20 18:46:31 crc kubenswrapper[4773]: I0120 18:46:31.361613 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/test-operator-controller-manager-7f4549b895-p2vwt" Jan 20 18:46:31 crc kubenswrapper[4773]: I0120 18:46:31.363538 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-674cd49df-nnf4r" event={"ID":"86d68359-5910-4d1d-8a01-2964f8d26464","Type":"ContainerStarted","Data":"1deca38d6d72d40be14a569795e68e007d3dcdda4e78313d9283207d77f70799"} Jan 20 18:46:31 crc kubenswrapper[4773]: I0120 18:46:31.363595 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-674cd49df-nnf4r" Jan 20 18:46:31 crc kubenswrapper[4773]: I0120 18:46:31.365325 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-6ngwx" event={"ID":"a1b3e0e3-f4c7-4b3d-9ba0-a198be108cb3","Type":"ContainerStarted","Data":"8334e1984196bd8a090094cad622f282087ee531b33d8876d7af07043fffa4de"} Jan 20 18:46:31 crc kubenswrapper[4773]: I0120 18:46:31.365759 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-6ngwx" Jan 20 18:46:31 crc kubenswrapper[4773]: I0120 18:46:31.367762 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-686df47fcb-26j8t" event={"ID":"2601732b-921a-4c55-821b-0fc994c50236","Type":"ContainerStarted","Data":"375aba268785c6d71dcab62d1c74b08c423f474ddab1b9a1cb4981db15c5b193"} Jan 20 18:46:31 crc kubenswrapper[4773]: I0120 18:46:31.367984 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-686df47fcb-26j8t" Jan 20 18:46:31 crc kubenswrapper[4773]: I0120 18:46:31.370208 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-r5qgh" 
event={"ID":"99558a40-3dbc-4c2b-9aab-a085c7ef5c7c","Type":"ContainerStarted","Data":"0edd9a3f79b3873858cda3701083f1570e467a94838bca4e64e982516bad63ba"} Jan 20 18:46:31 crc kubenswrapper[4773]: I0120 18:46:31.371879 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-64cd966744-nhqxg" event={"ID":"7f740208-043d-4d7f-b533-5526833d10c2","Type":"ContainerStarted","Data":"b7f8b9cf6e95c1003d0b950cf31f767ec177bad7e0c534fd0918ac61fd6d681b"} Jan 20 18:46:31 crc kubenswrapper[4773]: I0120 18:46:31.372065 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-64cd966744-nhqxg" Jan 20 18:46:31 crc kubenswrapper[4773]: I0120 18:46:31.374558 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" event={"ID":"1ddd934f-f012-4083-b5e6-b99711071621","Type":"ContainerStarted","Data":"f170dc7a6e6cd01e0186e6b45c72b1bd89b3220f96cf1e35088901106c87b344"} Jan 20 18:46:31 crc kubenswrapper[4773]: I0120 18:46:31.380506 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-t8tmg" event={"ID":"9e235ee6-33ad-40e3-9b7a-914820315627","Type":"ContainerStarted","Data":"82421a1bdb62a10ce5637d5aa7bee9a1985a6689ac33ef12ae9a27483c3ad306"} Jan 20 18:46:31 crc kubenswrapper[4773]: I0120 18:46:31.381678 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-sslnl" Jan 20 18:46:31 crc kubenswrapper[4773]: I0120 18:46:31.383013 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-7f4549b895-p2vwt" podStartSLOduration=3.4177183429999998 podStartE2EDuration="29.382992932s" podCreationTimestamp="2026-01-20 18:46:02 +0000 UTC" firstStartedPulling="2026-01-20 18:46:03.785078733 
+0000 UTC m=+956.706891757" lastFinishedPulling="2026-01-20 18:46:29.750353322 +0000 UTC m=+982.672166346" observedRunningTime="2026-01-20 18:46:31.37790796 +0000 UTC m=+984.299720974" watchObservedRunningTime="2026-01-20 18:46:31.382992932 +0000 UTC m=+984.304805976" Jan 20 18:46:31 crc kubenswrapper[4773]: I0120 18:46:31.402883 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-686df47fcb-26j8t" podStartSLOduration=3.548883316 podStartE2EDuration="29.402860032s" podCreationTimestamp="2026-01-20 18:46:02 +0000 UTC" firstStartedPulling="2026-01-20 18:46:03.784904469 +0000 UTC m=+956.706717493" lastFinishedPulling="2026-01-20 18:46:29.638881185 +0000 UTC m=+982.560694209" observedRunningTime="2026-01-20 18:46:31.398514387 +0000 UTC m=+984.320327411" watchObservedRunningTime="2026-01-20 18:46:31.402860032 +0000 UTC m=+984.324673066" Jan 20 18:46:31 crc kubenswrapper[4773]: I0120 18:46:31.437593 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-6ngwx" podStartSLOduration=6.998713064 podStartE2EDuration="29.437570467s" podCreationTimestamp="2026-01-20 18:46:02 +0000 UTC" firstStartedPulling="2026-01-20 18:46:03.782128233 +0000 UTC m=+956.703941257" lastFinishedPulling="2026-01-20 18:46:26.220985636 +0000 UTC m=+979.142798660" observedRunningTime="2026-01-20 18:46:31.435267653 +0000 UTC m=+984.357080687" watchObservedRunningTime="2026-01-20 18:46:31.437570467 +0000 UTC m=+984.359383491" Jan 20 18:46:31 crc kubenswrapper[4773]: I0120 18:46:31.454753 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-r5qgh" podStartSLOduration=3.786520021 podStartE2EDuration="29.454737702s" podCreationTimestamp="2026-01-20 18:46:02 +0000 UTC" firstStartedPulling="2026-01-20 18:46:04.083188606 +0000 UTC m=+957.005001640" 
lastFinishedPulling="2026-01-20 18:46:29.751406297 +0000 UTC m=+982.673219321" observedRunningTime="2026-01-20 18:46:31.449594748 +0000 UTC m=+984.371407772" watchObservedRunningTime="2026-01-20 18:46:31.454737702 +0000 UTC m=+984.376550726" Jan 20 18:46:31 crc kubenswrapper[4773]: I0120 18:46:31.502525 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-64cd966744-nhqxg" podStartSLOduration=3.944802189 podStartE2EDuration="29.502504573s" podCreationTimestamp="2026-01-20 18:46:02 +0000 UTC" firstStartedPulling="2026-01-20 18:46:04.074806747 +0000 UTC m=+956.996619771" lastFinishedPulling="2026-01-20 18:46:29.632509131 +0000 UTC m=+982.554322155" observedRunningTime="2026-01-20 18:46:31.502204796 +0000 UTC m=+984.424017820" watchObservedRunningTime="2026-01-20 18:46:31.502504573 +0000 UTC m=+984.424317597" Jan 20 18:46:31 crc kubenswrapper[4773]: I0120 18:46:31.531846 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-674cd49df-nnf4r" podStartSLOduration=29.5318283 podStartE2EDuration="29.5318283s" podCreationTimestamp="2026-01-20 18:46:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:46:31.530344704 +0000 UTC m=+984.452157748" watchObservedRunningTime="2026-01-20 18:46:31.5318283 +0000 UTC m=+984.453641324" Jan 20 18:46:31 crc kubenswrapper[4773]: I0120 18:46:31.569920 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-t8tmg" podStartSLOduration=4.021214916 podStartE2EDuration="29.569900967s" podCreationTimestamp="2026-01-20 18:46:02 +0000 UTC" firstStartedPulling="2026-01-20 18:46:04.083222396 +0000 UTC m=+957.005035420" lastFinishedPulling="2026-01-20 18:46:29.631908447 +0000 UTC m=+982.553721471" 
observedRunningTime="2026-01-20 18:46:31.55587837 +0000 UTC m=+984.477691394" watchObservedRunningTime="2026-01-20 18:46:31.569900967 +0000 UTC m=+984.491713991" Jan 20 18:46:32 crc kubenswrapper[4773]: I0120 18:46:32.849109 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-t8tmg" Jan 20 18:46:34 crc kubenswrapper[4773]: I0120 18:46:34.400509 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-77c48c7859-hqsb9" event={"ID":"437cadd4-5809-4b9e-afa2-05832cd6c303","Type":"ContainerStarted","Data":"6fa4b3f2f0eb9922713b5f5a37d8ae0ce20978ec654c02623263a1dc454e69e8"} Jan 20 18:46:34 crc kubenswrapper[4773]: I0120 18:46:34.401167 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-77c48c7859-hqsb9" Jan 20 18:46:34 crc kubenswrapper[4773]: I0120 18:46:34.402200 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854m7knp" event={"ID":"e9f6d4b3-c2cc-4cc6-b279-362e7439974b","Type":"ContainerStarted","Data":"864a35a9d532a1975371bfde3eaffbfa11f32981c9282da45666262860d4a236"} Jan 20 18:46:34 crc kubenswrapper[4773]: I0120 18:46:34.402346 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854m7knp" Jan 20 18:46:34 crc kubenswrapper[4773]: I0120 18:46:34.419841 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-77c48c7859-hqsb9" podStartSLOduration=29.844856372 podStartE2EDuration="33.419823067s" podCreationTimestamp="2026-01-20 18:46:01 +0000 UTC" firstStartedPulling="2026-01-20 18:46:30.098326919 +0000 UTC m=+983.020139943" lastFinishedPulling="2026-01-20 18:46:33.673293614 +0000 UTC 
m=+986.595106638" observedRunningTime="2026-01-20 18:46:34.413800252 +0000 UTC m=+987.335613276" watchObservedRunningTime="2026-01-20 18:46:34.419823067 +0000 UTC m=+987.341636091" Jan 20 18:46:34 crc kubenswrapper[4773]: I0120 18:46:34.439798 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854m7knp" podStartSLOduration=28.864817715 podStartE2EDuration="32.439776299s" podCreationTimestamp="2026-01-20 18:46:02 +0000 UTC" firstStartedPulling="2026-01-20 18:46:30.091470784 +0000 UTC m=+983.013283808" lastFinishedPulling="2026-01-20 18:46:33.666429368 +0000 UTC m=+986.588242392" observedRunningTime="2026-01-20 18:46:34.435060794 +0000 UTC m=+987.356873828" watchObservedRunningTime="2026-01-20 18:46:34.439776299 +0000 UTC m=+987.361589333" Jan 20 18:46:36 crc kubenswrapper[4773]: I0120 18:46:36.415732 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-78757b4889-2nhdr" event={"ID":"951d4f5c-5d89-41c6-be8a-9828b05ce182","Type":"ContainerStarted","Data":"03e2dc473dc5f09fb2c707e445239f85f4ce48c3b1ee5013945835a1093f07b4"} Jan 20 18:46:36 crc kubenswrapper[4773]: I0120 18:46:36.416514 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-78757b4889-2nhdr" Jan 20 18:46:36 crc kubenswrapper[4773]: I0120 18:46:36.417084 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-9b68f5989-xmljc" event={"ID":"48aacb32-c120-4f36-898b-60f5d01c5510","Type":"ContainerStarted","Data":"c4bdc5bc988df23d572ffe2a4c4fdc33391e7fcea5cde31dbfeb47452fb5d145"} Jan 20 18:46:36 crc kubenswrapper[4773]: I0120 18:46:36.417230 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-9b68f5989-xmljc" Jan 20 18:46:36 crc 
kubenswrapper[4773]: I0120 18:46:36.435088 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-78757b4889-2nhdr" podStartSLOduration=2.973644444 podStartE2EDuration="35.43507479s" podCreationTimestamp="2026-01-20 18:46:01 +0000 UTC" firstStartedPulling="2026-01-20 18:46:03.396559395 +0000 UTC m=+956.318372419" lastFinishedPulling="2026-01-20 18:46:35.857989741 +0000 UTC m=+988.779802765" observedRunningTime="2026-01-20 18:46:36.432499108 +0000 UTC m=+989.354312132" watchObservedRunningTime="2026-01-20 18:46:36.43507479 +0000 UTC m=+989.356887814" Jan 20 18:46:36 crc kubenswrapper[4773]: I0120 18:46:36.450839 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-9b68f5989-xmljc" podStartSLOduration=2.628659456 podStartE2EDuration="35.450818699s" podCreationTimestamp="2026-01-20 18:46:01 +0000 UTC" firstStartedPulling="2026-01-20 18:46:03.085853393 +0000 UTC m=+956.007666417" lastFinishedPulling="2026-01-20 18:46:35.908012636 +0000 UTC m=+988.829825660" observedRunningTime="2026-01-20 18:46:36.447741995 +0000 UTC m=+989.369555039" watchObservedRunningTime="2026-01-20 18:46:36.450818699 +0000 UTC m=+989.372631733" Jan 20 18:46:38 crc kubenswrapper[4773]: I0120 18:46:38.305041 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-6b68b8b854m7knp" Jan 20 18:46:38 crc kubenswrapper[4773]: I0120 18:46:38.429127 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-s7scg" event={"ID":"8f795216-0196-4a5a-bfdf-20dee1543b43","Type":"ContainerStarted","Data":"c6084c26d58eaddd5ce92798e6ac9ed4fbe002dead88e2e8f0194fe16a248954"} Jan 20 18:46:38 crc kubenswrapper[4773]: I0120 18:46:38.429281 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-s7scg" Jan 20 18:46:38 crc kubenswrapper[4773]: I0120 18:46:38.431144 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-2thqw" event={"ID":"7ed73202-faba-46ba-ae91-8cd9ffbe70a4","Type":"ContainerStarted","Data":"ad668968f8e3a57e062edbde61a9e295d5400b9f8de38d07f993ea4f0306400b"} Jan 20 18:46:38 crc kubenswrapper[4773]: I0120 18:46:38.431359 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-2thqw" Jan 20 18:46:38 crc kubenswrapper[4773]: I0120 18:46:38.451555 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-s7scg" podStartSLOduration=3.041617408 podStartE2EDuration="37.451539362s" podCreationTimestamp="2026-01-20 18:46:01 +0000 UTC" firstStartedPulling="2026-01-20 18:46:03.64922937 +0000 UTC m=+956.571042394" lastFinishedPulling="2026-01-20 18:46:38.059151324 +0000 UTC m=+990.980964348" observedRunningTime="2026-01-20 18:46:38.445776463 +0000 UTC m=+991.367589507" watchObservedRunningTime="2026-01-20 18:46:38.451539362 +0000 UTC m=+991.373352386" Jan 20 18:46:38 crc kubenswrapper[4773]: I0120 18:46:38.462885 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-2thqw" podStartSLOduration=2.550823273 podStartE2EDuration="36.462870895s" podCreationTimestamp="2026-01-20 18:46:02 +0000 UTC" firstStartedPulling="2026-01-20 18:46:04.041761912 +0000 UTC m=+956.963574936" lastFinishedPulling="2026-01-20 18:46:37.953809534 +0000 UTC m=+990.875622558" observedRunningTime="2026-01-20 18:46:38.461812879 +0000 UTC m=+991.383625893" watchObservedRunningTime="2026-01-20 18:46:38.462870895 +0000 UTC m=+991.384683909" Jan 20 18:46:39 crc kubenswrapper[4773]: 
I0120 18:46:39.069123 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-674cd49df-nnf4r" Jan 20 18:46:39 crc kubenswrapper[4773]: I0120 18:46:39.441611 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-767fdc4f47-8tsjs" event={"ID":"b773ecb8-3505-44ad-a28f-bd4054263888","Type":"ContainerStarted","Data":"70d604bf9ca953013dee36c6a8d67d0164744056953dcfcc1b8e7a8abb489b53"} Jan 20 18:46:39 crc kubenswrapper[4773]: I0120 18:46:39.442881 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-767fdc4f47-8tsjs" Jan 20 18:46:39 crc kubenswrapper[4773]: I0120 18:46:39.474551 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-767fdc4f47-8tsjs" podStartSLOduration=3.206588806 podStartE2EDuration="38.474535009s" podCreationTimestamp="2026-01-20 18:46:01 +0000 UTC" firstStartedPulling="2026-01-20 18:46:03.619095765 +0000 UTC m=+956.540908789" lastFinishedPulling="2026-01-20 18:46:38.887041968 +0000 UTC m=+991.808854992" observedRunningTime="2026-01-20 18:46:39.473199306 +0000 UTC m=+992.395012330" watchObservedRunningTime="2026-01-20 18:46:39.474535009 +0000 UTC m=+992.396348033" Jan 20 18:46:41 crc kubenswrapper[4773]: I0120 18:46:41.461018 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-7ddb5c749-hhxlp" event={"ID":"df2d6d5b-b964-4672-903f-563b7792ee43","Type":"ContainerStarted","Data":"a3a3434b0caf975b7fcb78687b73fe7fee584419559774a3ebb782d782923b81"} Jan 20 18:46:41 crc kubenswrapper[4773]: I0120 18:46:41.462077 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-7ddb5c749-hhxlp" Jan 20 18:46:41 crc kubenswrapper[4773]: 
I0120 18:46:41.487781 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-7ddb5c749-hhxlp" podStartSLOduration=2.799116823 podStartE2EDuration="40.487759402s" podCreationTimestamp="2026-01-20 18:46:01 +0000 UTC" firstStartedPulling="2026-01-20 18:46:03.221997483 +0000 UTC m=+956.143810507" lastFinishedPulling="2026-01-20 18:46:40.910640062 +0000 UTC m=+993.832453086" observedRunningTime="2026-01-20 18:46:41.484498823 +0000 UTC m=+994.406311847" watchObservedRunningTime="2026-01-20 18:46:41.487759402 +0000 UTC m=+994.409572426" Jan 20 18:46:42 crc kubenswrapper[4773]: I0120 18:46:42.026395 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-9b68f5989-xmljc" Jan 20 18:46:42 crc kubenswrapper[4773]: I0120 18:46:42.056460 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-9f958b845-v4q7f" Jan 20 18:46:42 crc kubenswrapper[4773]: I0120 18:46:42.122790 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-c6994669c-4kk2r" Jan 20 18:46:42 crc kubenswrapper[4773]: I0120 18:46:42.171109 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-77d5c5b54f-blxqv" Jan 20 18:46:42 crc kubenswrapper[4773]: I0120 18:46:42.297109 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-78757b4889-2nhdr" Jan 20 18:46:42 crc kubenswrapper[4773]: I0120 18:46:42.403598 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-594c8c9d5d-vjfdq" Jan 20 18:46:42 crc kubenswrapper[4773]: I0120 18:46:42.463830 4773 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-65849867d6-prhbl" event={"ID":"fb5406b5-d194-441a-a098-7ecdc7831ec1","Type":"ContainerStarted","Data":"84f6d7fcfe0f1cb518d89aae5d096c9c0c34986193c2d2763244de1d4b1b9b16"} Jan 20 18:46:42 crc kubenswrapper[4773]: I0120 18:46:42.464068 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-65849867d6-prhbl" Jan 20 18:46:42 crc kubenswrapper[4773]: I0120 18:46:42.481913 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-65849867d6-prhbl" podStartSLOduration=3.202728939 podStartE2EDuration="41.481896833s" podCreationTimestamp="2026-01-20 18:46:01 +0000 UTC" firstStartedPulling="2026-01-20 18:46:03.626724116 +0000 UTC m=+956.548537140" lastFinishedPulling="2026-01-20 18:46:41.90589201 +0000 UTC m=+994.827705034" observedRunningTime="2026-01-20 18:46:42.479898985 +0000 UTC m=+995.401712009" watchObservedRunningTime="2026-01-20 18:46:42.481896833 +0000 UTC m=+995.403709857" Jan 20 18:46:42 crc kubenswrapper[4773]: I0120 18:46:42.502956 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-cb4666565-hfwzv" Jan 20 18:46:42 crc kubenswrapper[4773]: I0120 18:46:42.533658 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-7fc9b76cf6-sslnl" Jan 20 18:46:42 crc kubenswrapper[4773]: I0120 18:46:42.647111 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-864f6b75bf-mqjmm" Jan 20 18:46:42 crc kubenswrapper[4773]: I0120 18:46:42.711439 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-686df47fcb-26j8t" Jan 20 18:46:42 crc 
kubenswrapper[4773]: I0120 18:46:42.852634 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-85dd56d4cc-t8tmg" Jan 20 18:46:42 crc kubenswrapper[4773]: I0120 18:46:42.962739 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-7f4549b895-p2vwt" Jan 20 18:46:42 crc kubenswrapper[4773]: I0120 18:46:42.964037 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-55db956ddc-6ngwx" Jan 20 18:46:43 crc kubenswrapper[4773]: I0120 18:46:43.161297 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-64cd966744-nhqxg" Jan 20 18:46:47 crc kubenswrapper[4773]: I0120 18:46:47.889906 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-77c48c7859-hqsb9" Jan 20 18:46:52 crc kubenswrapper[4773]: I0120 18:46:52.040067 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-7ddb5c749-hhxlp" Jan 20 18:46:52 crc kubenswrapper[4773]: I0120 18:46:52.318526 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-767fdc4f47-8tsjs" Jan 20 18:46:52 crc kubenswrapper[4773]: I0120 18:46:52.503722 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-c87fff755-s7scg" Jan 20 18:46:52 crc kubenswrapper[4773]: I0120 18:46:52.550114 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-65849867d6-prhbl" Jan 20 18:46:52 crc kubenswrapper[4773]: I0120 18:46:52.906787 4773 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-5f8f495fcf-2thqw" Jan 20 18:47:07 crc kubenswrapper[4773]: I0120 18:47:07.610111 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-2xgbd"] Jan 20 18:47:07 crc kubenswrapper[4773]: I0120 18:47:07.639198 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-2xgbd"] Jan 20 18:47:07 crc kubenswrapper[4773]: I0120 18:47:07.639657 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-2xgbd" Jan 20 18:47:07 crc kubenswrapper[4773]: I0120 18:47:07.645982 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Jan 20 18:47:07 crc kubenswrapper[4773]: I0120 18:47:07.646388 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Jan 20 18:47:07 crc kubenswrapper[4773]: I0120 18:47:07.646721 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Jan 20 18:47:07 crc kubenswrapper[4773]: I0120 18:47:07.646862 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-qw675" Jan 20 18:47:07 crc kubenswrapper[4773]: I0120 18:47:07.747410 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-zb8x4"] Jan 20 18:47:07 crc kubenswrapper[4773]: I0120 18:47:07.748500 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-zb8x4" Jan 20 18:47:07 crc kubenswrapper[4773]: I0120 18:47:07.754258 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Jan 20 18:47:07 crc kubenswrapper[4773]: I0120 18:47:07.756688 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9tpc9\" (UniqueName: \"kubernetes.io/projected/e6b756ab-c042-4cbc-a2ec-bb8d6b9094c4-kube-api-access-9tpc9\") pod \"dnsmasq-dns-675f4bcbfc-2xgbd\" (UID: \"e6b756ab-c042-4cbc-a2ec-bb8d6b9094c4\") " pod="openstack/dnsmasq-dns-675f4bcbfc-2xgbd" Jan 20 18:47:07 crc kubenswrapper[4773]: I0120 18:47:07.756807 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6b756ab-c042-4cbc-a2ec-bb8d6b9094c4-config\") pod \"dnsmasq-dns-675f4bcbfc-2xgbd\" (UID: \"e6b756ab-c042-4cbc-a2ec-bb8d6b9094c4\") " pod="openstack/dnsmasq-dns-675f4bcbfc-2xgbd" Jan 20 18:47:07 crc kubenswrapper[4773]: I0120 18:47:07.769800 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-zb8x4"] Jan 20 18:47:07 crc kubenswrapper[4773]: I0120 18:47:07.857641 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6b756ab-c042-4cbc-a2ec-bb8d6b9094c4-config\") pod \"dnsmasq-dns-675f4bcbfc-2xgbd\" (UID: \"e6b756ab-c042-4cbc-a2ec-bb8d6b9094c4\") " pod="openstack/dnsmasq-dns-675f4bcbfc-2xgbd" Jan 20 18:47:07 crc kubenswrapper[4773]: I0120 18:47:07.857700 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9tpc9\" (UniqueName: \"kubernetes.io/projected/e6b756ab-c042-4cbc-a2ec-bb8d6b9094c4-kube-api-access-9tpc9\") pod \"dnsmasq-dns-675f4bcbfc-2xgbd\" (UID: \"e6b756ab-c042-4cbc-a2ec-bb8d6b9094c4\") " pod="openstack/dnsmasq-dns-675f4bcbfc-2xgbd" Jan 20 18:47:07 crc 
kubenswrapper[4773]: I0120 18:47:07.857770 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2dfd387a-7b46-44b7-8aed-53e919c99903-config\") pod \"dnsmasq-dns-78dd6ddcc-zb8x4\" (UID: \"2dfd387a-7b46-44b7-8aed-53e919c99903\") " pod="openstack/dnsmasq-dns-78dd6ddcc-zb8x4" Jan 20 18:47:07 crc kubenswrapper[4773]: I0120 18:47:07.857789 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2dfd387a-7b46-44b7-8aed-53e919c99903-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-zb8x4\" (UID: \"2dfd387a-7b46-44b7-8aed-53e919c99903\") " pod="openstack/dnsmasq-dns-78dd6ddcc-zb8x4" Jan 20 18:47:07 crc kubenswrapper[4773]: I0120 18:47:07.858167 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5rt2\" (UniqueName: \"kubernetes.io/projected/2dfd387a-7b46-44b7-8aed-53e919c99903-kube-api-access-n5rt2\") pod \"dnsmasq-dns-78dd6ddcc-zb8x4\" (UID: \"2dfd387a-7b46-44b7-8aed-53e919c99903\") " pod="openstack/dnsmasq-dns-78dd6ddcc-zb8x4" Jan 20 18:47:07 crc kubenswrapper[4773]: I0120 18:47:07.859056 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6b756ab-c042-4cbc-a2ec-bb8d6b9094c4-config\") pod \"dnsmasq-dns-675f4bcbfc-2xgbd\" (UID: \"e6b756ab-c042-4cbc-a2ec-bb8d6b9094c4\") " pod="openstack/dnsmasq-dns-675f4bcbfc-2xgbd" Jan 20 18:47:07 crc kubenswrapper[4773]: I0120 18:47:07.882620 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9tpc9\" (UniqueName: \"kubernetes.io/projected/e6b756ab-c042-4cbc-a2ec-bb8d6b9094c4-kube-api-access-9tpc9\") pod \"dnsmasq-dns-675f4bcbfc-2xgbd\" (UID: \"e6b756ab-c042-4cbc-a2ec-bb8d6b9094c4\") " pod="openstack/dnsmasq-dns-675f4bcbfc-2xgbd" Jan 20 18:47:07 crc kubenswrapper[4773]: I0120 
18:47:07.958967 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5rt2\" (UniqueName: \"kubernetes.io/projected/2dfd387a-7b46-44b7-8aed-53e919c99903-kube-api-access-n5rt2\") pod \"dnsmasq-dns-78dd6ddcc-zb8x4\" (UID: \"2dfd387a-7b46-44b7-8aed-53e919c99903\") " pod="openstack/dnsmasq-dns-78dd6ddcc-zb8x4" Jan 20 18:47:07 crc kubenswrapper[4773]: I0120 18:47:07.959338 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2dfd387a-7b46-44b7-8aed-53e919c99903-config\") pod \"dnsmasq-dns-78dd6ddcc-zb8x4\" (UID: \"2dfd387a-7b46-44b7-8aed-53e919c99903\") " pod="openstack/dnsmasq-dns-78dd6ddcc-zb8x4" Jan 20 18:47:07 crc kubenswrapper[4773]: I0120 18:47:07.959446 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2dfd387a-7b46-44b7-8aed-53e919c99903-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-zb8x4\" (UID: \"2dfd387a-7b46-44b7-8aed-53e919c99903\") " pod="openstack/dnsmasq-dns-78dd6ddcc-zb8x4" Jan 20 18:47:07 crc kubenswrapper[4773]: I0120 18:47:07.960430 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2dfd387a-7b46-44b7-8aed-53e919c99903-config\") pod \"dnsmasq-dns-78dd6ddcc-zb8x4\" (UID: \"2dfd387a-7b46-44b7-8aed-53e919c99903\") " pod="openstack/dnsmasq-dns-78dd6ddcc-zb8x4" Jan 20 18:47:07 crc kubenswrapper[4773]: I0120 18:47:07.960460 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2dfd387a-7b46-44b7-8aed-53e919c99903-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-zb8x4\" (UID: \"2dfd387a-7b46-44b7-8aed-53e919c99903\") " pod="openstack/dnsmasq-dns-78dd6ddcc-zb8x4" Jan 20 18:47:07 crc kubenswrapper[4773]: I0120 18:47:07.985258 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5rt2\" 
(UniqueName: \"kubernetes.io/projected/2dfd387a-7b46-44b7-8aed-53e919c99903-kube-api-access-n5rt2\") pod \"dnsmasq-dns-78dd6ddcc-zb8x4\" (UID: \"2dfd387a-7b46-44b7-8aed-53e919c99903\") " pod="openstack/dnsmasq-dns-78dd6ddcc-zb8x4" Jan 20 18:47:08 crc kubenswrapper[4773]: I0120 18:47:08.010193 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-2xgbd" Jan 20 18:47:08 crc kubenswrapper[4773]: I0120 18:47:08.072099 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-zb8x4" Jan 20 18:47:08 crc kubenswrapper[4773]: I0120 18:47:08.451511 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-2xgbd"] Jan 20 18:47:08 crc kubenswrapper[4773]: I0120 18:47:08.463525 4773 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 20 18:47:08 crc kubenswrapper[4773]: I0120 18:47:08.530393 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-zb8x4"] Jan 20 18:47:08 crc kubenswrapper[4773]: W0120 18:47:08.534395 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2dfd387a_7b46_44b7_8aed_53e919c99903.slice/crio-b44d727e89fc9b9b5f69dfa9d422593aa34e541b42f15fb742dbdd5bd4f08b49 WatchSource:0}: Error finding container b44d727e89fc9b9b5f69dfa9d422593aa34e541b42f15fb742dbdd5bd4f08b49: Status 404 returned error can't find the container with id b44d727e89fc9b9b5f69dfa9d422593aa34e541b42f15fb742dbdd5bd4f08b49 Jan 20 18:47:08 crc kubenswrapper[4773]: I0120 18:47:08.667495 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-zb8x4" event={"ID":"2dfd387a-7b46-44b7-8aed-53e919c99903","Type":"ContainerStarted","Data":"b44d727e89fc9b9b5f69dfa9d422593aa34e541b42f15fb742dbdd5bd4f08b49"} Jan 20 18:47:08 crc kubenswrapper[4773]: I0120 
18:47:08.668866 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-2xgbd" event={"ID":"e6b756ab-c042-4cbc-a2ec-bb8d6b9094c4","Type":"ContainerStarted","Data":"6668d8f915cf882b03ecb7f9c9321d7df5e2547abe73d4876b6b1e6b3be6ef5b"} Jan 20 18:47:10 crc kubenswrapper[4773]: I0120 18:47:10.424200 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-2xgbd"] Jan 20 18:47:10 crc kubenswrapper[4773]: I0120 18:47:10.452846 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-tz8rp"] Jan 20 18:47:10 crc kubenswrapper[4773]: I0120 18:47:10.454010 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-tz8rp" Jan 20 18:47:10 crc kubenswrapper[4773]: I0120 18:47:10.473253 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-tz8rp"] Jan 20 18:47:10 crc kubenswrapper[4773]: I0120 18:47:10.596715 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a00a537a-172f-4ec7-9573-dd9ac2f347e3-config\") pod \"dnsmasq-dns-666b6646f7-tz8rp\" (UID: \"a00a537a-172f-4ec7-9573-dd9ac2f347e3\") " pod="openstack/dnsmasq-dns-666b6646f7-tz8rp" Jan 20 18:47:10 crc kubenswrapper[4773]: I0120 18:47:10.596785 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a00a537a-172f-4ec7-9573-dd9ac2f347e3-dns-svc\") pod \"dnsmasq-dns-666b6646f7-tz8rp\" (UID: \"a00a537a-172f-4ec7-9573-dd9ac2f347e3\") " pod="openstack/dnsmasq-dns-666b6646f7-tz8rp" Jan 20 18:47:10 crc kubenswrapper[4773]: I0120 18:47:10.597046 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5ldb\" (UniqueName: 
\"kubernetes.io/projected/a00a537a-172f-4ec7-9573-dd9ac2f347e3-kube-api-access-d5ldb\") pod \"dnsmasq-dns-666b6646f7-tz8rp\" (UID: \"a00a537a-172f-4ec7-9573-dd9ac2f347e3\") " pod="openstack/dnsmasq-dns-666b6646f7-tz8rp" Jan 20 18:47:10 crc kubenswrapper[4773]: I0120 18:47:10.698466 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a00a537a-172f-4ec7-9573-dd9ac2f347e3-dns-svc\") pod \"dnsmasq-dns-666b6646f7-tz8rp\" (UID: \"a00a537a-172f-4ec7-9573-dd9ac2f347e3\") " pod="openstack/dnsmasq-dns-666b6646f7-tz8rp" Jan 20 18:47:10 crc kubenswrapper[4773]: I0120 18:47:10.698854 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5ldb\" (UniqueName: \"kubernetes.io/projected/a00a537a-172f-4ec7-9573-dd9ac2f347e3-kube-api-access-d5ldb\") pod \"dnsmasq-dns-666b6646f7-tz8rp\" (UID: \"a00a537a-172f-4ec7-9573-dd9ac2f347e3\") " pod="openstack/dnsmasq-dns-666b6646f7-tz8rp" Jan 20 18:47:10 crc kubenswrapper[4773]: I0120 18:47:10.698912 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a00a537a-172f-4ec7-9573-dd9ac2f347e3-config\") pod \"dnsmasq-dns-666b6646f7-tz8rp\" (UID: \"a00a537a-172f-4ec7-9573-dd9ac2f347e3\") " pod="openstack/dnsmasq-dns-666b6646f7-tz8rp" Jan 20 18:47:10 crc kubenswrapper[4773]: I0120 18:47:10.699603 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a00a537a-172f-4ec7-9573-dd9ac2f347e3-dns-svc\") pod \"dnsmasq-dns-666b6646f7-tz8rp\" (UID: \"a00a537a-172f-4ec7-9573-dd9ac2f347e3\") " pod="openstack/dnsmasq-dns-666b6646f7-tz8rp" Jan 20 18:47:10 crc kubenswrapper[4773]: I0120 18:47:10.699781 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a00a537a-172f-4ec7-9573-dd9ac2f347e3-config\") pod 
\"dnsmasq-dns-666b6646f7-tz8rp\" (UID: \"a00a537a-172f-4ec7-9573-dd9ac2f347e3\") " pod="openstack/dnsmasq-dns-666b6646f7-tz8rp" Jan 20 18:47:10 crc kubenswrapper[4773]: I0120 18:47:10.722218 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-zb8x4"] Jan 20 18:47:10 crc kubenswrapper[4773]: I0120 18:47:10.728319 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5ldb\" (UniqueName: \"kubernetes.io/projected/a00a537a-172f-4ec7-9573-dd9ac2f347e3-kube-api-access-d5ldb\") pod \"dnsmasq-dns-666b6646f7-tz8rp\" (UID: \"a00a537a-172f-4ec7-9573-dd9ac2f347e3\") " pod="openstack/dnsmasq-dns-666b6646f7-tz8rp" Jan 20 18:47:10 crc kubenswrapper[4773]: I0120 18:47:10.776983 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-tz8rp" Jan 20 18:47:10 crc kubenswrapper[4773]: I0120 18:47:10.777392 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-75zzb"] Jan 20 18:47:10 crc kubenswrapper[4773]: I0120 18:47:10.778554 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-75zzb" Jan 20 18:47:10 crc kubenswrapper[4773]: I0120 18:47:10.793496 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-75zzb"] Jan 20 18:47:10 crc kubenswrapper[4773]: I0120 18:47:10.902593 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22a48624-e6b5-4225-baf3-c05ff3bed80d-config\") pod \"dnsmasq-dns-57d769cc4f-75zzb\" (UID: \"22a48624-e6b5-4225-baf3-c05ff3bed80d\") " pod="openstack/dnsmasq-dns-57d769cc4f-75zzb" Jan 20 18:47:10 crc kubenswrapper[4773]: I0120 18:47:10.902646 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/22a48624-e6b5-4225-baf3-c05ff3bed80d-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-75zzb\" (UID: \"22a48624-e6b5-4225-baf3-c05ff3bed80d\") " pod="openstack/dnsmasq-dns-57d769cc4f-75zzb" Jan 20 18:47:10 crc kubenswrapper[4773]: I0120 18:47:10.902752 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qpxj5\" (UniqueName: \"kubernetes.io/projected/22a48624-e6b5-4225-baf3-c05ff3bed80d-kube-api-access-qpxj5\") pod \"dnsmasq-dns-57d769cc4f-75zzb\" (UID: \"22a48624-e6b5-4225-baf3-c05ff3bed80d\") " pod="openstack/dnsmasq-dns-57d769cc4f-75zzb" Jan 20 18:47:11 crc kubenswrapper[4773]: I0120 18:47:11.003509 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/22a48624-e6b5-4225-baf3-c05ff3bed80d-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-75zzb\" (UID: \"22a48624-e6b5-4225-baf3-c05ff3bed80d\") " pod="openstack/dnsmasq-dns-57d769cc4f-75zzb" Jan 20 18:47:11 crc kubenswrapper[4773]: I0120 18:47:11.003616 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qpxj5\" (UniqueName: 
\"kubernetes.io/projected/22a48624-e6b5-4225-baf3-c05ff3bed80d-kube-api-access-qpxj5\") pod \"dnsmasq-dns-57d769cc4f-75zzb\" (UID: \"22a48624-e6b5-4225-baf3-c05ff3bed80d\") " pod="openstack/dnsmasq-dns-57d769cc4f-75zzb" Jan 20 18:47:11 crc kubenswrapper[4773]: I0120 18:47:11.003641 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22a48624-e6b5-4225-baf3-c05ff3bed80d-config\") pod \"dnsmasq-dns-57d769cc4f-75zzb\" (UID: \"22a48624-e6b5-4225-baf3-c05ff3bed80d\") " pod="openstack/dnsmasq-dns-57d769cc4f-75zzb" Jan 20 18:47:11 crc kubenswrapper[4773]: I0120 18:47:11.004523 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22a48624-e6b5-4225-baf3-c05ff3bed80d-config\") pod \"dnsmasq-dns-57d769cc4f-75zzb\" (UID: \"22a48624-e6b5-4225-baf3-c05ff3bed80d\") " pod="openstack/dnsmasq-dns-57d769cc4f-75zzb" Jan 20 18:47:11 crc kubenswrapper[4773]: I0120 18:47:11.005102 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/22a48624-e6b5-4225-baf3-c05ff3bed80d-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-75zzb\" (UID: \"22a48624-e6b5-4225-baf3-c05ff3bed80d\") " pod="openstack/dnsmasq-dns-57d769cc4f-75zzb" Jan 20 18:47:11 crc kubenswrapper[4773]: I0120 18:47:11.023468 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qpxj5\" (UniqueName: \"kubernetes.io/projected/22a48624-e6b5-4225-baf3-c05ff3bed80d-kube-api-access-qpxj5\") pod \"dnsmasq-dns-57d769cc4f-75zzb\" (UID: \"22a48624-e6b5-4225-baf3-c05ff3bed80d\") " pod="openstack/dnsmasq-dns-57d769cc4f-75zzb" Jan 20 18:47:11 crc kubenswrapper[4773]: I0120 18:47:11.096512 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-75zzb" Jan 20 18:47:11 crc kubenswrapper[4773]: I0120 18:47:11.348978 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-tz8rp"] Jan 20 18:47:11 crc kubenswrapper[4773]: I0120 18:47:11.355191 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-75zzb"] Jan 20 18:47:11 crc kubenswrapper[4773]: W0120 18:47:11.357108 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda00a537a_172f_4ec7_9573_dd9ac2f347e3.slice/crio-c0ded801e725f7fd29cc632ebbd05140648619b5c0328313794582eb4a1791cf WatchSource:0}: Error finding container c0ded801e725f7fd29cc632ebbd05140648619b5c0328313794582eb4a1791cf: Status 404 returned error can't find the container with id c0ded801e725f7fd29cc632ebbd05140648619b5c0328313794582eb4a1791cf Jan 20 18:47:11 crc kubenswrapper[4773]: W0120 18:47:11.360454 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod22a48624_e6b5_4225_baf3_c05ff3bed80d.slice/crio-a9f6251d727b5d342533134b7ddbdd5288e3eaaf3fd1eb546913f3fdaa5bf268 WatchSource:0}: Error finding container a9f6251d727b5d342533134b7ddbdd5288e3eaaf3fd1eb546913f3fdaa5bf268: Status 404 returned error can't find the container with id a9f6251d727b5d342533134b7ddbdd5288e3eaaf3fd1eb546913f3fdaa5bf268 Jan 20 18:47:11 crc kubenswrapper[4773]: I0120 18:47:11.690781 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-75zzb" event={"ID":"22a48624-e6b5-4225-baf3-c05ff3bed80d","Type":"ContainerStarted","Data":"a9f6251d727b5d342533134b7ddbdd5288e3eaaf3fd1eb546913f3fdaa5bf268"} Jan 20 18:47:11 crc kubenswrapper[4773]: I0120 18:47:11.692791 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-tz8rp" 
event={"ID":"a00a537a-172f-4ec7-9573-dd9ac2f347e3","Type":"ContainerStarted","Data":"c0ded801e725f7fd29cc632ebbd05140648619b5c0328313794582eb4a1791cf"} Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.282920 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.284579 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.287365 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.287491 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.288764 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.289884 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.293328 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-pbqbk" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.295606 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.295796 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.296245 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.303787 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.307574 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.307659 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.307718 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-6z6h4" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.307588 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.307986 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.308120 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.307964 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.320438 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.330026 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.427899 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b357137a-6e30-4ed9-a440-c9f3e90f75d8-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b357137a-6e30-4ed9-a440-c9f3e90f75d8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 
18:47:12.427979 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wj6nw\" (UniqueName: \"kubernetes.io/projected/b357137a-6e30-4ed9-a440-c9f3e90f75d8-kube-api-access-wj6nw\") pod \"rabbitmq-cell1-server-0\" (UID: \"b357137a-6e30-4ed9-a440-c9f3e90f75d8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.428026 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b357137a-6e30-4ed9-a440-c9f3e90f75d8-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"b357137a-6e30-4ed9-a440-c9f3e90f75d8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.428048 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d4dfff97-df7d-498f-9203-9c2cb0d84667-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"d4dfff97-df7d-498f-9203-9c2cb0d84667\") " pod="openstack/rabbitmq-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.428202 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d4dfff97-df7d-498f-9203-9c2cb0d84667-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"d4dfff97-df7d-498f-9203-9c2cb0d84667\") " pod="openstack/rabbitmq-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.428322 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d4dfff97-df7d-498f-9203-9c2cb0d84667-config-data\") pod \"rabbitmq-server-0\" (UID: \"d4dfff97-df7d-498f-9203-9c2cb0d84667\") " pod="openstack/rabbitmq-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.428399 
4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b357137a-6e30-4ed9-a440-c9f3e90f75d8-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b357137a-6e30-4ed9-a440-c9f3e90f75d8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.428582 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b357137a-6e30-4ed9-a440-c9f3e90f75d8-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"b357137a-6e30-4ed9-a440-c9f3e90f75d8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.428643 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d4dfff97-df7d-498f-9203-9c2cb0d84667-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"d4dfff97-df7d-498f-9203-9c2cb0d84667\") " pod="openstack/rabbitmq-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.428678 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b357137a-6e30-4ed9-a440-c9f3e90f75d8-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"b357137a-6e30-4ed9-a440-c9f3e90f75d8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.428730 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"d4dfff97-df7d-498f-9203-9c2cb0d84667\") " pod="openstack/rabbitmq-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.428806 4773 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltrtl\" (UniqueName: \"kubernetes.io/projected/d4dfff97-df7d-498f-9203-9c2cb0d84667-kube-api-access-ltrtl\") pod \"rabbitmq-server-0\" (UID: \"d4dfff97-df7d-498f-9203-9c2cb0d84667\") " pod="openstack/rabbitmq-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.428943 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b357137a-6e30-4ed9-a440-c9f3e90f75d8-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"b357137a-6e30-4ed9-a440-c9f3e90f75d8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.429105 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b357137a-6e30-4ed9-a440-c9f3e90f75d8-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"b357137a-6e30-4ed9-a440-c9f3e90f75d8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.429175 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d4dfff97-df7d-498f-9203-9c2cb0d84667-pod-info\") pod \"rabbitmq-server-0\" (UID: \"d4dfff97-df7d-498f-9203-9c2cb0d84667\") " pod="openstack/rabbitmq-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.429246 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"b357137a-6e30-4ed9-a440-c9f3e90f75d8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.429403 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d4dfff97-df7d-498f-9203-9c2cb0d84667-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"d4dfff97-df7d-498f-9203-9c2cb0d84667\") " pod="openstack/rabbitmq-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.429491 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b357137a-6e30-4ed9-a440-c9f3e90f75d8-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"b357137a-6e30-4ed9-a440-c9f3e90f75d8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.429597 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b357137a-6e30-4ed9-a440-c9f3e90f75d8-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"b357137a-6e30-4ed9-a440-c9f3e90f75d8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.429679 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d4dfff97-df7d-498f-9203-9c2cb0d84667-server-conf\") pod \"rabbitmq-server-0\" (UID: \"d4dfff97-df7d-498f-9203-9c2cb0d84667\") " pod="openstack/rabbitmq-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.429758 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d4dfff97-df7d-498f-9203-9c2cb0d84667-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"d4dfff97-df7d-498f-9203-9c2cb0d84667\") " pod="openstack/rabbitmq-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.429796 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/d4dfff97-df7d-498f-9203-9c2cb0d84667-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"d4dfff97-df7d-498f-9203-9c2cb0d84667\") " pod="openstack/rabbitmq-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.532086 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d4dfff97-df7d-498f-9203-9c2cb0d84667-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"d4dfff97-df7d-498f-9203-9c2cb0d84667\") " pod="openstack/rabbitmq-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.532138 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b357137a-6e30-4ed9-a440-c9f3e90f75d8-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"b357137a-6e30-4ed9-a440-c9f3e90f75d8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.532477 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b357137a-6e30-4ed9-a440-c9f3e90f75d8-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"b357137a-6e30-4ed9-a440-c9f3e90f75d8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.533017 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d4dfff97-df7d-498f-9203-9c2cb0d84667-server-conf\") pod \"rabbitmq-server-0\" (UID: \"d4dfff97-df7d-498f-9203-9c2cb0d84667\") " pod="openstack/rabbitmq-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.533051 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d4dfff97-df7d-498f-9203-9c2cb0d84667-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: 
\"d4dfff97-df7d-498f-9203-9c2cb0d84667\") " pod="openstack/rabbitmq-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.533103 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d4dfff97-df7d-498f-9203-9c2cb0d84667-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"d4dfff97-df7d-498f-9203-9c2cb0d84667\") " pod="openstack/rabbitmq-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.533123 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b357137a-6e30-4ed9-a440-c9f3e90f75d8-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b357137a-6e30-4ed9-a440-c9f3e90f75d8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.533569 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wj6nw\" (UniqueName: \"kubernetes.io/projected/b357137a-6e30-4ed9-a440-c9f3e90f75d8-kube-api-access-wj6nw\") pod \"rabbitmq-cell1-server-0\" (UID: \"b357137a-6e30-4ed9-a440-c9f3e90f75d8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.533608 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b357137a-6e30-4ed9-a440-c9f3e90f75d8-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"b357137a-6e30-4ed9-a440-c9f3e90f75d8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.533645 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d4dfff97-df7d-498f-9203-9c2cb0d84667-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"d4dfff97-df7d-498f-9203-9c2cb0d84667\") " pod="openstack/rabbitmq-server-0" Jan 20 
18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.533663 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d4dfff97-df7d-498f-9203-9c2cb0d84667-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"d4dfff97-df7d-498f-9203-9c2cb0d84667\") " pod="openstack/rabbitmq-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.533680 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d4dfff97-df7d-498f-9203-9c2cb0d84667-config-data\") pod \"rabbitmq-server-0\" (UID: \"d4dfff97-df7d-498f-9203-9c2cb0d84667\") " pod="openstack/rabbitmq-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.533708 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b357137a-6e30-4ed9-a440-c9f3e90f75d8-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b357137a-6e30-4ed9-a440-c9f3e90f75d8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.533728 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b357137a-6e30-4ed9-a440-c9f3e90f75d8-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"b357137a-6e30-4ed9-a440-c9f3e90f75d8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.533750 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d4dfff97-df7d-498f-9203-9c2cb0d84667-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"d4dfff97-df7d-498f-9203-9c2cb0d84667\") " pod="openstack/rabbitmq-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.533765 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b357137a-6e30-4ed9-a440-c9f3e90f75d8-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"b357137a-6e30-4ed9-a440-c9f3e90f75d8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.533813 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"d4dfff97-df7d-498f-9203-9c2cb0d84667\") " pod="openstack/rabbitmq-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.533862 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ltrtl\" (UniqueName: \"kubernetes.io/projected/d4dfff97-df7d-498f-9203-9c2cb0d84667-kube-api-access-ltrtl\") pod \"rabbitmq-server-0\" (UID: \"d4dfff97-df7d-498f-9203-9c2cb0d84667\") " pod="openstack/rabbitmq-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.533908 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b357137a-6e30-4ed9-a440-c9f3e90f75d8-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"b357137a-6e30-4ed9-a440-c9f3e90f75d8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.534052 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b357137a-6e30-4ed9-a440-c9f3e90f75d8-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"b357137a-6e30-4ed9-a440-c9f3e90f75d8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.534072 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d4dfff97-df7d-498f-9203-9c2cb0d84667-pod-info\") pod \"rabbitmq-server-0\" (UID: 
\"d4dfff97-df7d-498f-9203-9c2cb0d84667\") " pod="openstack/rabbitmq-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.534089 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"b357137a-6e30-4ed9-a440-c9f3e90f75d8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.534351 4773 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"b357137a-6e30-4ed9-a440-c9f3e90f75d8\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.534602 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d4dfff97-df7d-498f-9203-9c2cb0d84667-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"d4dfff97-df7d-498f-9203-9c2cb0d84667\") " pod="openstack/rabbitmq-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.534780 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d4dfff97-df7d-498f-9203-9c2cb0d84667-server-conf\") pod \"rabbitmq-server-0\" (UID: \"d4dfff97-df7d-498f-9203-9c2cb0d84667\") " pod="openstack/rabbitmq-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.535638 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b357137a-6e30-4ed9-a440-c9f3e90f75d8-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b357137a-6e30-4ed9-a440-c9f3e90f75d8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.535766 4773 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d4dfff97-df7d-498f-9203-9c2cb0d84667-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"d4dfff97-df7d-498f-9203-9c2cb0d84667\") " pod="openstack/rabbitmq-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.536217 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b357137a-6e30-4ed9-a440-c9f3e90f75d8-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"b357137a-6e30-4ed9-a440-c9f3e90f75d8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.536534 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d4dfff97-df7d-498f-9203-9c2cb0d84667-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"d4dfff97-df7d-498f-9203-9c2cb0d84667\") " pod="openstack/rabbitmq-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.536841 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b357137a-6e30-4ed9-a440-c9f3e90f75d8-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"b357137a-6e30-4ed9-a440-c9f3e90f75d8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.537022 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d4dfff97-df7d-498f-9203-9c2cb0d84667-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"d4dfff97-df7d-498f-9203-9c2cb0d84667\") " pod="openstack/rabbitmq-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.537290 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/b357137a-6e30-4ed9-a440-c9f3e90f75d8-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"b357137a-6e30-4ed9-a440-c9f3e90f75d8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.537302 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d4dfff97-df7d-498f-9203-9c2cb0d84667-config-data\") pod \"rabbitmq-server-0\" (UID: \"d4dfff97-df7d-498f-9203-9c2cb0d84667\") " pod="openstack/rabbitmq-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.537367 4773 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"d4dfff97-df7d-498f-9203-9c2cb0d84667\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.537724 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b357137a-6e30-4ed9-a440-c9f3e90f75d8-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"b357137a-6e30-4ed9-a440-c9f3e90f75d8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.543048 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b357137a-6e30-4ed9-a440-c9f3e90f75d8-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"b357137a-6e30-4ed9-a440-c9f3e90f75d8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.543583 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b357137a-6e30-4ed9-a440-c9f3e90f75d8-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"b357137a-6e30-4ed9-a440-c9f3e90f75d8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.551057 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b357137a-6e30-4ed9-a440-c9f3e90f75d8-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"b357137a-6e30-4ed9-a440-c9f3e90f75d8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.552210 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b357137a-6e30-4ed9-a440-c9f3e90f75d8-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"b357137a-6e30-4ed9-a440-c9f3e90f75d8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.552852 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d4dfff97-df7d-498f-9203-9c2cb0d84667-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"d4dfff97-df7d-498f-9203-9c2cb0d84667\") " pod="openstack/rabbitmq-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.556676 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d4dfff97-df7d-498f-9203-9c2cb0d84667-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"d4dfff97-df7d-498f-9203-9c2cb0d84667\") " pod="openstack/rabbitmq-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.558147 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d4dfff97-df7d-498f-9203-9c2cb0d84667-pod-info\") pod \"rabbitmq-server-0\" (UID: \"d4dfff97-df7d-498f-9203-9c2cb0d84667\") " pod="openstack/rabbitmq-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.566294 4773 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-ltrtl\" (UniqueName: \"kubernetes.io/projected/d4dfff97-df7d-498f-9203-9c2cb0d84667-kube-api-access-ltrtl\") pod \"rabbitmq-server-0\" (UID: \"d4dfff97-df7d-498f-9203-9c2cb0d84667\") " pod="openstack/rabbitmq-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.579656 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wj6nw\" (UniqueName: \"kubernetes.io/projected/b357137a-6e30-4ed9-a440-c9f3e90f75d8-kube-api-access-wj6nw\") pod \"rabbitmq-cell1-server-0\" (UID: \"b357137a-6e30-4ed9-a440-c9f3e90f75d8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.583135 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"b357137a-6e30-4ed9-a440-c9f3e90f75d8\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.593324 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"d4dfff97-df7d-498f-9203-9c2cb0d84667\") " pod="openstack/rabbitmq-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.614721 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.626073 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 20 18:47:12 crc kubenswrapper[4773]: I0120 18:47:12.988392 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 20 18:47:13 crc kubenswrapper[4773]: I0120 18:47:13.054560 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Jan 20 18:47:13 crc kubenswrapper[4773]: I0120 18:47:13.056466 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Jan 20 18:47:13 crc kubenswrapper[4773]: I0120 18:47:13.061890 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Jan 20 18:47:13 crc kubenswrapper[4773]: I0120 18:47:13.062923 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-lh9jh" Jan 20 18:47:13 crc kubenswrapper[4773]: I0120 18:47:13.065604 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Jan 20 18:47:13 crc kubenswrapper[4773]: I0120 18:47:13.065782 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Jan 20 18:47:13 crc kubenswrapper[4773]: I0120 18:47:13.070365 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Jan 20 18:47:13 crc kubenswrapper[4773]: I0120 18:47:13.072061 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Jan 20 18:47:13 crc kubenswrapper[4773]: I0120 18:47:13.144269 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11b243ca-6da3-4247-a1fe-2ea3e5be80cc-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"11b243ca-6da3-4247-a1fe-2ea3e5be80cc\") " pod="openstack/openstack-galera-0" Jan 20 18:47:13 crc kubenswrapper[4773]: I0120 18:47:13.144333 
4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11b243ca-6da3-4247-a1fe-2ea3e5be80cc-operator-scripts\") pod \"openstack-galera-0\" (UID: \"11b243ca-6da3-4247-a1fe-2ea3e5be80cc\") " pod="openstack/openstack-galera-0" Jan 20 18:47:13 crc kubenswrapper[4773]: I0120 18:47:13.144452 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfm7p\" (UniqueName: \"kubernetes.io/projected/11b243ca-6da3-4247-a1fe-2ea3e5be80cc-kube-api-access-hfm7p\") pod \"openstack-galera-0\" (UID: \"11b243ca-6da3-4247-a1fe-2ea3e5be80cc\") " pod="openstack/openstack-galera-0" Jan 20 18:47:13 crc kubenswrapper[4773]: I0120 18:47:13.144507 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"11b243ca-6da3-4247-a1fe-2ea3e5be80cc\") " pod="openstack/openstack-galera-0" Jan 20 18:47:13 crc kubenswrapper[4773]: I0120 18:47:13.144647 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/11b243ca-6da3-4247-a1fe-2ea3e5be80cc-config-data-default\") pod \"openstack-galera-0\" (UID: \"11b243ca-6da3-4247-a1fe-2ea3e5be80cc\") " pod="openstack/openstack-galera-0" Jan 20 18:47:13 crc kubenswrapper[4773]: I0120 18:47:13.144699 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/11b243ca-6da3-4247-a1fe-2ea3e5be80cc-config-data-generated\") pod \"openstack-galera-0\" (UID: \"11b243ca-6da3-4247-a1fe-2ea3e5be80cc\") " pod="openstack/openstack-galera-0" Jan 20 18:47:13 crc kubenswrapper[4773]: I0120 18:47:13.144721 4773 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/11b243ca-6da3-4247-a1fe-2ea3e5be80cc-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"11b243ca-6da3-4247-a1fe-2ea3e5be80cc\") " pod="openstack/openstack-galera-0" Jan 20 18:47:13 crc kubenswrapper[4773]: I0120 18:47:13.144782 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/11b243ca-6da3-4247-a1fe-2ea3e5be80cc-kolla-config\") pod \"openstack-galera-0\" (UID: \"11b243ca-6da3-4247-a1fe-2ea3e5be80cc\") " pod="openstack/openstack-galera-0" Jan 20 18:47:13 crc kubenswrapper[4773]: I0120 18:47:13.246218 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hfm7p\" (UniqueName: \"kubernetes.io/projected/11b243ca-6da3-4247-a1fe-2ea3e5be80cc-kube-api-access-hfm7p\") pod \"openstack-galera-0\" (UID: \"11b243ca-6da3-4247-a1fe-2ea3e5be80cc\") " pod="openstack/openstack-galera-0" Jan 20 18:47:13 crc kubenswrapper[4773]: I0120 18:47:13.246282 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"11b243ca-6da3-4247-a1fe-2ea3e5be80cc\") " pod="openstack/openstack-galera-0" Jan 20 18:47:13 crc kubenswrapper[4773]: I0120 18:47:13.246342 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/11b243ca-6da3-4247-a1fe-2ea3e5be80cc-config-data-default\") pod \"openstack-galera-0\" (UID: \"11b243ca-6da3-4247-a1fe-2ea3e5be80cc\") " pod="openstack/openstack-galera-0" Jan 20 18:47:13 crc kubenswrapper[4773]: I0120 18:47:13.246372 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/11b243ca-6da3-4247-a1fe-2ea3e5be80cc-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"11b243ca-6da3-4247-a1fe-2ea3e5be80cc\") " pod="openstack/openstack-galera-0" Jan 20 18:47:13 crc kubenswrapper[4773]: I0120 18:47:13.246395 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/11b243ca-6da3-4247-a1fe-2ea3e5be80cc-config-data-generated\") pod \"openstack-galera-0\" (UID: \"11b243ca-6da3-4247-a1fe-2ea3e5be80cc\") " pod="openstack/openstack-galera-0" Jan 20 18:47:13 crc kubenswrapper[4773]: I0120 18:47:13.246421 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/11b243ca-6da3-4247-a1fe-2ea3e5be80cc-kolla-config\") pod \"openstack-galera-0\" (UID: \"11b243ca-6da3-4247-a1fe-2ea3e5be80cc\") " pod="openstack/openstack-galera-0" Jan 20 18:47:13 crc kubenswrapper[4773]: I0120 18:47:13.246453 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11b243ca-6da3-4247-a1fe-2ea3e5be80cc-operator-scripts\") pod \"openstack-galera-0\" (UID: \"11b243ca-6da3-4247-a1fe-2ea3e5be80cc\") " pod="openstack/openstack-galera-0" Jan 20 18:47:13 crc kubenswrapper[4773]: I0120 18:47:13.248189 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11b243ca-6da3-4247-a1fe-2ea3e5be80cc-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"11b243ca-6da3-4247-a1fe-2ea3e5be80cc\") " pod="openstack/openstack-galera-0" Jan 20 18:47:13 crc kubenswrapper[4773]: I0120 18:47:13.248204 4773 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"11b243ca-6da3-4247-a1fe-2ea3e5be80cc\") device 
mount path \"/mnt/openstack/pv03\"" pod="openstack/openstack-galera-0" Jan 20 18:47:13 crc kubenswrapper[4773]: I0120 18:47:13.249893 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/11b243ca-6da3-4247-a1fe-2ea3e5be80cc-operator-scripts\") pod \"openstack-galera-0\" (UID: \"11b243ca-6da3-4247-a1fe-2ea3e5be80cc\") " pod="openstack/openstack-galera-0" Jan 20 18:47:13 crc kubenswrapper[4773]: I0120 18:47:13.250198 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/11b243ca-6da3-4247-a1fe-2ea3e5be80cc-config-data-generated\") pod \"openstack-galera-0\" (UID: \"11b243ca-6da3-4247-a1fe-2ea3e5be80cc\") " pod="openstack/openstack-galera-0" Jan 20 18:47:13 crc kubenswrapper[4773]: I0120 18:47:13.250918 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/11b243ca-6da3-4247-a1fe-2ea3e5be80cc-config-data-default\") pod \"openstack-galera-0\" (UID: \"11b243ca-6da3-4247-a1fe-2ea3e5be80cc\") " pod="openstack/openstack-galera-0" Jan 20 18:47:13 crc kubenswrapper[4773]: I0120 18:47:13.252108 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/11b243ca-6da3-4247-a1fe-2ea3e5be80cc-kolla-config\") pod \"openstack-galera-0\" (UID: \"11b243ca-6da3-4247-a1fe-2ea3e5be80cc\") " pod="openstack/openstack-galera-0" Jan 20 18:47:13 crc kubenswrapper[4773]: I0120 18:47:13.257302 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/11b243ca-6da3-4247-a1fe-2ea3e5be80cc-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"11b243ca-6da3-4247-a1fe-2ea3e5be80cc\") " pod="openstack/openstack-galera-0" Jan 20 18:47:13 crc kubenswrapper[4773]: I0120 18:47:13.258361 4773 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/11b243ca-6da3-4247-a1fe-2ea3e5be80cc-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"11b243ca-6da3-4247-a1fe-2ea3e5be80cc\") " pod="openstack/openstack-galera-0" Jan 20 18:47:13 crc kubenswrapper[4773]: I0120 18:47:13.272132 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfm7p\" (UniqueName: \"kubernetes.io/projected/11b243ca-6da3-4247-a1fe-2ea3e5be80cc-kube-api-access-hfm7p\") pod \"openstack-galera-0\" (UID: \"11b243ca-6da3-4247-a1fe-2ea3e5be80cc\") " pod="openstack/openstack-galera-0" Jan 20 18:47:13 crc kubenswrapper[4773]: I0120 18:47:13.287346 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"openstack-galera-0\" (UID: \"11b243ca-6da3-4247-a1fe-2ea3e5be80cc\") " pod="openstack/openstack-galera-0" Jan 20 18:47:13 crc kubenswrapper[4773]: I0120 18:47:13.334412 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 20 18:47:13 crc kubenswrapper[4773]: I0120 18:47:13.376385 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-galera-0" Jan 20 18:47:13 crc kubenswrapper[4773]: I0120 18:47:13.709716 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b357137a-6e30-4ed9-a440-c9f3e90f75d8","Type":"ContainerStarted","Data":"81b5a2b92f1105f5c420453ca19111fe1ca35ac9507a3ac978f1c848d16b5b05"} Jan 20 18:47:13 crc kubenswrapper[4773]: I0120 18:47:13.717863 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d4dfff97-df7d-498f-9203-9c2cb0d84667","Type":"ContainerStarted","Data":"9437201a24daa22de36ef5e4cb32d33d9216523028488aa287392d8e49c9e78c"} Jan 20 18:47:14 crc kubenswrapper[4773]: I0120 18:47:14.004655 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Jan 20 18:47:14 crc kubenswrapper[4773]: W0120 18:47:14.098350 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod11b243ca_6da3_4247_a1fe_2ea3e5be80cc.slice/crio-e2012ff372b306b08a25075cd531fe994c7484e13e317fb42d2ebdfa8d1f414f WatchSource:0}: Error finding container e2012ff372b306b08a25075cd531fe994c7484e13e317fb42d2ebdfa8d1f414f: Status 404 returned error can't find the container with id e2012ff372b306b08a25075cd531fe994c7484e13e317fb42d2ebdfa8d1f414f Jan 20 18:47:14 crc kubenswrapper[4773]: I0120 18:47:14.533171 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Jan 20 18:47:14 crc kubenswrapper[4773]: I0120 18:47:14.534895 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Jan 20 18:47:14 crc kubenswrapper[4773]: I0120 18:47:14.536977 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-88msx" Jan 20 18:47:14 crc kubenswrapper[4773]: I0120 18:47:14.537741 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Jan 20 18:47:14 crc kubenswrapper[4773]: I0120 18:47:14.537968 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Jan 20 18:47:14 crc kubenswrapper[4773]: I0120 18:47:14.540988 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Jan 20 18:47:14 crc kubenswrapper[4773]: I0120 18:47:14.558183 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Jan 20 18:47:14 crc kubenswrapper[4773]: I0120 18:47:14.683698 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/bfe9133c-0d58-4877-97ee-5b0abeee1a95-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"bfe9133c-0d58-4877-97ee-5b0abeee1a95\") " pod="openstack/openstack-cell1-galera-0" Jan 20 18:47:14 crc kubenswrapper[4773]: I0120 18:47:14.683773 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/bfe9133c-0d58-4877-97ee-5b0abeee1a95-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"bfe9133c-0d58-4877-97ee-5b0abeee1a95\") " pod="openstack/openstack-cell1-galera-0" Jan 20 18:47:14 crc kubenswrapper[4773]: I0120 18:47:14.683806 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/bfe9133c-0d58-4877-97ee-5b0abeee1a95-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"bfe9133c-0d58-4877-97ee-5b0abeee1a95\") " pod="openstack/openstack-cell1-galera-0" Jan 20 18:47:14 crc kubenswrapper[4773]: I0120 18:47:14.683858 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/bfe9133c-0d58-4877-97ee-5b0abeee1a95-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"bfe9133c-0d58-4877-97ee-5b0abeee1a95\") " pod="openstack/openstack-cell1-galera-0" Jan 20 18:47:14 crc kubenswrapper[4773]: I0120 18:47:14.683910 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfe9133c-0d58-4877-97ee-5b0abeee1a95-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"bfe9133c-0d58-4877-97ee-5b0abeee1a95\") " pod="openstack/openstack-cell1-galera-0" Jan 20 18:47:14 crc kubenswrapper[4773]: I0120 18:47:14.684030 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"bfe9133c-0d58-4877-97ee-5b0abeee1a95\") " pod="openstack/openstack-cell1-galera-0" Jan 20 18:47:14 crc kubenswrapper[4773]: I0120 18:47:14.684055 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bqnc\" (UniqueName: \"kubernetes.io/projected/bfe9133c-0d58-4877-97ee-5b0abeee1a95-kube-api-access-6bqnc\") pod \"openstack-cell1-galera-0\" (UID: \"bfe9133c-0d58-4877-97ee-5b0abeee1a95\") " pod="openstack/openstack-cell1-galera-0" Jan 20 18:47:14 crc kubenswrapper[4773]: I0120 18:47:14.684081 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/bfe9133c-0d58-4877-97ee-5b0abeee1a95-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"bfe9133c-0d58-4877-97ee-5b0abeee1a95\") " pod="openstack/openstack-cell1-galera-0" Jan 20 18:47:14 crc kubenswrapper[4773]: I0120 18:47:14.748534 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"11b243ca-6da3-4247-a1fe-2ea3e5be80cc","Type":"ContainerStarted","Data":"e2012ff372b306b08a25075cd531fe994c7484e13e317fb42d2ebdfa8d1f414f"} Jan 20 18:47:14 crc kubenswrapper[4773]: I0120 18:47:14.785535 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/bfe9133c-0d58-4877-97ee-5b0abeee1a95-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"bfe9133c-0d58-4877-97ee-5b0abeee1a95\") " pod="openstack/openstack-cell1-galera-0" Jan 20 18:47:14 crc kubenswrapper[4773]: I0120 18:47:14.785585 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/bfe9133c-0d58-4877-97ee-5b0abeee1a95-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"bfe9133c-0d58-4877-97ee-5b0abeee1a95\") " pod="openstack/openstack-cell1-galera-0" Jan 20 18:47:14 crc kubenswrapper[4773]: I0120 18:47:14.786678 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/bfe9133c-0d58-4877-97ee-5b0abeee1a95-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"bfe9133c-0d58-4877-97ee-5b0abeee1a95\") " pod="openstack/openstack-cell1-galera-0" Jan 20 18:47:14 crc kubenswrapper[4773]: I0120 18:47:14.786809 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/bfe9133c-0d58-4877-97ee-5b0abeee1a95-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"bfe9133c-0d58-4877-97ee-5b0abeee1a95\") 
" pod="openstack/openstack-cell1-galera-0" Jan 20 18:47:14 crc kubenswrapper[4773]: I0120 18:47:14.788002 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bfe9133c-0d58-4877-97ee-5b0abeee1a95-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"bfe9133c-0d58-4877-97ee-5b0abeee1a95\") " pod="openstack/openstack-cell1-galera-0" Jan 20 18:47:14 crc kubenswrapper[4773]: I0120 18:47:14.788089 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/bfe9133c-0d58-4877-97ee-5b0abeee1a95-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"bfe9133c-0d58-4877-97ee-5b0abeee1a95\") " pod="openstack/openstack-cell1-galera-0" Jan 20 18:47:14 crc kubenswrapper[4773]: I0120 18:47:14.788131 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfe9133c-0d58-4877-97ee-5b0abeee1a95-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"bfe9133c-0d58-4877-97ee-5b0abeee1a95\") " pod="openstack/openstack-cell1-galera-0" Jan 20 18:47:14 crc kubenswrapper[4773]: I0120 18:47:14.788154 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"bfe9133c-0d58-4877-97ee-5b0abeee1a95\") " pod="openstack/openstack-cell1-galera-0" Jan 20 18:47:14 crc kubenswrapper[4773]: I0120 18:47:14.788183 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bqnc\" (UniqueName: \"kubernetes.io/projected/bfe9133c-0d58-4877-97ee-5b0abeee1a95-kube-api-access-6bqnc\") pod \"openstack-cell1-galera-0\" (UID: \"bfe9133c-0d58-4877-97ee-5b0abeee1a95\") " pod="openstack/openstack-cell1-galera-0" Jan 20 18:47:14 crc 
kubenswrapper[4773]: I0120 18:47:14.788212 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/bfe9133c-0d58-4877-97ee-5b0abeee1a95-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"bfe9133c-0d58-4877-97ee-5b0abeee1a95\") " pod="openstack/openstack-cell1-galera-0" Jan 20 18:47:14 crc kubenswrapper[4773]: I0120 18:47:14.788396 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/bfe9133c-0d58-4877-97ee-5b0abeee1a95-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"bfe9133c-0d58-4877-97ee-5b0abeee1a95\") " pod="openstack/openstack-cell1-galera-0" Jan 20 18:47:14 crc kubenswrapper[4773]: I0120 18:47:14.788549 4773 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"bfe9133c-0d58-4877-97ee-5b0abeee1a95\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/openstack-cell1-galera-0" Jan 20 18:47:14 crc kubenswrapper[4773]: I0120 18:47:14.789296 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bfe9133c-0d58-4877-97ee-5b0abeee1a95-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"bfe9133c-0d58-4877-97ee-5b0abeee1a95\") " pod="openstack/openstack-cell1-galera-0" Jan 20 18:47:14 crc kubenswrapper[4773]: I0120 18:47:14.797830 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/bfe9133c-0d58-4877-97ee-5b0abeee1a95-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"bfe9133c-0d58-4877-97ee-5b0abeee1a95\") " pod="openstack/openstack-cell1-galera-0" Jan 20 18:47:14 crc kubenswrapper[4773]: I0120 18:47:14.805306 4773 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bfe9133c-0d58-4877-97ee-5b0abeee1a95-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"bfe9133c-0d58-4877-97ee-5b0abeee1a95\") " pod="openstack/openstack-cell1-galera-0" Jan 20 18:47:14 crc kubenswrapper[4773]: I0120 18:47:14.808863 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bqnc\" (UniqueName: \"kubernetes.io/projected/bfe9133c-0d58-4877-97ee-5b0abeee1a95-kube-api-access-6bqnc\") pod \"openstack-cell1-galera-0\" (UID: \"bfe9133c-0d58-4877-97ee-5b0abeee1a95\") " pod="openstack/openstack-cell1-galera-0" Jan 20 18:47:14 crc kubenswrapper[4773]: I0120 18:47:14.823757 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"openstack-cell1-galera-0\" (UID: \"bfe9133c-0d58-4877-97ee-5b0abeee1a95\") " pod="openstack/openstack-cell1-galera-0" Jan 20 18:47:14 crc kubenswrapper[4773]: I0120 18:47:14.867496 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Jan 20 18:47:14 crc kubenswrapper[4773]: I0120 18:47:14.953396 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Jan 20 18:47:14 crc kubenswrapper[4773]: I0120 18:47:14.954682 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Jan 20 18:47:14 crc kubenswrapper[4773]: I0120 18:47:14.956593 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-dzq72" Jan 20 18:47:14 crc kubenswrapper[4773]: I0120 18:47:14.957360 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Jan 20 18:47:14 crc kubenswrapper[4773]: I0120 18:47:14.957502 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Jan 20 18:47:14 crc kubenswrapper[4773]: I0120 18:47:14.969672 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Jan 20 18:47:15 crc kubenswrapper[4773]: I0120 18:47:15.100170 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb8cda87-65c5-4be7-9891-b82bcfc8e0d4-combined-ca-bundle\") pod \"memcached-0\" (UID: \"cb8cda87-65c5-4be7-9891-b82bcfc8e0d4\") " pod="openstack/memcached-0" Jan 20 18:47:15 crc kubenswrapper[4773]: I0120 18:47:15.100214 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cb8cda87-65c5-4be7-9891-b82bcfc8e0d4-config-data\") pod \"memcached-0\" (UID: \"cb8cda87-65c5-4be7-9891-b82bcfc8e0d4\") " pod="openstack/memcached-0" Jan 20 18:47:15 crc kubenswrapper[4773]: I0120 18:47:15.100287 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bb8mf\" (UniqueName: \"kubernetes.io/projected/cb8cda87-65c5-4be7-9891-b82bcfc8e0d4-kube-api-access-bb8mf\") pod \"memcached-0\" (UID: \"cb8cda87-65c5-4be7-9891-b82bcfc8e0d4\") " pod="openstack/memcached-0" Jan 20 18:47:15 crc kubenswrapper[4773]: I0120 18:47:15.100311 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb8cda87-65c5-4be7-9891-b82bcfc8e0d4-memcached-tls-certs\") pod \"memcached-0\" (UID: \"cb8cda87-65c5-4be7-9891-b82bcfc8e0d4\") " pod="openstack/memcached-0" Jan 20 18:47:15 crc kubenswrapper[4773]: I0120 18:47:15.100330 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/cb8cda87-65c5-4be7-9891-b82bcfc8e0d4-kolla-config\") pod \"memcached-0\" (UID: \"cb8cda87-65c5-4be7-9891-b82bcfc8e0d4\") " pod="openstack/memcached-0" Jan 20 18:47:15 crc kubenswrapper[4773]: I0120 18:47:15.202742 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb8cda87-65c5-4be7-9891-b82bcfc8e0d4-combined-ca-bundle\") pod \"memcached-0\" (UID: \"cb8cda87-65c5-4be7-9891-b82bcfc8e0d4\") " pod="openstack/memcached-0" Jan 20 18:47:15 crc kubenswrapper[4773]: I0120 18:47:15.202778 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cb8cda87-65c5-4be7-9891-b82bcfc8e0d4-config-data\") pod \"memcached-0\" (UID: \"cb8cda87-65c5-4be7-9891-b82bcfc8e0d4\") " pod="openstack/memcached-0" Jan 20 18:47:15 crc kubenswrapper[4773]: I0120 18:47:15.202830 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bb8mf\" (UniqueName: \"kubernetes.io/projected/cb8cda87-65c5-4be7-9891-b82bcfc8e0d4-kube-api-access-bb8mf\") pod \"memcached-0\" (UID: \"cb8cda87-65c5-4be7-9891-b82bcfc8e0d4\") " pod="openstack/memcached-0" Jan 20 18:47:15 crc kubenswrapper[4773]: I0120 18:47:15.202854 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb8cda87-65c5-4be7-9891-b82bcfc8e0d4-memcached-tls-certs\") pod \"memcached-0\" (UID: 
\"cb8cda87-65c5-4be7-9891-b82bcfc8e0d4\") " pod="openstack/memcached-0" Jan 20 18:47:15 crc kubenswrapper[4773]: I0120 18:47:15.202873 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/cb8cda87-65c5-4be7-9891-b82bcfc8e0d4-kolla-config\") pod \"memcached-0\" (UID: \"cb8cda87-65c5-4be7-9891-b82bcfc8e0d4\") " pod="openstack/memcached-0" Jan 20 18:47:15 crc kubenswrapper[4773]: I0120 18:47:15.204154 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/cb8cda87-65c5-4be7-9891-b82bcfc8e0d4-kolla-config\") pod \"memcached-0\" (UID: \"cb8cda87-65c5-4be7-9891-b82bcfc8e0d4\") " pod="openstack/memcached-0" Jan 20 18:47:15 crc kubenswrapper[4773]: I0120 18:47:15.204706 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cb8cda87-65c5-4be7-9891-b82bcfc8e0d4-config-data\") pod \"memcached-0\" (UID: \"cb8cda87-65c5-4be7-9891-b82bcfc8e0d4\") " pod="openstack/memcached-0" Jan 20 18:47:15 crc kubenswrapper[4773]: I0120 18:47:15.209723 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/cb8cda87-65c5-4be7-9891-b82bcfc8e0d4-memcached-tls-certs\") pod \"memcached-0\" (UID: \"cb8cda87-65c5-4be7-9891-b82bcfc8e0d4\") " pod="openstack/memcached-0" Jan 20 18:47:15 crc kubenswrapper[4773]: I0120 18:47:15.210498 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cb8cda87-65c5-4be7-9891-b82bcfc8e0d4-combined-ca-bundle\") pod \"memcached-0\" (UID: \"cb8cda87-65c5-4be7-9891-b82bcfc8e0d4\") " pod="openstack/memcached-0" Jan 20 18:47:15 crc kubenswrapper[4773]: I0120 18:47:15.228886 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bb8mf\" (UniqueName: 
\"kubernetes.io/projected/cb8cda87-65c5-4be7-9891-b82bcfc8e0d4-kube-api-access-bb8mf\") pod \"memcached-0\" (UID: \"cb8cda87-65c5-4be7-9891-b82bcfc8e0d4\") " pod="openstack/memcached-0" Jan 20 18:47:15 crc kubenswrapper[4773]: I0120 18:47:15.300752 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Jan 20 18:47:15 crc kubenswrapper[4773]: I0120 18:47:15.571748 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Jan 20 18:47:15 crc kubenswrapper[4773]: I0120 18:47:15.605391 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Jan 20 18:47:15 crc kubenswrapper[4773]: I0120 18:47:15.766586 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"bfe9133c-0d58-4877-97ee-5b0abeee1a95","Type":"ContainerStarted","Data":"0ee5fe3b9e18aaafc48facf4791c76cfe84529f11deecc1670a22baf6625aa11"} Jan 20 18:47:15 crc kubenswrapper[4773]: I0120 18:47:15.769227 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"cb8cda87-65c5-4be7-9891-b82bcfc8e0d4","Type":"ContainerStarted","Data":"ad3a696e356892a03596bf88ca7536ddf3794e1fc34c0c01572bf4ffbb7eeda6"} Jan 20 18:47:16 crc kubenswrapper[4773]: I0120 18:47:16.618247 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Jan 20 18:47:16 crc kubenswrapper[4773]: I0120 18:47:16.619153 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 20 18:47:16 crc kubenswrapper[4773]: I0120 18:47:16.621948 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-zrhgh" Jan 20 18:47:16 crc kubenswrapper[4773]: I0120 18:47:16.639192 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 20 18:47:16 crc kubenswrapper[4773]: I0120 18:47:16.741483 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jjqgh\" (UniqueName: \"kubernetes.io/projected/d3923c4a-e72e-4b8e-bdd8-c8f2b7c443ae-kube-api-access-jjqgh\") pod \"kube-state-metrics-0\" (UID: \"d3923c4a-e72e-4b8e-bdd8-c8f2b7c443ae\") " pod="openstack/kube-state-metrics-0" Jan 20 18:47:16 crc kubenswrapper[4773]: I0120 18:47:16.842534 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jjqgh\" (UniqueName: \"kubernetes.io/projected/d3923c4a-e72e-4b8e-bdd8-c8f2b7c443ae-kube-api-access-jjqgh\") pod \"kube-state-metrics-0\" (UID: \"d3923c4a-e72e-4b8e-bdd8-c8f2b7c443ae\") " pod="openstack/kube-state-metrics-0" Jan 20 18:47:16 crc kubenswrapper[4773]: I0120 18:47:16.867550 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jjqgh\" (UniqueName: \"kubernetes.io/projected/d3923c4a-e72e-4b8e-bdd8-c8f2b7c443ae-kube-api-access-jjqgh\") pod \"kube-state-metrics-0\" (UID: \"d3923c4a-e72e-4b8e-bdd8-c8f2b7c443ae\") " pod="openstack/kube-state-metrics-0" Jan 20 18:47:16 crc kubenswrapper[4773]: I0120 18:47:16.937042 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 20 18:47:17 crc kubenswrapper[4773]: I0120 18:47:17.537197 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 20 18:47:17 crc kubenswrapper[4773]: I0120 18:47:17.794860 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"d3923c4a-e72e-4b8e-bdd8-c8f2b7c443ae","Type":"ContainerStarted","Data":"381de174237c80b24a95594eb30259e92e84f6fa102ffa5688eefcf07e0ea711"} Jan 20 18:47:20 crc kubenswrapper[4773]: I0120 18:47:20.066195 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Jan 20 18:47:20 crc kubenswrapper[4773]: I0120 18:47:20.068614 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Jan 20 18:47:20 crc kubenswrapper[4773]: I0120 18:47:20.070910 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Jan 20 18:47:20 crc kubenswrapper[4773]: I0120 18:47:20.071348 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Jan 20 18:47:20 crc kubenswrapper[4773]: I0120 18:47:20.071587 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Jan 20 18:47:20 crc kubenswrapper[4773]: I0120 18:47:20.071917 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Jan 20 18:47:20 crc kubenswrapper[4773]: I0120 18:47:20.074942 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Jan 20 18:47:20 crc kubenswrapper[4773]: I0120 18:47:20.076218 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-7pgtj" Jan 20 18:47:20 crc kubenswrapper[4773]: I0120 18:47:20.251595 4773 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5818e5c4-9a2c-453f-b158-f4be5ec40619-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"5818e5c4-9a2c-453f-b158-f4be5ec40619\") " pod="openstack/ovsdbserver-nb-0" Jan 20 18:47:20 crc kubenswrapper[4773]: I0120 18:47:20.252068 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5818e5c4-9a2c-453f-b158-f4be5ec40619-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"5818e5c4-9a2c-453f-b158-f4be5ec40619\") " pod="openstack/ovsdbserver-nb-0" Jan 20 18:47:20 crc kubenswrapper[4773]: I0120 18:47:20.252180 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfllw\" (UniqueName: \"kubernetes.io/projected/5818e5c4-9a2c-453f-b158-f4be5ec40619-kube-api-access-hfllw\") pod \"ovsdbserver-nb-0\" (UID: \"5818e5c4-9a2c-453f-b158-f4be5ec40619\") " pod="openstack/ovsdbserver-nb-0" Jan 20 18:47:20 crc kubenswrapper[4773]: I0120 18:47:20.252242 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5818e5c4-9a2c-453f-b158-f4be5ec40619-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"5818e5c4-9a2c-453f-b158-f4be5ec40619\") " pod="openstack/ovsdbserver-nb-0" Jan 20 18:47:20 crc kubenswrapper[4773]: I0120 18:47:20.252266 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-nb-0\" (UID: \"5818e5c4-9a2c-453f-b158-f4be5ec40619\") " pod="openstack/ovsdbserver-nb-0" Jan 20 18:47:20 crc kubenswrapper[4773]: I0120 18:47:20.252288 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/5818e5c4-9a2c-453f-b158-f4be5ec40619-config\") pod \"ovsdbserver-nb-0\" (UID: \"5818e5c4-9a2c-453f-b158-f4be5ec40619\") " pod="openstack/ovsdbserver-nb-0" Jan 20 18:47:20 crc kubenswrapper[4773]: I0120 18:47:20.252334 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5818e5c4-9a2c-453f-b158-f4be5ec40619-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"5818e5c4-9a2c-453f-b158-f4be5ec40619\") " pod="openstack/ovsdbserver-nb-0" Jan 20 18:47:20 crc kubenswrapper[4773]: I0120 18:47:20.252375 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5818e5c4-9a2c-453f-b158-f4be5ec40619-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"5818e5c4-9a2c-453f-b158-f4be5ec40619\") " pod="openstack/ovsdbserver-nb-0" Jan 20 18:47:20 crc kubenswrapper[4773]: I0120 18:47:20.353967 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5818e5c4-9a2c-453f-b158-f4be5ec40619-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"5818e5c4-9a2c-453f-b158-f4be5ec40619\") " pod="openstack/ovsdbserver-nb-0" Jan 20 18:47:20 crc kubenswrapper[4773]: I0120 18:47:20.354123 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5818e5c4-9a2c-453f-b158-f4be5ec40619-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"5818e5c4-9a2c-453f-b158-f4be5ec40619\") " pod="openstack/ovsdbserver-nb-0" Jan 20 18:47:20 crc kubenswrapper[4773]: I0120 18:47:20.354214 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5818e5c4-9a2c-453f-b158-f4be5ec40619-scripts\") pod \"ovsdbserver-nb-0\" (UID: 
\"5818e5c4-9a2c-453f-b158-f4be5ec40619\") " pod="openstack/ovsdbserver-nb-0" Jan 20 18:47:20 crc kubenswrapper[4773]: I0120 18:47:20.354270 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5818e5c4-9a2c-453f-b158-f4be5ec40619-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"5818e5c4-9a2c-453f-b158-f4be5ec40619\") " pod="openstack/ovsdbserver-nb-0" Jan 20 18:47:20 crc kubenswrapper[4773]: I0120 18:47:20.354332 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hfllw\" (UniqueName: \"kubernetes.io/projected/5818e5c4-9a2c-453f-b158-f4be5ec40619-kube-api-access-hfllw\") pod \"ovsdbserver-nb-0\" (UID: \"5818e5c4-9a2c-453f-b158-f4be5ec40619\") " pod="openstack/ovsdbserver-nb-0" Jan 20 18:47:20 crc kubenswrapper[4773]: I0120 18:47:20.354422 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5818e5c4-9a2c-453f-b158-f4be5ec40619-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"5818e5c4-9a2c-453f-b158-f4be5ec40619\") " pod="openstack/ovsdbserver-nb-0" Jan 20 18:47:20 crc kubenswrapper[4773]: I0120 18:47:20.354447 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-nb-0\" (UID: \"5818e5c4-9a2c-453f-b158-f4be5ec40619\") " pod="openstack/ovsdbserver-nb-0" Jan 20 18:47:20 crc kubenswrapper[4773]: I0120 18:47:20.354471 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5818e5c4-9a2c-453f-b158-f4be5ec40619-config\") pod \"ovsdbserver-nb-0\" (UID: \"5818e5c4-9a2c-453f-b158-f4be5ec40619\") " pod="openstack/ovsdbserver-nb-0" Jan 20 18:47:20 crc kubenswrapper[4773]: I0120 18:47:20.354647 4773 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/5818e5c4-9a2c-453f-b158-f4be5ec40619-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"5818e5c4-9a2c-453f-b158-f4be5ec40619\") " pod="openstack/ovsdbserver-nb-0" Jan 20 18:47:20 crc kubenswrapper[4773]: I0120 18:47:20.355387 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5818e5c4-9a2c-453f-b158-f4be5ec40619-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"5818e5c4-9a2c-453f-b158-f4be5ec40619\") " pod="openstack/ovsdbserver-nb-0" Jan 20 18:47:20 crc kubenswrapper[4773]: I0120 18:47:20.355793 4773 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-nb-0\" (UID: \"5818e5c4-9a2c-453f-b158-f4be5ec40619\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/ovsdbserver-nb-0" Jan 20 18:47:20 crc kubenswrapper[4773]: I0120 18:47:20.355980 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5818e5c4-9a2c-453f-b158-f4be5ec40619-config\") pod \"ovsdbserver-nb-0\" (UID: \"5818e5c4-9a2c-453f-b158-f4be5ec40619\") " pod="openstack/ovsdbserver-nb-0" Jan 20 18:47:20 crc kubenswrapper[4773]: I0120 18:47:20.362158 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/5818e5c4-9a2c-453f-b158-f4be5ec40619-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"5818e5c4-9a2c-453f-b158-f4be5ec40619\") " pod="openstack/ovsdbserver-nb-0" Jan 20 18:47:20 crc kubenswrapper[4773]: I0120 18:47:20.363247 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/5818e5c4-9a2c-453f-b158-f4be5ec40619-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: 
\"5818e5c4-9a2c-453f-b158-f4be5ec40619\") " pod="openstack/ovsdbserver-nb-0" Jan 20 18:47:20 crc kubenswrapper[4773]: I0120 18:47:20.363648 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5818e5c4-9a2c-453f-b158-f4be5ec40619-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"5818e5c4-9a2c-453f-b158-f4be5ec40619\") " pod="openstack/ovsdbserver-nb-0" Jan 20 18:47:20 crc kubenswrapper[4773]: I0120 18:47:20.371571 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfllw\" (UniqueName: \"kubernetes.io/projected/5818e5c4-9a2c-453f-b158-f4be5ec40619-kube-api-access-hfllw\") pod \"ovsdbserver-nb-0\" (UID: \"5818e5c4-9a2c-453f-b158-f4be5ec40619\") " pod="openstack/ovsdbserver-nb-0" Jan 20 18:47:20 crc kubenswrapper[4773]: I0120 18:47:20.374541 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"ovsdbserver-nb-0\" (UID: \"5818e5c4-9a2c-453f-b158-f4be5ec40619\") " pod="openstack/ovsdbserver-nb-0" Jan 20 18:47:20 crc kubenswrapper[4773]: I0120 18:47:20.438055 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Jan 20 18:47:20 crc kubenswrapper[4773]: I0120 18:47:20.826326 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-t5h8j"] Jan 20 18:47:20 crc kubenswrapper[4773]: I0120 18:47:20.829484 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-t5h8j" Jan 20 18:47:20 crc kubenswrapper[4773]: I0120 18:47:20.831922 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Jan 20 18:47:20 crc kubenswrapper[4773]: I0120 18:47:20.832647 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Jan 20 18:47:20 crc kubenswrapper[4773]: I0120 18:47:20.836173 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-55crm" Jan 20 18:47:20 crc kubenswrapper[4773]: I0120 18:47:20.858865 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-5gcvm"] Jan 20 18:47:20 crc kubenswrapper[4773]: I0120 18:47:20.860449 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-5gcvm" Jan 20 18:47:20 crc kubenswrapper[4773]: I0120 18:47:20.867234 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-t5h8j"] Jan 20 18:47:20 crc kubenswrapper[4773]: I0120 18:47:20.872110 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-5gcvm"] Jan 20 18:47:20 crc kubenswrapper[4773]: I0120 18:47:20.968171 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/bada64ed-c7da-4bd9-9195-75bbdcdd0406-var-run\") pod \"ovn-controller-ovs-5gcvm\" (UID: \"bada64ed-c7da-4bd9-9195-75bbdcdd0406\") " pod="openstack/ovn-controller-ovs-5gcvm" Jan 20 18:47:20 crc kubenswrapper[4773]: I0120 18:47:20.968221 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7724g\" (UniqueName: \"kubernetes.io/projected/bada64ed-c7da-4bd9-9195-75bbdcdd0406-kube-api-access-7724g\") pod \"ovn-controller-ovs-5gcvm\" (UID: \"bada64ed-c7da-4bd9-9195-75bbdcdd0406\") 
" pod="openstack/ovn-controller-ovs-5gcvm" Jan 20 18:47:20 crc kubenswrapper[4773]: I0120 18:47:20.968249 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/2fce4eb9-f614-4050-a099-0a743695dcd9-ovn-controller-tls-certs\") pod \"ovn-controller-t5h8j\" (UID: \"2fce4eb9-f614-4050-a099-0a743695dcd9\") " pod="openstack/ovn-controller-t5h8j" Jan 20 18:47:20 crc kubenswrapper[4773]: I0120 18:47:20.968270 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/2fce4eb9-f614-4050-a099-0a743695dcd9-var-log-ovn\") pod \"ovn-controller-t5h8j\" (UID: \"2fce4eb9-f614-4050-a099-0a743695dcd9\") " pod="openstack/ovn-controller-t5h8j" Jan 20 18:47:20 crc kubenswrapper[4773]: I0120 18:47:20.968355 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/bada64ed-c7da-4bd9-9195-75bbdcdd0406-var-lib\") pod \"ovn-controller-ovs-5gcvm\" (UID: \"bada64ed-c7da-4bd9-9195-75bbdcdd0406\") " pod="openstack/ovn-controller-ovs-5gcvm" Jan 20 18:47:20 crc kubenswrapper[4773]: I0120 18:47:20.968388 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fce4eb9-f614-4050-a099-0a743695dcd9-combined-ca-bundle\") pod \"ovn-controller-t5h8j\" (UID: \"2fce4eb9-f614-4050-a099-0a743695dcd9\") " pod="openstack/ovn-controller-t5h8j" Jan 20 18:47:20 crc kubenswrapper[4773]: I0120 18:47:20.968431 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2fce4eb9-f614-4050-a099-0a743695dcd9-var-run\") pod \"ovn-controller-t5h8j\" (UID: \"2fce4eb9-f614-4050-a099-0a743695dcd9\") " pod="openstack/ovn-controller-t5h8j" 
Jan 20 18:47:20 crc kubenswrapper[4773]: I0120 18:47:20.968449 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/bada64ed-c7da-4bd9-9195-75bbdcdd0406-var-log\") pod \"ovn-controller-ovs-5gcvm\" (UID: \"bada64ed-c7da-4bd9-9195-75bbdcdd0406\") " pod="openstack/ovn-controller-ovs-5gcvm" Jan 20 18:47:20 crc kubenswrapper[4773]: I0120 18:47:20.968484 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5d45m\" (UniqueName: \"kubernetes.io/projected/2fce4eb9-f614-4050-a099-0a743695dcd9-kube-api-access-5d45m\") pod \"ovn-controller-t5h8j\" (UID: \"2fce4eb9-f614-4050-a099-0a743695dcd9\") " pod="openstack/ovn-controller-t5h8j" Jan 20 18:47:20 crc kubenswrapper[4773]: I0120 18:47:20.968550 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bada64ed-c7da-4bd9-9195-75bbdcdd0406-scripts\") pod \"ovn-controller-ovs-5gcvm\" (UID: \"bada64ed-c7da-4bd9-9195-75bbdcdd0406\") " pod="openstack/ovn-controller-ovs-5gcvm" Jan 20 18:47:20 crc kubenswrapper[4773]: I0120 18:47:20.968576 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2fce4eb9-f614-4050-a099-0a743695dcd9-scripts\") pod \"ovn-controller-t5h8j\" (UID: \"2fce4eb9-f614-4050-a099-0a743695dcd9\") " pod="openstack/ovn-controller-t5h8j" Jan 20 18:47:20 crc kubenswrapper[4773]: I0120 18:47:20.968598 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/2fce4eb9-f614-4050-a099-0a743695dcd9-var-run-ovn\") pod \"ovn-controller-t5h8j\" (UID: \"2fce4eb9-f614-4050-a099-0a743695dcd9\") " pod="openstack/ovn-controller-t5h8j" Jan 20 18:47:20 crc kubenswrapper[4773]: I0120 
18:47:20.968622 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/bada64ed-c7da-4bd9-9195-75bbdcdd0406-etc-ovs\") pod \"ovn-controller-ovs-5gcvm\" (UID: \"bada64ed-c7da-4bd9-9195-75bbdcdd0406\") " pod="openstack/ovn-controller-ovs-5gcvm" Jan 20 18:47:21 crc kubenswrapper[4773]: I0120 18:47:21.070164 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/bada64ed-c7da-4bd9-9195-75bbdcdd0406-var-lib\") pod \"ovn-controller-ovs-5gcvm\" (UID: \"bada64ed-c7da-4bd9-9195-75bbdcdd0406\") " pod="openstack/ovn-controller-ovs-5gcvm" Jan 20 18:47:21 crc kubenswrapper[4773]: I0120 18:47:21.070242 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fce4eb9-f614-4050-a099-0a743695dcd9-combined-ca-bundle\") pod \"ovn-controller-t5h8j\" (UID: \"2fce4eb9-f614-4050-a099-0a743695dcd9\") " pod="openstack/ovn-controller-t5h8j" Jan 20 18:47:21 crc kubenswrapper[4773]: I0120 18:47:21.070284 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2fce4eb9-f614-4050-a099-0a743695dcd9-var-run\") pod \"ovn-controller-t5h8j\" (UID: \"2fce4eb9-f614-4050-a099-0a743695dcd9\") " pod="openstack/ovn-controller-t5h8j" Jan 20 18:47:21 crc kubenswrapper[4773]: I0120 18:47:21.070305 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/bada64ed-c7da-4bd9-9195-75bbdcdd0406-var-log\") pod \"ovn-controller-ovs-5gcvm\" (UID: \"bada64ed-c7da-4bd9-9195-75bbdcdd0406\") " pod="openstack/ovn-controller-ovs-5gcvm" Jan 20 18:47:21 crc kubenswrapper[4773]: I0120 18:47:21.070353 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5d45m\" (UniqueName: 
\"kubernetes.io/projected/2fce4eb9-f614-4050-a099-0a743695dcd9-kube-api-access-5d45m\") pod \"ovn-controller-t5h8j\" (UID: \"2fce4eb9-f614-4050-a099-0a743695dcd9\") " pod="openstack/ovn-controller-t5h8j" Jan 20 18:47:21 crc kubenswrapper[4773]: I0120 18:47:21.070382 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bada64ed-c7da-4bd9-9195-75bbdcdd0406-scripts\") pod \"ovn-controller-ovs-5gcvm\" (UID: \"bada64ed-c7da-4bd9-9195-75bbdcdd0406\") " pod="openstack/ovn-controller-ovs-5gcvm" Jan 20 18:47:21 crc kubenswrapper[4773]: I0120 18:47:21.070413 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2fce4eb9-f614-4050-a099-0a743695dcd9-scripts\") pod \"ovn-controller-t5h8j\" (UID: \"2fce4eb9-f614-4050-a099-0a743695dcd9\") " pod="openstack/ovn-controller-t5h8j" Jan 20 18:47:21 crc kubenswrapper[4773]: I0120 18:47:21.070440 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/2fce4eb9-f614-4050-a099-0a743695dcd9-var-run-ovn\") pod \"ovn-controller-t5h8j\" (UID: \"2fce4eb9-f614-4050-a099-0a743695dcd9\") " pod="openstack/ovn-controller-t5h8j" Jan 20 18:47:21 crc kubenswrapper[4773]: I0120 18:47:21.070468 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/bada64ed-c7da-4bd9-9195-75bbdcdd0406-etc-ovs\") pod \"ovn-controller-ovs-5gcvm\" (UID: \"bada64ed-c7da-4bd9-9195-75bbdcdd0406\") " pod="openstack/ovn-controller-ovs-5gcvm" Jan 20 18:47:21 crc kubenswrapper[4773]: I0120 18:47:21.070511 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/bada64ed-c7da-4bd9-9195-75bbdcdd0406-var-run\") pod \"ovn-controller-ovs-5gcvm\" (UID: \"bada64ed-c7da-4bd9-9195-75bbdcdd0406\") " 
pod="openstack/ovn-controller-ovs-5gcvm" Jan 20 18:47:21 crc kubenswrapper[4773]: I0120 18:47:21.070537 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7724g\" (UniqueName: \"kubernetes.io/projected/bada64ed-c7da-4bd9-9195-75bbdcdd0406-kube-api-access-7724g\") pod \"ovn-controller-ovs-5gcvm\" (UID: \"bada64ed-c7da-4bd9-9195-75bbdcdd0406\") " pod="openstack/ovn-controller-ovs-5gcvm" Jan 20 18:47:21 crc kubenswrapper[4773]: I0120 18:47:21.070563 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/2fce4eb9-f614-4050-a099-0a743695dcd9-ovn-controller-tls-certs\") pod \"ovn-controller-t5h8j\" (UID: \"2fce4eb9-f614-4050-a099-0a743695dcd9\") " pod="openstack/ovn-controller-t5h8j" Jan 20 18:47:21 crc kubenswrapper[4773]: I0120 18:47:21.070584 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/2fce4eb9-f614-4050-a099-0a743695dcd9-var-log-ovn\") pod \"ovn-controller-t5h8j\" (UID: \"2fce4eb9-f614-4050-a099-0a743695dcd9\") " pod="openstack/ovn-controller-t5h8j" Jan 20 18:47:21 crc kubenswrapper[4773]: I0120 18:47:21.070759 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/bada64ed-c7da-4bd9-9195-75bbdcdd0406-var-lib\") pod \"ovn-controller-ovs-5gcvm\" (UID: \"bada64ed-c7da-4bd9-9195-75bbdcdd0406\") " pod="openstack/ovn-controller-ovs-5gcvm" Jan 20 18:47:21 crc kubenswrapper[4773]: I0120 18:47:21.070844 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/2fce4eb9-f614-4050-a099-0a743695dcd9-var-run\") pod \"ovn-controller-t5h8j\" (UID: \"2fce4eb9-f614-4050-a099-0a743695dcd9\") " pod="openstack/ovn-controller-t5h8j" Jan 20 18:47:21 crc kubenswrapper[4773]: I0120 18:47:21.070878 4773 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/2fce4eb9-f614-4050-a099-0a743695dcd9-var-run-ovn\") pod \"ovn-controller-t5h8j\" (UID: \"2fce4eb9-f614-4050-a099-0a743695dcd9\") " pod="openstack/ovn-controller-t5h8j" Jan 20 18:47:21 crc kubenswrapper[4773]: I0120 18:47:21.070878 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/2fce4eb9-f614-4050-a099-0a743695dcd9-var-log-ovn\") pod \"ovn-controller-t5h8j\" (UID: \"2fce4eb9-f614-4050-a099-0a743695dcd9\") " pod="openstack/ovn-controller-t5h8j" Jan 20 18:47:21 crc kubenswrapper[4773]: I0120 18:47:21.071025 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/bada64ed-c7da-4bd9-9195-75bbdcdd0406-etc-ovs\") pod \"ovn-controller-ovs-5gcvm\" (UID: \"bada64ed-c7da-4bd9-9195-75bbdcdd0406\") " pod="openstack/ovn-controller-ovs-5gcvm" Jan 20 18:47:21 crc kubenswrapper[4773]: I0120 18:47:21.071046 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/bada64ed-c7da-4bd9-9195-75bbdcdd0406-var-log\") pod \"ovn-controller-ovs-5gcvm\" (UID: \"bada64ed-c7da-4bd9-9195-75bbdcdd0406\") " pod="openstack/ovn-controller-ovs-5gcvm" Jan 20 18:47:21 crc kubenswrapper[4773]: I0120 18:47:21.071252 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/bada64ed-c7da-4bd9-9195-75bbdcdd0406-var-run\") pod \"ovn-controller-ovs-5gcvm\" (UID: \"bada64ed-c7da-4bd9-9195-75bbdcdd0406\") " pod="openstack/ovn-controller-ovs-5gcvm" Jan 20 18:47:21 crc kubenswrapper[4773]: I0120 18:47:21.075194 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bada64ed-c7da-4bd9-9195-75bbdcdd0406-scripts\") pod \"ovn-controller-ovs-5gcvm\" (UID: 
\"bada64ed-c7da-4bd9-9195-75bbdcdd0406\") " pod="openstack/ovn-controller-ovs-5gcvm" Jan 20 18:47:21 crc kubenswrapper[4773]: I0120 18:47:21.078956 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2fce4eb9-f614-4050-a099-0a743695dcd9-scripts\") pod \"ovn-controller-t5h8j\" (UID: \"2fce4eb9-f614-4050-a099-0a743695dcd9\") " pod="openstack/ovn-controller-t5h8j" Jan 20 18:47:21 crc kubenswrapper[4773]: I0120 18:47:21.079697 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/2fce4eb9-f614-4050-a099-0a743695dcd9-ovn-controller-tls-certs\") pod \"ovn-controller-t5h8j\" (UID: \"2fce4eb9-f614-4050-a099-0a743695dcd9\") " pod="openstack/ovn-controller-t5h8j" Jan 20 18:47:21 crc kubenswrapper[4773]: I0120 18:47:21.087268 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2fce4eb9-f614-4050-a099-0a743695dcd9-combined-ca-bundle\") pod \"ovn-controller-t5h8j\" (UID: \"2fce4eb9-f614-4050-a099-0a743695dcd9\") " pod="openstack/ovn-controller-t5h8j" Jan 20 18:47:21 crc kubenswrapper[4773]: I0120 18:47:21.088583 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7724g\" (UniqueName: \"kubernetes.io/projected/bada64ed-c7da-4bd9-9195-75bbdcdd0406-kube-api-access-7724g\") pod \"ovn-controller-ovs-5gcvm\" (UID: \"bada64ed-c7da-4bd9-9195-75bbdcdd0406\") " pod="openstack/ovn-controller-ovs-5gcvm" Jan 20 18:47:21 crc kubenswrapper[4773]: I0120 18:47:21.091481 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5d45m\" (UniqueName: \"kubernetes.io/projected/2fce4eb9-f614-4050-a099-0a743695dcd9-kube-api-access-5d45m\") pod \"ovn-controller-t5h8j\" (UID: \"2fce4eb9-f614-4050-a099-0a743695dcd9\") " pod="openstack/ovn-controller-t5h8j" Jan 20 18:47:21 crc kubenswrapper[4773]: 
I0120 18:47:21.151034 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-t5h8j" Jan 20 18:47:21 crc kubenswrapper[4773]: I0120 18:47:21.174550 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-5gcvm" Jan 20 18:47:24 crc kubenswrapper[4773]: I0120 18:47:24.319981 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Jan 20 18:47:24 crc kubenswrapper[4773]: I0120 18:47:24.321312 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Jan 20 18:47:24 crc kubenswrapper[4773]: I0120 18:47:24.324508 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Jan 20 18:47:24 crc kubenswrapper[4773]: I0120 18:47:24.324708 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Jan 20 18:47:24 crc kubenswrapper[4773]: I0120 18:47:24.324999 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-j9zcb" Jan 20 18:47:24 crc kubenswrapper[4773]: I0120 18:47:24.325131 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Jan 20 18:47:24 crc kubenswrapper[4773]: I0120 18:47:24.354554 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Jan 20 18:47:24 crc kubenswrapper[4773]: I0120 18:47:24.433244 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c900f03-61d3-470c-9803-3f6b617ddf0a-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"4c900f03-61d3-470c-9803-3f6b617ddf0a\") " pod="openstack/ovsdbserver-sb-0" Jan 20 18:47:24 crc kubenswrapper[4773]: I0120 18:47:24.433290 4773 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-sb-0\" (UID: \"4c900f03-61d3-470c-9803-3f6b617ddf0a\") " pod="openstack/ovsdbserver-sb-0" Jan 20 18:47:24 crc kubenswrapper[4773]: I0120 18:47:24.433334 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c900f03-61d3-470c-9803-3f6b617ddf0a-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"4c900f03-61d3-470c-9803-3f6b617ddf0a\") " pod="openstack/ovsdbserver-sb-0" Jan 20 18:47:24 crc kubenswrapper[4773]: I0120 18:47:24.433362 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4c900f03-61d3-470c-9803-3f6b617ddf0a-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"4c900f03-61d3-470c-9803-3f6b617ddf0a\") " pod="openstack/ovsdbserver-sb-0" Jan 20 18:47:24 crc kubenswrapper[4773]: I0120 18:47:24.433454 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4c900f03-61d3-470c-9803-3f6b617ddf0a-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"4c900f03-61d3-470c-9803-3f6b617ddf0a\") " pod="openstack/ovsdbserver-sb-0" Jan 20 18:47:24 crc kubenswrapper[4773]: I0120 18:47:24.433486 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c900f03-61d3-470c-9803-3f6b617ddf0a-config\") pod \"ovsdbserver-sb-0\" (UID: \"4c900f03-61d3-470c-9803-3f6b617ddf0a\") " pod="openstack/ovsdbserver-sb-0" Jan 20 18:47:24 crc kubenswrapper[4773]: I0120 18:47:24.433506 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tsvf5\" (UniqueName: 
\"kubernetes.io/projected/4c900f03-61d3-470c-9803-3f6b617ddf0a-kube-api-access-tsvf5\") pod \"ovsdbserver-sb-0\" (UID: \"4c900f03-61d3-470c-9803-3f6b617ddf0a\") " pod="openstack/ovsdbserver-sb-0" Jan 20 18:47:24 crc kubenswrapper[4773]: I0120 18:47:24.433540 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c900f03-61d3-470c-9803-3f6b617ddf0a-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"4c900f03-61d3-470c-9803-3f6b617ddf0a\") " pod="openstack/ovsdbserver-sb-0" Jan 20 18:47:24 crc kubenswrapper[4773]: I0120 18:47:24.535196 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4c900f03-61d3-470c-9803-3f6b617ddf0a-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"4c900f03-61d3-470c-9803-3f6b617ddf0a\") " pod="openstack/ovsdbserver-sb-0" Jan 20 18:47:24 crc kubenswrapper[4773]: I0120 18:47:24.535247 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c900f03-61d3-470c-9803-3f6b617ddf0a-config\") pod \"ovsdbserver-sb-0\" (UID: \"4c900f03-61d3-470c-9803-3f6b617ddf0a\") " pod="openstack/ovsdbserver-sb-0" Jan 20 18:47:24 crc kubenswrapper[4773]: I0120 18:47:24.535294 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tsvf5\" (UniqueName: \"kubernetes.io/projected/4c900f03-61d3-470c-9803-3f6b617ddf0a-kube-api-access-tsvf5\") pod \"ovsdbserver-sb-0\" (UID: \"4c900f03-61d3-470c-9803-3f6b617ddf0a\") " pod="openstack/ovsdbserver-sb-0" Jan 20 18:47:24 crc kubenswrapper[4773]: I0120 18:47:24.535332 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c900f03-61d3-470c-9803-3f6b617ddf0a-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: 
\"4c900f03-61d3-470c-9803-3f6b617ddf0a\") " pod="openstack/ovsdbserver-sb-0" Jan 20 18:47:24 crc kubenswrapper[4773]: I0120 18:47:24.535370 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c900f03-61d3-470c-9803-3f6b617ddf0a-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"4c900f03-61d3-470c-9803-3f6b617ddf0a\") " pod="openstack/ovsdbserver-sb-0" Jan 20 18:47:24 crc kubenswrapper[4773]: I0120 18:47:24.535389 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-sb-0\" (UID: \"4c900f03-61d3-470c-9803-3f6b617ddf0a\") " pod="openstack/ovsdbserver-sb-0" Jan 20 18:47:24 crc kubenswrapper[4773]: I0120 18:47:24.535414 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c900f03-61d3-470c-9803-3f6b617ddf0a-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"4c900f03-61d3-470c-9803-3f6b617ddf0a\") " pod="openstack/ovsdbserver-sb-0" Jan 20 18:47:24 crc kubenswrapper[4773]: I0120 18:47:24.535435 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4c900f03-61d3-470c-9803-3f6b617ddf0a-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"4c900f03-61d3-470c-9803-3f6b617ddf0a\") " pod="openstack/ovsdbserver-sb-0" Jan 20 18:47:24 crc kubenswrapper[4773]: I0120 18:47:24.535945 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/4c900f03-61d3-470c-9803-3f6b617ddf0a-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"4c900f03-61d3-470c-9803-3f6b617ddf0a\") " pod="openstack/ovsdbserver-sb-0" Jan 20 18:47:24 crc kubenswrapper[4773]: I0120 18:47:24.536380 4773 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/4c900f03-61d3-470c-9803-3f6b617ddf0a-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"4c900f03-61d3-470c-9803-3f6b617ddf0a\") " pod="openstack/ovsdbserver-sb-0" Jan 20 18:47:24 crc kubenswrapper[4773]: I0120 18:47:24.536547 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4c900f03-61d3-470c-9803-3f6b617ddf0a-config\") pod \"ovsdbserver-sb-0\" (UID: \"4c900f03-61d3-470c-9803-3f6b617ddf0a\") " pod="openstack/ovsdbserver-sb-0" Jan 20 18:47:24 crc kubenswrapper[4773]: I0120 18:47:24.536829 4773 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-sb-0\" (UID: \"4c900f03-61d3-470c-9803-3f6b617ddf0a\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/ovsdbserver-sb-0" Jan 20 18:47:24 crc kubenswrapper[4773]: I0120 18:47:24.543202 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c900f03-61d3-470c-9803-3f6b617ddf0a-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"4c900f03-61d3-470c-9803-3f6b617ddf0a\") " pod="openstack/ovsdbserver-sb-0" Jan 20 18:47:24 crc kubenswrapper[4773]: I0120 18:47:24.554892 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tsvf5\" (UniqueName: \"kubernetes.io/projected/4c900f03-61d3-470c-9803-3f6b617ddf0a-kube-api-access-tsvf5\") pod \"ovsdbserver-sb-0\" (UID: \"4c900f03-61d3-470c-9803-3f6b617ddf0a\") " pod="openstack/ovsdbserver-sb-0" Jan 20 18:47:24 crc kubenswrapper[4773]: I0120 18:47:24.558548 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c900f03-61d3-470c-9803-3f6b617ddf0a-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: 
\"4c900f03-61d3-470c-9803-3f6b617ddf0a\") " pod="openstack/ovsdbserver-sb-0" Jan 20 18:47:24 crc kubenswrapper[4773]: I0120 18:47:24.559837 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/4c900f03-61d3-470c-9803-3f6b617ddf0a-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"4c900f03-61d3-470c-9803-3f6b617ddf0a\") " pod="openstack/ovsdbserver-sb-0" Jan 20 18:47:24 crc kubenswrapper[4773]: I0120 18:47:24.566370 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-sb-0\" (UID: \"4c900f03-61d3-470c-9803-3f6b617ddf0a\") " pod="openstack/ovsdbserver-sb-0" Jan 20 18:47:24 crc kubenswrapper[4773]: I0120 18:47:24.640057 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Jan 20 18:47:40 crc kubenswrapper[4773]: E0120 18:47:40.814740 4773 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-mariadb:current-podified" Jan 20 18:47:40 crc kubenswrapper[4773]: E0120 18:47:40.815473 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:mysql-bootstrap,Image:quay.io/podified-antelope-centos9/openstack-mariadb:current-podified,Command:[bash 
/var/lib/operator-scripts/mysql_bootstrap.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:True,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:mysql-db,ReadOnly:false,MountPath:/var/lib/mysql,SubPath:mysql,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-default,ReadOnly:true,MountPath:/var/lib/config-data/default,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data-generated,ReadOnly:false,MountPath:/var/lib/config-data/generated,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:operator-scripts,ReadOnly:true,MountPath:/var/lib/operator-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6bqnc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
openstack-cell1-galera-0_openstack(bfe9133c-0d58-4877-97ee-5b0abeee1a95): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 20 18:47:40 crc kubenswrapper[4773]: E0120 18:47:40.817363 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/openstack-cell1-galera-0" podUID="bfe9133c-0d58-4877-97ee-5b0abeee1a95" Jan 20 18:47:40 crc kubenswrapper[4773]: E0120 18:47:40.982807 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mysql-bootstrap\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-mariadb:current-podified\\\"\"" pod="openstack/openstack-cell1-galera-0" podUID="bfe9133c-0d58-4877-97ee-5b0abeee1a95" Jan 20 18:47:41 crc kubenswrapper[4773]: E0120 18:47:41.647538 4773 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-memcached:current-podified" Jan 20 18:47:41 crc kubenswrapper[4773]: E0120 18:47:41.647994 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:memcached,Image:quay.io/podified-antelope-centos9/openstack-memcached:current-podified,Command:[/usr/bin/dumb-init -- 
/usr/local/bin/kolla_start],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:memcached,HostPort:0,ContainerPort:11211,Protocol:TCP,HostIP:,},ContainerPort{Name:memcached-tls,HostPort:0,ContainerPort:11212,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:POD_IPS,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIPs,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:CONFIG_HASH,Value:nbhfh5ddh5cdhf8h687h67bh5b8h5cbh58dh4h78hd8hddh5c4h585h64bhcdhbh64dh559hb7h69h5ddh9h57ch5h75h5fch66fh8ch5ddq,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/src,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kolla-config,ReadOnly:true,MountPath:/var/lib/kolla/config_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:memcached-tls-certs,ReadOnly:true,MountPath:/var/lib/config-data/tls/certs/memcached.crt,SubPath:tls.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:memcached-tls-certs,ReadOnly:true,MountPath:/var/lib/config-data/tls/private/memcached.key,SubPath:tls.key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bb8mf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 11211 
},Host:,},GRPC:nil,},InitialDelaySeconds:3,TimeoutSeconds:5,PeriodSeconds:3,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:nil,TCPSocket:&TCPSocketAction{Port:{0 11211 },Host:,},GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42457,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42457,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod memcached-0_openstack(cb8cda87-65c5-4be7-9891-b82bcfc8e0d4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 20 18:47:41 crc kubenswrapper[4773]: E0120 18:47:41.649237 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"memcached\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/memcached-0" podUID="cb8cda87-65c5-4be7-9891-b82bcfc8e0d4" Jan 20 18:47:41 crc kubenswrapper[4773]: E0120 18:47:41.993291 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"memcached\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-memcached:current-podified\\\"\"" pod="openstack/memcached-0" podUID="cb8cda87-65c5-4be7-9891-b82bcfc8e0d4" Jan 20 18:47:45 crc kubenswrapper[4773]: E0120 18:47:45.599012 4773 log.go:32] "PullImage from image 
service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying layer: context canceled" image="registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0" Jan 20 18:47:45 crc kubenswrapper[4773]: E0120 18:47:45.599381 4773 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = Canceled desc = copying system image from manifest list: copying layer: context canceled" image="registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0" Jan 20 18:47:45 crc kubenswrapper[4773]: E0120 18:47:45.599527 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-state-metrics,Image:registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0,Command:[],Args:[--resources=pods --namespaces=openstack],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:http-metrics,HostPort:0,ContainerPort:8080,Protocol:TCP,HostIP:,},ContainerPort{Name:telemetry,HostPort:0,ContainerPort:8081,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jjqgh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/livez,Port:{0 8080 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:5,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod kube-state-metrics-0_openstack(d3923c4a-e72e-4b8e-bdd8-c8f2b7c443ae): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying layer: context canceled" logger="UnhandledError" Jan 20 18:47:45 crc kubenswrapper[4773]: E0120 18:47:45.600700 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying layer: context canceled\"" pod="openstack/kube-state-metrics-0" podUID="d3923c4a-e72e-4b8e-bdd8-c8f2b7c443ae" Jan 20 18:47:46 crc kubenswrapper[4773]: E0120 18:47:46.015461 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-state-metrics\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.k8s.io/kube-state-metrics/kube-state-metrics:v2.15.0\\\"\"" pod="openstack/kube-state-metrics-0" podUID="d3923c4a-e72e-4b8e-bdd8-c8f2b7c443ae" Jan 20 18:47:46 crc kubenswrapper[4773]: E0120 18:47:46.426736 4773 log.go:32] "PullImage from image service failed" err="rpc error: code = 
Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Jan 20 18:47:46 crc kubenswrapper[4773]: E0120 18:47:46.427123 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries --test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9tpc9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:fal
se,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-2xgbd_openstack(e6b756ab-c042-4cbc-a2ec-bb8d6b9094c4): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 20 18:47:46 crc kubenswrapper[4773]: E0120 18:47:46.428635 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-675f4bcbfc-2xgbd" podUID="e6b756ab-c042-4cbc-a2ec-bb8d6b9094c4" Jan 20 18:47:46 crc kubenswrapper[4773]: E0120 18:47:46.435466 4773 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Jan 20 18:47:46 crc kubenswrapper[4773]: E0120 18:47:46.435602 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68chd6h679hbfh55fhc6h5ffh5d8h94h56ch589hb4hc5h57bh677hcdh655h8dh667h675h654h66ch567h8fh659h5b4h675h566h55bh54h67dh6dq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-d5ldb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-666b6646f7-tz8rp_openstack(a00a537a-172f-4ec7-9573-dd9ac2f347e3): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 20 18:47:46 crc kubenswrapper[4773]: E0120 18:47:46.437207 4773 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-666b6646f7-tz8rp" podUID="a00a537a-172f-4ec7-9573-dd9ac2f347e3" Jan 20 18:47:46 crc kubenswrapper[4773]: E0120 18:47:46.497798 4773 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Jan 20 18:47:46 crc kubenswrapper[4773]: E0120 18:47:46.498211 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-n5rt2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-zb8x4_openstack(2dfd387a-7b46-44b7-8aed-53e919c99903): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 20 18:47:46 crc kubenswrapper[4773]: E0120 18:47:46.500449 4773 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-zb8x4" podUID="2dfd387a-7b46-44b7-8aed-53e919c99903" Jan 20 18:47:46 crc kubenswrapper[4773]: E0120 18:47:46.536099 4773 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Jan 20 18:47:46 crc kubenswrapper[4773]: E0120 18:47:46.536277 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n659h4h664hbh658h587h67ch89h587h8fh679hc6hf9h55fh644h5d5h698h68dh5cdh5ffh669h54ch9h689hb8hd4h5bfhd8h5d7h5fh665h574q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qpxj5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-57d769cc4f-75zzb_openstack(22a48624-e6b5-4225-baf3-c05ff3bed80d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 20 18:47:46 crc kubenswrapper[4773]: E0120 18:47:46.537457 4773 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-57d769cc4f-75zzb" podUID="22a48624-e6b5-4225-baf3-c05ff3bed80d" Jan 20 18:47:46 crc kubenswrapper[4773]: I0120 18:47:46.921878 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Jan 20 18:47:46 crc kubenswrapper[4773]: I0120 18:47:46.976107 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-t5h8j"] Jan 20 18:47:46 crc kubenswrapper[4773]: W0120 18:47:46.981110 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2fce4eb9_f614_4050_a099_0a743695dcd9.slice/crio-f611ddc9719f686a65d0c9b2b579664a107314a8a8d663d4260cf226efd89011 WatchSource:0}: Error finding container f611ddc9719f686a65d0c9b2b579664a107314a8a8d663d4260cf226efd89011: Status 404 returned error can't find the container with id f611ddc9719f686a65d0c9b2b579664a107314a8a8d663d4260cf226efd89011 Jan 20 18:47:47 crc kubenswrapper[4773]: W0120 18:47:47.020480 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4c900f03_61d3_470c_9803_3f6b617ddf0a.slice/crio-ce944527a221f77957002a760586186cb8f0fa60b4f49c7fe155d49e464270a6 WatchSource:0}: Error finding container ce944527a221f77957002a760586186cb8f0fa60b4f49c7fe155d49e464270a6: Status 404 returned error can't find the container with id ce944527a221f77957002a760586186cb8f0fa60b4f49c7fe155d49e464270a6 Jan 20 18:47:47 crc kubenswrapper[4773]: I0120 18:47:47.020568 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"11b243ca-6da3-4247-a1fe-2ea3e5be80cc","Type":"ContainerStarted","Data":"90364fdeac7c303550cf233c55536db126c3f1b96739db6cca5f5305ac5bd779"} Jan 20 18:47:47 crc kubenswrapper[4773]: I0120 
18:47:47.020657 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Jan 20 18:47:47 crc kubenswrapper[4773]: I0120 18:47:47.022654 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-t5h8j" event={"ID":"2fce4eb9-f614-4050-a099-0a743695dcd9","Type":"ContainerStarted","Data":"f611ddc9719f686a65d0c9b2b579664a107314a8a8d663d4260cf226efd89011"} Jan 20 18:47:47 crc kubenswrapper[4773]: I0120 18:47:47.023874 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"5818e5c4-9a2c-453f-b158-f4be5ec40619","Type":"ContainerStarted","Data":"397907a0dddb31a8de7ba8daa66a368708d05cd12bb1a2803808592054be6bda"} Jan 20 18:47:47 crc kubenswrapper[4773]: E0120 18:47:47.026522 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-57d769cc4f-75zzb" podUID="22a48624-e6b5-4225-baf3-c05ff3bed80d" Jan 20 18:47:47 crc kubenswrapper[4773]: E0120 18:47:47.026523 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-666b6646f7-tz8rp" podUID="a00a537a-172f-4ec7-9573-dd9ac2f347e3" Jan 20 18:47:47 crc kubenswrapper[4773]: I0120 18:47:47.388386 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-zb8x4" Jan 20 18:47:47 crc kubenswrapper[4773]: I0120 18:47:47.390028 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-2xgbd" Jan 20 18:47:47 crc kubenswrapper[4773]: I0120 18:47:47.487814 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2dfd387a-7b46-44b7-8aed-53e919c99903-config\") pod \"2dfd387a-7b46-44b7-8aed-53e919c99903\" (UID: \"2dfd387a-7b46-44b7-8aed-53e919c99903\") " Jan 20 18:47:47 crc kubenswrapper[4773]: I0120 18:47:47.487871 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6b756ab-c042-4cbc-a2ec-bb8d6b9094c4-config\") pod \"e6b756ab-c042-4cbc-a2ec-bb8d6b9094c4\" (UID: \"e6b756ab-c042-4cbc-a2ec-bb8d6b9094c4\") " Jan 20 18:47:47 crc kubenswrapper[4773]: I0120 18:47:47.487950 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n5rt2\" (UniqueName: \"kubernetes.io/projected/2dfd387a-7b46-44b7-8aed-53e919c99903-kube-api-access-n5rt2\") pod \"2dfd387a-7b46-44b7-8aed-53e919c99903\" (UID: \"2dfd387a-7b46-44b7-8aed-53e919c99903\") " Jan 20 18:47:47 crc kubenswrapper[4773]: I0120 18:47:47.487972 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2dfd387a-7b46-44b7-8aed-53e919c99903-dns-svc\") pod \"2dfd387a-7b46-44b7-8aed-53e919c99903\" (UID: \"2dfd387a-7b46-44b7-8aed-53e919c99903\") " Jan 20 18:47:47 crc kubenswrapper[4773]: I0120 18:47:47.488142 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9tpc9\" (UniqueName: \"kubernetes.io/projected/e6b756ab-c042-4cbc-a2ec-bb8d6b9094c4-kube-api-access-9tpc9\") pod \"e6b756ab-c042-4cbc-a2ec-bb8d6b9094c4\" (UID: \"e6b756ab-c042-4cbc-a2ec-bb8d6b9094c4\") " Jan 20 18:47:47 crc kubenswrapper[4773]: I0120 18:47:47.488544 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/2dfd387a-7b46-44b7-8aed-53e919c99903-config" (OuterVolumeSpecName: "config") pod "2dfd387a-7b46-44b7-8aed-53e919c99903" (UID: "2dfd387a-7b46-44b7-8aed-53e919c99903"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:47:47 crc kubenswrapper[4773]: I0120 18:47:47.488817 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2dfd387a-7b46-44b7-8aed-53e919c99903-config\") on node \"crc\" DevicePath \"\"" Jan 20 18:47:47 crc kubenswrapper[4773]: I0120 18:47:47.493445 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e6b756ab-c042-4cbc-a2ec-bb8d6b9094c4-config" (OuterVolumeSpecName: "config") pod "e6b756ab-c042-4cbc-a2ec-bb8d6b9094c4" (UID: "e6b756ab-c042-4cbc-a2ec-bb8d6b9094c4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:47:47 crc kubenswrapper[4773]: I0120 18:47:47.493836 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2dfd387a-7b46-44b7-8aed-53e919c99903-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "2dfd387a-7b46-44b7-8aed-53e919c99903" (UID: "2dfd387a-7b46-44b7-8aed-53e919c99903"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:47:47 crc kubenswrapper[4773]: I0120 18:47:47.495504 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2dfd387a-7b46-44b7-8aed-53e919c99903-kube-api-access-n5rt2" (OuterVolumeSpecName: "kube-api-access-n5rt2") pod "2dfd387a-7b46-44b7-8aed-53e919c99903" (UID: "2dfd387a-7b46-44b7-8aed-53e919c99903"). InnerVolumeSpecName "kube-api-access-n5rt2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:47:47 crc kubenswrapper[4773]: I0120 18:47:47.513208 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6b756ab-c042-4cbc-a2ec-bb8d6b9094c4-kube-api-access-9tpc9" (OuterVolumeSpecName: "kube-api-access-9tpc9") pod "e6b756ab-c042-4cbc-a2ec-bb8d6b9094c4" (UID: "e6b756ab-c042-4cbc-a2ec-bb8d6b9094c4"). InnerVolumeSpecName "kube-api-access-9tpc9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:47:47 crc kubenswrapper[4773]: I0120 18:47:47.590844 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e6b756ab-c042-4cbc-a2ec-bb8d6b9094c4-config\") on node \"crc\" DevicePath \"\"" Jan 20 18:47:47 crc kubenswrapper[4773]: I0120 18:47:47.590880 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n5rt2\" (UniqueName: \"kubernetes.io/projected/2dfd387a-7b46-44b7-8aed-53e919c99903-kube-api-access-n5rt2\") on node \"crc\" DevicePath \"\"" Jan 20 18:47:47 crc kubenswrapper[4773]: I0120 18:47:47.591099 4773 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/2dfd387a-7b46-44b7-8aed-53e919c99903-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 20 18:47:47 crc kubenswrapper[4773]: I0120 18:47:47.591108 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9tpc9\" (UniqueName: \"kubernetes.io/projected/e6b756ab-c042-4cbc-a2ec-bb8d6b9094c4-kube-api-access-9tpc9\") on node \"crc\" DevicePath \"\"" Jan 20 18:47:47 crc kubenswrapper[4773]: I0120 18:47:47.595995 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-5gcvm"] Jan 20 18:47:48 crc kubenswrapper[4773]: I0120 18:47:48.031960 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-zb8x4" 
event={"ID":"2dfd387a-7b46-44b7-8aed-53e919c99903","Type":"ContainerDied","Data":"b44d727e89fc9b9b5f69dfa9d422593aa34e541b42f15fb742dbdd5bd4f08b49"} Jan 20 18:47:48 crc kubenswrapper[4773]: I0120 18:47:48.032058 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-zb8x4" Jan 20 18:47:48 crc kubenswrapper[4773]: I0120 18:47:48.035987 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"4c900f03-61d3-470c-9803-3f6b617ddf0a","Type":"ContainerStarted","Data":"ce944527a221f77957002a760586186cb8f0fa60b4f49c7fe155d49e464270a6"} Jan 20 18:47:48 crc kubenswrapper[4773]: I0120 18:47:48.037960 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-2xgbd" Jan 20 18:47:48 crc kubenswrapper[4773]: I0120 18:47:48.037959 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-2xgbd" event={"ID":"e6b756ab-c042-4cbc-a2ec-bb8d6b9094c4","Type":"ContainerDied","Data":"6668d8f915cf882b03ecb7f9c9321d7df5e2547abe73d4876b6b1e6b3be6ef5b"} Jan 20 18:47:48 crc kubenswrapper[4773]: I0120 18:47:48.039167 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-5gcvm" event={"ID":"bada64ed-c7da-4bd9-9195-75bbdcdd0406","Type":"ContainerStarted","Data":"ce5f6226d2493b30a9b67af06562b91019b4cb76229060cbb6768f07476f19ed"} Jan 20 18:47:48 crc kubenswrapper[4773]: I0120 18:47:48.040476 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d4dfff97-df7d-498f-9203-9c2cb0d84667","Type":"ContainerStarted","Data":"1248d781571a617de802bfa819cacf5f6c074177291583d629258a80d6ae6c5f"} Jan 20 18:47:48 crc kubenswrapper[4773]: I0120 18:47:48.043013 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"b357137a-6e30-4ed9-a440-c9f3e90f75d8","Type":"ContainerStarted","Data":"582308e76cf42821e4ae7402e4d5fe864dee8caf26b6d7f6b99985263eaa82fb"} Jan 20 18:47:48 crc kubenswrapper[4773]: I0120 18:47:48.077695 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-zb8x4"] Jan 20 18:47:48 crc kubenswrapper[4773]: I0120 18:47:48.088345 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-zb8x4"] Jan 20 18:47:48 crc kubenswrapper[4773]: I0120 18:47:48.149860 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-2xgbd"] Jan 20 18:47:48 crc kubenswrapper[4773]: I0120 18:47:48.152583 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-2xgbd"] Jan 20 18:47:49 crc kubenswrapper[4773]: I0120 18:47:49.456292 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2dfd387a-7b46-44b7-8aed-53e919c99903" path="/var/lib/kubelet/pods/2dfd387a-7b46-44b7-8aed-53e919c99903/volumes" Jan 20 18:47:49 crc kubenswrapper[4773]: I0120 18:47:49.457080 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e6b756ab-c042-4cbc-a2ec-bb8d6b9094c4" path="/var/lib/kubelet/pods/e6b756ab-c042-4cbc-a2ec-bb8d6b9094c4/volumes" Jan 20 18:47:51 crc kubenswrapper[4773]: I0120 18:47:51.062500 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"5818e5c4-9a2c-453f-b158-f4be5ec40619","Type":"ContainerStarted","Data":"7148ef81189f7da6562d890081d09f6e1a6d847f1f4ff4ecfef97131174ad0b7"} Jan 20 18:47:51 crc kubenswrapper[4773]: I0120 18:47:51.064413 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"4c900f03-61d3-470c-9803-3f6b617ddf0a","Type":"ContainerStarted","Data":"91e2e3095c5bf9a2f616b1fb8993a836773ec0298326099365b899aaf3e3c453"} Jan 20 18:47:51 crc kubenswrapper[4773]: I0120 18:47:51.065616 4773 generic.go:334] 
"Generic (PLEG): container finished" podID="11b243ca-6da3-4247-a1fe-2ea3e5be80cc" containerID="90364fdeac7c303550cf233c55536db126c3f1b96739db6cca5f5305ac5bd779" exitCode=0 Jan 20 18:47:51 crc kubenswrapper[4773]: I0120 18:47:51.065692 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"11b243ca-6da3-4247-a1fe-2ea3e5be80cc","Type":"ContainerDied","Data":"90364fdeac7c303550cf233c55536db126c3f1b96739db6cca5f5305ac5bd779"} Jan 20 18:47:51 crc kubenswrapper[4773]: I0120 18:47:51.067209 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-5gcvm" event={"ID":"bada64ed-c7da-4bd9-9195-75bbdcdd0406","Type":"ContainerStarted","Data":"140ae6545b5775f299a982031c8f08e25a9cdd7e9f70deaca79b00829df20194"} Jan 20 18:47:51 crc kubenswrapper[4773]: I0120 18:47:51.068832 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-t5h8j" event={"ID":"2fce4eb9-f614-4050-a099-0a743695dcd9","Type":"ContainerStarted","Data":"b8bd9c6cf013143eb926554716c1d42906fabc0b03ee57b125a562e3cdbdaf9e"} Jan 20 18:47:51 crc kubenswrapper[4773]: I0120 18:47:51.068985 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-t5h8j" Jan 20 18:47:51 crc kubenswrapper[4773]: I0120 18:47:51.105493 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-t5h8j" podStartSLOduration=27.629158474 podStartE2EDuration="31.105475441s" podCreationTimestamp="2026-01-20 18:47:20 +0000 UTC" firstStartedPulling="2026-01-20 18:47:46.983800149 +0000 UTC m=+1059.905613173" lastFinishedPulling="2026-01-20 18:47:50.460117116 +0000 UTC m=+1063.381930140" observedRunningTime="2026-01-20 18:47:51.101486395 +0000 UTC m=+1064.023299439" watchObservedRunningTime="2026-01-20 18:47:51.105475441 +0000 UTC m=+1064.027288465" Jan 20 18:47:52 crc kubenswrapper[4773]: I0120 18:47:52.078186 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/openstack-galera-0" event={"ID":"11b243ca-6da3-4247-a1fe-2ea3e5be80cc","Type":"ContainerStarted","Data":"44afb65ac51ffbb0bfa32da6f84f67ac17b0880684349e0fd30efcf6c3037e58"} Jan 20 18:47:52 crc kubenswrapper[4773]: I0120 18:47:52.081589 4773 generic.go:334] "Generic (PLEG): container finished" podID="bada64ed-c7da-4bd9-9195-75bbdcdd0406" containerID="140ae6545b5775f299a982031c8f08e25a9cdd7e9f70deaca79b00829df20194" exitCode=0 Jan 20 18:47:52 crc kubenswrapper[4773]: I0120 18:47:52.082117 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-5gcvm" event={"ID":"bada64ed-c7da-4bd9-9195-75bbdcdd0406","Type":"ContainerDied","Data":"140ae6545b5775f299a982031c8f08e25a9cdd7e9f70deaca79b00829df20194"} Jan 20 18:47:52 crc kubenswrapper[4773]: I0120 18:47:52.125525 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=8.603393186 podStartE2EDuration="40.125503276s" podCreationTimestamp="2026-01-20 18:47:12 +0000 UTC" firstStartedPulling="2026-01-20 18:47:14.120923553 +0000 UTC m=+1027.042736577" lastFinishedPulling="2026-01-20 18:47:45.643033643 +0000 UTC m=+1058.564846667" observedRunningTime="2026-01-20 18:47:52.116498319 +0000 UTC m=+1065.038311343" watchObservedRunningTime="2026-01-20 18:47:52.125503276 +0000 UTC m=+1065.047316300" Jan 20 18:47:53 crc kubenswrapper[4773]: I0120 18:47:53.092107 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-5gcvm" event={"ID":"bada64ed-c7da-4bd9-9195-75bbdcdd0406","Type":"ContainerStarted","Data":"76dd5a962ff426a3eb55fbf08ddd3d739fe172ac05a461758797c6451644b570"} Jan 20 18:47:53 crc kubenswrapper[4773]: I0120 18:47:53.092472 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-5gcvm" event={"ID":"bada64ed-c7da-4bd9-9195-75bbdcdd0406","Type":"ContainerStarted","Data":"4d959497ec0bb96d17059210dfd4779cc5b4ce6b9afea2b889b9466c137577b3"} Jan 20 
18:47:53 crc kubenswrapper[4773]: I0120 18:47:53.093554 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-5gcvm" Jan 20 18:47:53 crc kubenswrapper[4773]: I0120 18:47:53.093582 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-5gcvm" Jan 20 18:47:53 crc kubenswrapper[4773]: I0120 18:47:53.377141 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Jan 20 18:47:53 crc kubenswrapper[4773]: I0120 18:47:53.377197 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Jan 20 18:47:54 crc kubenswrapper[4773]: I0120 18:47:54.103814 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"5818e5c4-9a2c-453f-b158-f4be5ec40619","Type":"ContainerStarted","Data":"6f73c3796bc90e5194cd970ada8705deecc5d419dd1c8d59789f7a38452be4a2"} Jan 20 18:47:54 crc kubenswrapper[4773]: I0120 18:47:54.108655 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"4c900f03-61d3-470c-9803-3f6b617ddf0a","Type":"ContainerStarted","Data":"9e2d0dc0ea50be4df20c0569611b3a134aca609e02a64146777d39153fa055de"} Jan 20 18:47:54 crc kubenswrapper[4773]: I0120 18:47:54.136264 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=28.212972242 podStartE2EDuration="35.13624095s" podCreationTimestamp="2026-01-20 18:47:19 +0000 UTC" firstStartedPulling="2026-01-20 18:47:46.940091105 +0000 UTC m=+1059.861904129" lastFinishedPulling="2026-01-20 18:47:53.863359813 +0000 UTC m=+1066.785172837" observedRunningTime="2026-01-20 18:47:54.122903428 +0000 UTC m=+1067.044716462" watchObservedRunningTime="2026-01-20 18:47:54.13624095 +0000 UTC m=+1067.058053984" Jan 20 18:47:54 crc kubenswrapper[4773]: I0120 18:47:54.136415 4773 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-5gcvm" podStartSLOduration=31.278450309 podStartE2EDuration="34.136409784s" podCreationTimestamp="2026-01-20 18:47:20 +0000 UTC" firstStartedPulling="2026-01-20 18:47:47.602312796 +0000 UTC m=+1060.524125820" lastFinishedPulling="2026-01-20 18:47:50.460272261 +0000 UTC m=+1063.382085295" observedRunningTime="2026-01-20 18:47:53.116092642 +0000 UTC m=+1066.037905666" watchObservedRunningTime="2026-01-20 18:47:54.136409784 +0000 UTC m=+1067.058222818" Jan 20 18:47:54 crc kubenswrapper[4773]: I0120 18:47:54.153159 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=24.297612373 podStartE2EDuration="31.153142437s" podCreationTimestamp="2026-01-20 18:47:23 +0000 UTC" firstStartedPulling="2026-01-20 18:47:47.022855041 +0000 UTC m=+1059.944668065" lastFinishedPulling="2026-01-20 18:47:53.878385105 +0000 UTC m=+1066.800198129" observedRunningTime="2026-01-20 18:47:54.1462271 +0000 UTC m=+1067.068040124" watchObservedRunningTime="2026-01-20 18:47:54.153142437 +0000 UTC m=+1067.074955461" Jan 20 18:47:54 crc kubenswrapper[4773]: I0120 18:47:54.640092 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Jan 20 18:47:54 crc kubenswrapper[4773]: I0120 18:47:54.640145 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Jan 20 18:47:54 crc kubenswrapper[4773]: I0120 18:47:54.681616 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.159152 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.419818 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-57d769cc4f-75zzb"] Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.439158 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.478767 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-g2pwb"] Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.480189 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-g2pwb" Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.482006 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.488908 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-xs9zd"] Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.489842 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-xs9zd" Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.494174 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.505341 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-g2pwb"] Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.516557 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-xs9zd"] Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.664624 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-tz8rp"] Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.681161 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-588tk\" (UniqueName: 
\"kubernetes.io/projected/f7c93b98-cee9-4ca4-af53-0a939fece59b-kube-api-access-588tk\") pod \"dnsmasq-dns-7f896c8c65-g2pwb\" (UID: \"f7c93b98-cee9-4ca4-af53-0a939fece59b\") " pod="openstack/dnsmasq-dns-7f896c8c65-g2pwb" Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.681232 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f7c93b98-cee9-4ca4-af53-0a939fece59b-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-g2pwb\" (UID: \"f7c93b98-cee9-4ca4-af53-0a939fece59b\") " pod="openstack/dnsmasq-dns-7f896c8c65-g2pwb" Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.681284 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/a5ceb1c5-1dbc-4810-95c9-c1ac0b915542-ovs-rundir\") pod \"ovn-controller-metrics-xs9zd\" (UID: \"a5ceb1c5-1dbc-4810-95c9-c1ac0b915542\") " pod="openstack/ovn-controller-metrics-xs9zd" Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.681388 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5ceb1c5-1dbc-4810-95c9-c1ac0b915542-config\") pod \"ovn-controller-metrics-xs9zd\" (UID: \"a5ceb1c5-1dbc-4810-95c9-c1ac0b915542\") " pod="openstack/ovn-controller-metrics-xs9zd" Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.681446 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5ceb1c5-1dbc-4810-95c9-c1ac0b915542-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-xs9zd\" (UID: \"a5ceb1c5-1dbc-4810-95c9-c1ac0b915542\") " pod="openstack/ovn-controller-metrics-xs9zd" Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.681495 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/a5ceb1c5-1dbc-4810-95c9-c1ac0b915542-ovn-rundir\") pod \"ovn-controller-metrics-xs9zd\" (UID: \"a5ceb1c5-1dbc-4810-95c9-c1ac0b915542\") " pod="openstack/ovn-controller-metrics-xs9zd" Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.681546 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5ceb1c5-1dbc-4810-95c9-c1ac0b915542-combined-ca-bundle\") pod \"ovn-controller-metrics-xs9zd\" (UID: \"a5ceb1c5-1dbc-4810-95c9-c1ac0b915542\") " pod="openstack/ovn-controller-metrics-xs9zd" Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.681599 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7c93b98-cee9-4ca4-af53-0a939fece59b-config\") pod \"dnsmasq-dns-7f896c8c65-g2pwb\" (UID: \"f7c93b98-cee9-4ca4-af53-0a939fece59b\") " pod="openstack/dnsmasq-dns-7f896c8c65-g2pwb" Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.681645 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f7c93b98-cee9-4ca4-af53-0a939fece59b-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-g2pwb\" (UID: \"f7c93b98-cee9-4ca4-af53-0a939fece59b\") " pod="openstack/dnsmasq-dns-7f896c8c65-g2pwb" Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.681763 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7kxx\" (UniqueName: \"kubernetes.io/projected/a5ceb1c5-1dbc-4810-95c9-c1ac0b915542-kube-api-access-h7kxx\") pod \"ovn-controller-metrics-xs9zd\" (UID: \"a5ceb1c5-1dbc-4810-95c9-c1ac0b915542\") " pod="openstack/ovn-controller-metrics-xs9zd" Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.695552 4773 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/dnsmasq-dns-86db49b7ff-k2gpg"] Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.716405 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-k2gpg"] Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.716638 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-k2gpg" Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.723714 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.782946 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5ceb1c5-1dbc-4810-95c9-c1ac0b915542-combined-ca-bundle\") pod \"ovn-controller-metrics-xs9zd\" (UID: \"a5ceb1c5-1dbc-4810-95c9-c1ac0b915542\") " pod="openstack/ovn-controller-metrics-xs9zd" Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.783006 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7c93b98-cee9-4ca4-af53-0a939fece59b-config\") pod \"dnsmasq-dns-7f896c8c65-g2pwb\" (UID: \"f7c93b98-cee9-4ca4-af53-0a939fece59b\") " pod="openstack/dnsmasq-dns-7f896c8c65-g2pwb" Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.783033 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f7c93b98-cee9-4ca4-af53-0a939fece59b-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-g2pwb\" (UID: \"f7c93b98-cee9-4ca4-af53-0a939fece59b\") " pod="openstack/dnsmasq-dns-7f896c8c65-g2pwb" Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.783091 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h7kxx\" (UniqueName: \"kubernetes.io/projected/a5ceb1c5-1dbc-4810-95c9-c1ac0b915542-kube-api-access-h7kxx\") pod 
\"ovn-controller-metrics-xs9zd\" (UID: \"a5ceb1c5-1dbc-4810-95c9-c1ac0b915542\") " pod="openstack/ovn-controller-metrics-xs9zd" Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.783129 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-588tk\" (UniqueName: \"kubernetes.io/projected/f7c93b98-cee9-4ca4-af53-0a939fece59b-kube-api-access-588tk\") pod \"dnsmasq-dns-7f896c8c65-g2pwb\" (UID: \"f7c93b98-cee9-4ca4-af53-0a939fece59b\") " pod="openstack/dnsmasq-dns-7f896c8c65-g2pwb" Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.783150 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f7c93b98-cee9-4ca4-af53-0a939fece59b-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-g2pwb\" (UID: \"f7c93b98-cee9-4ca4-af53-0a939fece59b\") " pod="openstack/dnsmasq-dns-7f896c8c65-g2pwb" Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.783174 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/a5ceb1c5-1dbc-4810-95c9-c1ac0b915542-ovs-rundir\") pod \"ovn-controller-metrics-xs9zd\" (UID: \"a5ceb1c5-1dbc-4810-95c9-c1ac0b915542\") " pod="openstack/ovn-controller-metrics-xs9zd" Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.783224 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5ceb1c5-1dbc-4810-95c9-c1ac0b915542-config\") pod \"ovn-controller-metrics-xs9zd\" (UID: \"a5ceb1c5-1dbc-4810-95c9-c1ac0b915542\") " pod="openstack/ovn-controller-metrics-xs9zd" Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.783250 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5ceb1c5-1dbc-4810-95c9-c1ac0b915542-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-xs9zd\" (UID: 
\"a5ceb1c5-1dbc-4810-95c9-c1ac0b915542\") " pod="openstack/ovn-controller-metrics-xs9zd" Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.783294 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/a5ceb1c5-1dbc-4810-95c9-c1ac0b915542-ovn-rundir\") pod \"ovn-controller-metrics-xs9zd\" (UID: \"a5ceb1c5-1dbc-4810-95c9-c1ac0b915542\") " pod="openstack/ovn-controller-metrics-xs9zd" Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.783595 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/a5ceb1c5-1dbc-4810-95c9-c1ac0b915542-ovn-rundir\") pod \"ovn-controller-metrics-xs9zd\" (UID: \"a5ceb1c5-1dbc-4810-95c9-c1ac0b915542\") " pod="openstack/ovn-controller-metrics-xs9zd" Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.784598 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/a5ceb1c5-1dbc-4810-95c9-c1ac0b915542-ovs-rundir\") pod \"ovn-controller-metrics-xs9zd\" (UID: \"a5ceb1c5-1dbc-4810-95c9-c1ac0b915542\") " pod="openstack/ovn-controller-metrics-xs9zd" Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.784604 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f7c93b98-cee9-4ca4-af53-0a939fece59b-dns-svc\") pod \"dnsmasq-dns-7f896c8c65-g2pwb\" (UID: \"f7c93b98-cee9-4ca4-af53-0a939fece59b\") " pod="openstack/dnsmasq-dns-7f896c8c65-g2pwb" Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.784723 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7c93b98-cee9-4ca4-af53-0a939fece59b-config\") pod \"dnsmasq-dns-7f896c8c65-g2pwb\" (UID: \"f7c93b98-cee9-4ca4-af53-0a939fece59b\") " pod="openstack/dnsmasq-dns-7f896c8c65-g2pwb" Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 
18:47:55.785050 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a5ceb1c5-1dbc-4810-95c9-c1ac0b915542-config\") pod \"ovn-controller-metrics-xs9zd\" (UID: \"a5ceb1c5-1dbc-4810-95c9-c1ac0b915542\") " pod="openstack/ovn-controller-metrics-xs9zd" Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.785070 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f7c93b98-cee9-4ca4-af53-0a939fece59b-ovsdbserver-sb\") pod \"dnsmasq-dns-7f896c8c65-g2pwb\" (UID: \"f7c93b98-cee9-4ca4-af53-0a939fece59b\") " pod="openstack/dnsmasq-dns-7f896c8c65-g2pwb" Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.788981 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/a5ceb1c5-1dbc-4810-95c9-c1ac0b915542-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-xs9zd\" (UID: \"a5ceb1c5-1dbc-4810-95c9-c1ac0b915542\") " pod="openstack/ovn-controller-metrics-xs9zd" Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.789031 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a5ceb1c5-1dbc-4810-95c9-c1ac0b915542-combined-ca-bundle\") pod \"ovn-controller-metrics-xs9zd\" (UID: \"a5ceb1c5-1dbc-4810-95c9-c1ac0b915542\") " pod="openstack/ovn-controller-metrics-xs9zd" Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.802904 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-588tk\" (UniqueName: \"kubernetes.io/projected/f7c93b98-cee9-4ca4-af53-0a939fece59b-kube-api-access-588tk\") pod \"dnsmasq-dns-7f896c8c65-g2pwb\" (UID: \"f7c93b98-cee9-4ca4-af53-0a939fece59b\") " pod="openstack/dnsmasq-dns-7f896c8c65-g2pwb" Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.808480 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-g2pwb" Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.812700 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7kxx\" (UniqueName: \"kubernetes.io/projected/a5ceb1c5-1dbc-4810-95c9-c1ac0b915542-kube-api-access-h7kxx\") pod \"ovn-controller-metrics-xs9zd\" (UID: \"a5ceb1c5-1dbc-4810-95c9-c1ac0b915542\") " pod="openstack/ovn-controller-metrics-xs9zd" Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.820973 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-xs9zd" Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.885230 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aba9326a-e499-43a8-9f50-4dc29d62c960-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-k2gpg\" (UID: \"aba9326a-e499-43a8-9f50-4dc29d62c960\") " pod="openstack/dnsmasq-dns-86db49b7ff-k2gpg" Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.885305 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aba9326a-e499-43a8-9f50-4dc29d62c960-config\") pod \"dnsmasq-dns-86db49b7ff-k2gpg\" (UID: \"aba9326a-e499-43a8-9f50-4dc29d62c960\") " pod="openstack/dnsmasq-dns-86db49b7ff-k2gpg" Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.885399 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aba9326a-e499-43a8-9f50-4dc29d62c960-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-k2gpg\" (UID: \"aba9326a-e499-43a8-9f50-4dc29d62c960\") " pod="openstack/dnsmasq-dns-86db49b7ff-k2gpg" Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.885448 4773 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbd7l\" (UniqueName: \"kubernetes.io/projected/aba9326a-e499-43a8-9f50-4dc29d62c960-kube-api-access-wbd7l\") pod \"dnsmasq-dns-86db49b7ff-k2gpg\" (UID: \"aba9326a-e499-43a8-9f50-4dc29d62c960\") " pod="openstack/dnsmasq-dns-86db49b7ff-k2gpg" Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.885480 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aba9326a-e499-43a8-9f50-4dc29d62c960-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-k2gpg\" (UID: \"aba9326a-e499-43a8-9f50-4dc29d62c960\") " pod="openstack/dnsmasq-dns-86db49b7ff-k2gpg" Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.922303 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-75zzb" Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.969157 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-tz8rp" Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.987030 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aba9326a-e499-43a8-9f50-4dc29d62c960-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-k2gpg\" (UID: \"aba9326a-e499-43a8-9f50-4dc29d62c960\") " pod="openstack/dnsmasq-dns-86db49b7ff-k2gpg" Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.987103 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbd7l\" (UniqueName: \"kubernetes.io/projected/aba9326a-e499-43a8-9f50-4dc29d62c960-kube-api-access-wbd7l\") pod \"dnsmasq-dns-86db49b7ff-k2gpg\" (UID: \"aba9326a-e499-43a8-9f50-4dc29d62c960\") " pod="openstack/dnsmasq-dns-86db49b7ff-k2gpg" Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.987128 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aba9326a-e499-43a8-9f50-4dc29d62c960-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-k2gpg\" (UID: \"aba9326a-e499-43a8-9f50-4dc29d62c960\") " pod="openstack/dnsmasq-dns-86db49b7ff-k2gpg" Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.987176 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aba9326a-e499-43a8-9f50-4dc29d62c960-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-k2gpg\" (UID: \"aba9326a-e499-43a8-9f50-4dc29d62c960\") " pod="openstack/dnsmasq-dns-86db49b7ff-k2gpg" Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.987214 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aba9326a-e499-43a8-9f50-4dc29d62c960-config\") pod \"dnsmasq-dns-86db49b7ff-k2gpg\" (UID: \"aba9326a-e499-43a8-9f50-4dc29d62c960\") " pod="openstack/dnsmasq-dns-86db49b7ff-k2gpg" 
Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.990612 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aba9326a-e499-43a8-9f50-4dc29d62c960-config\") pod \"dnsmasq-dns-86db49b7ff-k2gpg\" (UID: \"aba9326a-e499-43a8-9f50-4dc29d62c960\") " pod="openstack/dnsmasq-dns-86db49b7ff-k2gpg" Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.990978 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aba9326a-e499-43a8-9f50-4dc29d62c960-ovsdbserver-nb\") pod \"dnsmasq-dns-86db49b7ff-k2gpg\" (UID: \"aba9326a-e499-43a8-9f50-4dc29d62c960\") " pod="openstack/dnsmasq-dns-86db49b7ff-k2gpg" Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.991084 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aba9326a-e499-43a8-9f50-4dc29d62c960-ovsdbserver-sb\") pod \"dnsmasq-dns-86db49b7ff-k2gpg\" (UID: \"aba9326a-e499-43a8-9f50-4dc29d62c960\") " pod="openstack/dnsmasq-dns-86db49b7ff-k2gpg" Jan 20 18:47:55 crc kubenswrapper[4773]: I0120 18:47:55.991240 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aba9326a-e499-43a8-9f50-4dc29d62c960-dns-svc\") pod \"dnsmasq-dns-86db49b7ff-k2gpg\" (UID: \"aba9326a-e499-43a8-9f50-4dc29d62c960\") " pod="openstack/dnsmasq-dns-86db49b7ff-k2gpg" Jan 20 18:47:56 crc kubenswrapper[4773]: I0120 18:47:56.006146 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbd7l\" (UniqueName: \"kubernetes.io/projected/aba9326a-e499-43a8-9f50-4dc29d62c960-kube-api-access-wbd7l\") pod \"dnsmasq-dns-86db49b7ff-k2gpg\" (UID: \"aba9326a-e499-43a8-9f50-4dc29d62c960\") " pod="openstack/dnsmasq-dns-86db49b7ff-k2gpg" Jan 20 18:47:56 crc kubenswrapper[4773]: I0120 18:47:56.036910 4773 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-k2gpg" Jan 20 18:47:56 crc kubenswrapper[4773]: I0120 18:47:56.087865 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/22a48624-e6b5-4225-baf3-c05ff3bed80d-dns-svc\") pod \"22a48624-e6b5-4225-baf3-c05ff3bed80d\" (UID: \"22a48624-e6b5-4225-baf3-c05ff3bed80d\") " Jan 20 18:47:56 crc kubenswrapper[4773]: I0120 18:47:56.088033 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d5ldb\" (UniqueName: \"kubernetes.io/projected/a00a537a-172f-4ec7-9573-dd9ac2f347e3-kube-api-access-d5ldb\") pod \"a00a537a-172f-4ec7-9573-dd9ac2f347e3\" (UID: \"a00a537a-172f-4ec7-9573-dd9ac2f347e3\") " Jan 20 18:47:56 crc kubenswrapper[4773]: I0120 18:47:56.088069 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a00a537a-172f-4ec7-9573-dd9ac2f347e3-dns-svc\") pod \"a00a537a-172f-4ec7-9573-dd9ac2f347e3\" (UID: \"a00a537a-172f-4ec7-9573-dd9ac2f347e3\") " Jan 20 18:47:56 crc kubenswrapper[4773]: I0120 18:47:56.088159 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a00a537a-172f-4ec7-9573-dd9ac2f347e3-config\") pod \"a00a537a-172f-4ec7-9573-dd9ac2f347e3\" (UID: \"a00a537a-172f-4ec7-9573-dd9ac2f347e3\") " Jan 20 18:47:56 crc kubenswrapper[4773]: I0120 18:47:56.088208 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22a48624-e6b5-4225-baf3-c05ff3bed80d-config\") pod \"22a48624-e6b5-4225-baf3-c05ff3bed80d\" (UID: \"22a48624-e6b5-4225-baf3-c05ff3bed80d\") " Jan 20 18:47:56 crc kubenswrapper[4773]: I0120 18:47:56.088229 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qpxj5\" (UniqueName: 
\"kubernetes.io/projected/22a48624-e6b5-4225-baf3-c05ff3bed80d-kube-api-access-qpxj5\") pod \"22a48624-e6b5-4225-baf3-c05ff3bed80d\" (UID: \"22a48624-e6b5-4225-baf3-c05ff3bed80d\") " Jan 20 18:47:56 crc kubenswrapper[4773]: I0120 18:47:56.088642 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a00a537a-172f-4ec7-9573-dd9ac2f347e3-config" (OuterVolumeSpecName: "config") pod "a00a537a-172f-4ec7-9573-dd9ac2f347e3" (UID: "a00a537a-172f-4ec7-9573-dd9ac2f347e3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:47:56 crc kubenswrapper[4773]: I0120 18:47:56.088664 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22a48624-e6b5-4225-baf3-c05ff3bed80d-config" (OuterVolumeSpecName: "config") pod "22a48624-e6b5-4225-baf3-c05ff3bed80d" (UID: "22a48624-e6b5-4225-baf3-c05ff3bed80d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:47:56 crc kubenswrapper[4773]: I0120 18:47:56.088785 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a00a537a-172f-4ec7-9573-dd9ac2f347e3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a00a537a-172f-4ec7-9573-dd9ac2f347e3" (UID: "a00a537a-172f-4ec7-9573-dd9ac2f347e3"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:47:56 crc kubenswrapper[4773]: I0120 18:47:56.089143 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22a48624-e6b5-4225-baf3-c05ff3bed80d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "22a48624-e6b5-4225-baf3-c05ff3bed80d" (UID: "22a48624-e6b5-4225-baf3-c05ff3bed80d"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:47:56 crc kubenswrapper[4773]: I0120 18:47:56.091413 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22a48624-e6b5-4225-baf3-c05ff3bed80d-kube-api-access-qpxj5" (OuterVolumeSpecName: "kube-api-access-qpxj5") pod "22a48624-e6b5-4225-baf3-c05ff3bed80d" (UID: "22a48624-e6b5-4225-baf3-c05ff3bed80d"). InnerVolumeSpecName "kube-api-access-qpxj5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:47:56 crc kubenswrapper[4773]: I0120 18:47:56.096282 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a00a537a-172f-4ec7-9573-dd9ac2f347e3-kube-api-access-d5ldb" (OuterVolumeSpecName: "kube-api-access-d5ldb") pod "a00a537a-172f-4ec7-9573-dd9ac2f347e3" (UID: "a00a537a-172f-4ec7-9573-dd9ac2f347e3"). InnerVolumeSpecName "kube-api-access-d5ldb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:47:56 crc kubenswrapper[4773]: I0120 18:47:56.126175 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-75zzb" event={"ID":"22a48624-e6b5-4225-baf3-c05ff3bed80d","Type":"ContainerDied","Data":"a9f6251d727b5d342533134b7ddbdd5288e3eaaf3fd1eb546913f3fdaa5bf268"} Jan 20 18:47:56 crc kubenswrapper[4773]: I0120 18:47:56.126271 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-75zzb" Jan 20 18:47:56 crc kubenswrapper[4773]: I0120 18:47:56.131213 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-666b6646f7-tz8rp" Jan 20 18:47:56 crc kubenswrapper[4773]: I0120 18:47:56.131363 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-666b6646f7-tz8rp" event={"ID":"a00a537a-172f-4ec7-9573-dd9ac2f347e3","Type":"ContainerDied","Data":"c0ded801e725f7fd29cc632ebbd05140648619b5c0328313794582eb4a1791cf"} Jan 20 18:47:56 crc kubenswrapper[4773]: I0120 18:47:56.189669 4773 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/22a48624-e6b5-4225-baf3-c05ff3bed80d-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 20 18:47:56 crc kubenswrapper[4773]: I0120 18:47:56.189702 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d5ldb\" (UniqueName: \"kubernetes.io/projected/a00a537a-172f-4ec7-9573-dd9ac2f347e3-kube-api-access-d5ldb\") on node \"crc\" DevicePath \"\"" Jan 20 18:47:56 crc kubenswrapper[4773]: I0120 18:47:56.189714 4773 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a00a537a-172f-4ec7-9573-dd9ac2f347e3-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 20 18:47:56 crc kubenswrapper[4773]: I0120 18:47:56.189724 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a00a537a-172f-4ec7-9573-dd9ac2f347e3-config\") on node \"crc\" DevicePath \"\"" Jan 20 18:47:56 crc kubenswrapper[4773]: I0120 18:47:56.189736 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22a48624-e6b5-4225-baf3-c05ff3bed80d-config\") on node \"crc\" DevicePath \"\"" Jan 20 18:47:56 crc kubenswrapper[4773]: I0120 18:47:56.189747 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qpxj5\" (UniqueName: \"kubernetes.io/projected/22a48624-e6b5-4225-baf3-c05ff3bed80d-kube-api-access-qpxj5\") on node \"crc\" DevicePath \"\"" Jan 20 18:47:56 crc kubenswrapper[4773]: 
I0120 18:47:56.206998 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-tz8rp"]
Jan 20 18:47:56 crc kubenswrapper[4773]: I0120 18:47:56.217526 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-666b6646f7-tz8rp"]
Jan 20 18:47:56 crc kubenswrapper[4773]: I0120 18:47:56.241535 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-75zzb"]
Jan 20 18:47:56 crc kubenswrapper[4773]: I0120 18:47:56.249861 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-75zzb"]
Jan 20 18:47:56 crc kubenswrapper[4773]: I0120 18:47:56.279482 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-g2pwb"]
Jan 20 18:47:56 crc kubenswrapper[4773]: W0120 18:47:56.367376 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda5ceb1c5_1dbc_4810_95c9_c1ac0b915542.slice/crio-68de05642559e2a85e816bf48496d9b70430042e2fecd02bcf383202d6aa65a0 WatchSource:0}: Error finding container 68de05642559e2a85e816bf48496d9b70430042e2fecd02bcf383202d6aa65a0: Status 404 returned error can't find the container with id 68de05642559e2a85e816bf48496d9b70430042e2fecd02bcf383202d6aa65a0
Jan 20 18:47:56 crc kubenswrapper[4773]: I0120 18:47:56.375092 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-xs9zd"]
Jan 20 18:47:56 crc kubenswrapper[4773]: I0120 18:47:56.439021 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0"
Jan 20 18:47:56 crc kubenswrapper[4773]: I0120 18:47:56.517640 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0"
Jan 20 18:47:56 crc kubenswrapper[4773]: I0120 18:47:56.543119 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-k2gpg"]
Jan 20 18:47:57 crc kubenswrapper[4773]: I0120 18:47:57.152498 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"bfe9133c-0d58-4877-97ee-5b0abeee1a95","Type":"ContainerStarted","Data":"bb1a743b0011c8852a174782d50865209e340613e556f18abff865390cc51c2b"}
Jan 20 18:47:57 crc kubenswrapper[4773]: I0120 18:47:57.155546 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"cb8cda87-65c5-4be7-9891-b82bcfc8e0d4","Type":"ContainerStarted","Data":"fd5c88904eb01081abb48040c0407ee8e8b2891234254b33e7cfd8a35fd7f534"}
Jan 20 18:47:57 crc kubenswrapper[4773]: I0120 18:47:57.157098 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0"
Jan 20 18:47:57 crc kubenswrapper[4773]: I0120 18:47:57.165858 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-k2gpg" event={"ID":"aba9326a-e499-43a8-9f50-4dc29d62c960","Type":"ContainerStarted","Data":"ce5e0274d2b85b8e9dc275c7b174cafb35ea6aacac9812349bb550681a566a4f"}
Jan 20 18:47:57 crc kubenswrapper[4773]: I0120 18:47:57.166157 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-k2gpg" event={"ID":"aba9326a-e499-43a8-9f50-4dc29d62c960","Type":"ContainerStarted","Data":"4264d71b8e07e5eb9a8f6822d3dd22ead9577270d589def4baeb9a9c2e4760f2"}
Jan 20 18:47:57 crc kubenswrapper[4773]: I0120 18:47:57.172251 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-g2pwb" event={"ID":"f7c93b98-cee9-4ca4-af53-0a939fece59b","Type":"ContainerStarted","Data":"3003e8f2a6f8e43256f25d0dfdf7a30ceb44aa655f3176c78cbaf7efb525a7fb"}
Jan 20 18:47:57 crc kubenswrapper[4773]: I0120 18:47:57.172308 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-g2pwb" event={"ID":"f7c93b98-cee9-4ca4-af53-0a939fece59b","Type":"ContainerStarted","Data":"0ac79f3c39c06ac9f58156f40c2aaee8557c1dbe4bc8966a0b68c584c793d8d2"}
Jan 20 18:47:57 crc kubenswrapper[4773]: I0120 18:47:57.176352 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-xs9zd" event={"ID":"a5ceb1c5-1dbc-4810-95c9-c1ac0b915542","Type":"ContainerStarted","Data":"c5b27e44cc2587a9ad27aa31efc4210c595b01c54234dff6e697ee360829be1d"}
Jan 20 18:47:57 crc kubenswrapper[4773]: I0120 18:47:57.176386 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-xs9zd" event={"ID":"a5ceb1c5-1dbc-4810-95c9-c1ac0b915542","Type":"ContainerStarted","Data":"68de05642559e2a85e816bf48496d9b70430042e2fecd02bcf383202d6aa65a0"}
Jan 20 18:47:57 crc kubenswrapper[4773]: I0120 18:47:57.207473 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=2.385477634 podStartE2EDuration="43.207456603s" podCreationTimestamp="2026-01-20 18:47:14 +0000 UTC" firstStartedPulling="2026-01-20 18:47:15.639781241 +0000 UTC m=+1028.561594275" lastFinishedPulling="2026-01-20 18:47:56.46176022 +0000 UTC m=+1069.383573244" observedRunningTime="2026-01-20 18:47:57.204387689 +0000 UTC m=+1070.126200733" watchObservedRunningTime="2026-01-20 18:47:57.207456603 +0000 UTC m=+1070.129269627"
Jan 20 18:47:57 crc kubenswrapper[4773]: I0120 18:47:57.237558 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0"
Jan 20 18:47:57 crc kubenswrapper[4773]: I0120 18:47:57.250162 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-xs9zd" podStartSLOduration=2.250141862 podStartE2EDuration="2.250141862s" podCreationTimestamp="2026-01-20 18:47:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:47:57.226507623 +0000 UTC m=+1070.148320647" watchObservedRunningTime="2026-01-20 18:47:57.250141862 +0000 UTC m=+1070.171954896"
Jan 20 18:47:57 crc kubenswrapper[4773]: I0120 18:47:57.459143 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22a48624-e6b5-4225-baf3-c05ff3bed80d" path="/var/lib/kubelet/pods/22a48624-e6b5-4225-baf3-c05ff3bed80d/volumes"
Jan 20 18:47:57 crc kubenswrapper[4773]: I0120 18:47:57.459713 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a00a537a-172f-4ec7-9573-dd9ac2f347e3" path="/var/lib/kubelet/pods/a00a537a-172f-4ec7-9573-dd9ac2f347e3/volumes"
Jan 20 18:47:57 crc kubenswrapper[4773]: I0120 18:47:57.522348 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"]
Jan 20 18:47:57 crc kubenswrapper[4773]: I0120 18:47:57.523577 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0"
Jan 20 18:47:57 crc kubenswrapper[4773]: I0120 18:47:57.525547 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs"
Jan 20 18:47:57 crc kubenswrapper[4773]: I0120 18:47:57.525914 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-7w57g"
Jan 20 18:47:57 crc kubenswrapper[4773]: I0120 18:47:57.526097 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts"
Jan 20 18:47:57 crc kubenswrapper[4773]: I0120 18:47:57.527397 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config"
Jan 20 18:47:57 crc kubenswrapper[4773]: I0120 18:47:57.539262 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"]
Jan 20 18:47:57 crc kubenswrapper[4773]: I0120 18:47:57.625560 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/152ecb39-d580-4c8d-b572-e3a6bb070c7f-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"152ecb39-d580-4c8d-b572-e3a6bb070c7f\") " pod="openstack/ovn-northd-0"
Jan 20 18:47:57 crc kubenswrapper[4773]: I0120 18:47:57.625638 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/152ecb39-d580-4c8d-b572-e3a6bb070c7f-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"152ecb39-d580-4c8d-b572-e3a6bb070c7f\") " pod="openstack/ovn-northd-0"
Jan 20 18:47:57 crc kubenswrapper[4773]: I0120 18:47:57.625663 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/152ecb39-d580-4c8d-b572-e3a6bb070c7f-config\") pod \"ovn-northd-0\" (UID: \"152ecb39-d580-4c8d-b572-e3a6bb070c7f\") " pod="openstack/ovn-northd-0"
Jan 20 18:47:57 crc kubenswrapper[4773]: I0120 18:47:57.625691 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/152ecb39-d580-4c8d-b572-e3a6bb070c7f-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"152ecb39-d580-4c8d-b572-e3a6bb070c7f\") " pod="openstack/ovn-northd-0"
Jan 20 18:47:57 crc kubenswrapper[4773]: I0120 18:47:57.625710 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/152ecb39-d580-4c8d-b572-e3a6bb070c7f-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"152ecb39-d580-4c8d-b572-e3a6bb070c7f\") " pod="openstack/ovn-northd-0"
Jan 20 18:47:57 crc kubenswrapper[4773]: I0120 18:47:57.625724 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drmfz\" (UniqueName: \"kubernetes.io/projected/152ecb39-d580-4c8d-b572-e3a6bb070c7f-kube-api-access-drmfz\") pod \"ovn-northd-0\" (UID: \"152ecb39-d580-4c8d-b572-e3a6bb070c7f\") " pod="openstack/ovn-northd-0"
Jan 20 18:47:57 crc kubenswrapper[4773]: I0120 18:47:57.626038 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/152ecb39-d580-4c8d-b572-e3a6bb070c7f-scripts\") pod \"ovn-northd-0\" (UID: \"152ecb39-d580-4c8d-b572-e3a6bb070c7f\") " pod="openstack/ovn-northd-0"
Jan 20 18:47:57 crc kubenswrapper[4773]: I0120 18:47:57.727656 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/152ecb39-d580-4c8d-b572-e3a6bb070c7f-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"152ecb39-d580-4c8d-b572-e3a6bb070c7f\") " pod="openstack/ovn-northd-0"
Jan 20 18:47:57 crc kubenswrapper[4773]: I0120 18:47:57.727724 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/152ecb39-d580-4c8d-b572-e3a6bb070c7f-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"152ecb39-d580-4c8d-b572-e3a6bb070c7f\") " pod="openstack/ovn-northd-0"
Jan 20 18:47:57 crc kubenswrapper[4773]: I0120 18:47:57.727777 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/152ecb39-d580-4c8d-b572-e3a6bb070c7f-config\") pod \"ovn-northd-0\" (UID: \"152ecb39-d580-4c8d-b572-e3a6bb070c7f\") " pod="openstack/ovn-northd-0"
Jan 20 18:47:57 crc kubenswrapper[4773]: I0120 18:47:57.727815 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/152ecb39-d580-4c8d-b572-e3a6bb070c7f-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"152ecb39-d580-4c8d-b572-e3a6bb070c7f\") " pod="openstack/ovn-northd-0"
Jan 20 18:47:57 crc kubenswrapper[4773]: I0120 18:47:57.727842 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/152ecb39-d580-4c8d-b572-e3a6bb070c7f-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"152ecb39-d580-4c8d-b572-e3a6bb070c7f\") " pod="openstack/ovn-northd-0"
Jan 20 18:47:57 crc kubenswrapper[4773]: I0120 18:47:57.727858 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drmfz\" (UniqueName: \"kubernetes.io/projected/152ecb39-d580-4c8d-b572-e3a6bb070c7f-kube-api-access-drmfz\") pod \"ovn-northd-0\" (UID: \"152ecb39-d580-4c8d-b572-e3a6bb070c7f\") " pod="openstack/ovn-northd-0"
Jan 20 18:47:57 crc kubenswrapper[4773]: I0120 18:47:57.727921 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/152ecb39-d580-4c8d-b572-e3a6bb070c7f-scripts\") pod \"ovn-northd-0\" (UID: \"152ecb39-d580-4c8d-b572-e3a6bb070c7f\") " pod="openstack/ovn-northd-0"
Jan 20 18:47:57 crc kubenswrapper[4773]: I0120 18:47:57.728686 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/152ecb39-d580-4c8d-b572-e3a6bb070c7f-scripts\") pod \"ovn-northd-0\" (UID: \"152ecb39-d580-4c8d-b572-e3a6bb070c7f\") " pod="openstack/ovn-northd-0"
Jan 20 18:47:57 crc kubenswrapper[4773]: I0120 18:47:57.728750 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/152ecb39-d580-4c8d-b572-e3a6bb070c7f-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"152ecb39-d580-4c8d-b572-e3a6bb070c7f\") " pod="openstack/ovn-northd-0"
Jan 20 18:47:57 crc kubenswrapper[4773]: I0120 18:47:57.728782 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/152ecb39-d580-4c8d-b572-e3a6bb070c7f-config\") pod \"ovn-northd-0\" (UID: \"152ecb39-d580-4c8d-b572-e3a6bb070c7f\") " pod="openstack/ovn-northd-0"
Jan 20 18:47:57 crc kubenswrapper[4773]: I0120 18:47:57.733074 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/152ecb39-d580-4c8d-b572-e3a6bb070c7f-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"152ecb39-d580-4c8d-b572-e3a6bb070c7f\") " pod="openstack/ovn-northd-0"
Jan 20 18:47:57 crc kubenswrapper[4773]: I0120 18:47:57.733182 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/152ecb39-d580-4c8d-b572-e3a6bb070c7f-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"152ecb39-d580-4c8d-b572-e3a6bb070c7f\") " pod="openstack/ovn-northd-0"
Jan 20 18:47:57 crc kubenswrapper[4773]: I0120 18:47:57.738662 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/152ecb39-d580-4c8d-b572-e3a6bb070c7f-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"152ecb39-d580-4c8d-b572-e3a6bb070c7f\") " pod="openstack/ovn-northd-0"
Jan 20 18:47:57 crc kubenswrapper[4773]: I0120 18:47:57.749392 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drmfz\" (UniqueName: \"kubernetes.io/projected/152ecb39-d580-4c8d-b572-e3a6bb070c7f-kube-api-access-drmfz\") pod \"ovn-northd-0\" (UID: \"152ecb39-d580-4c8d-b572-e3a6bb070c7f\") " pod="openstack/ovn-northd-0"
Jan 20 18:47:57 crc kubenswrapper[4773]: I0120 18:47:57.869495 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0"
Jan 20 18:47:58 crc kubenswrapper[4773]: I0120 18:47:58.186798 4773 generic.go:334] "Generic (PLEG): container finished" podID="aba9326a-e499-43a8-9f50-4dc29d62c960" containerID="ce5e0274d2b85b8e9dc275c7b174cafb35ea6aacac9812349bb550681a566a4f" exitCode=0
Jan 20 18:47:58 crc kubenswrapper[4773]: I0120 18:47:58.187107 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-k2gpg" event={"ID":"aba9326a-e499-43a8-9f50-4dc29d62c960","Type":"ContainerDied","Data":"ce5e0274d2b85b8e9dc275c7b174cafb35ea6aacac9812349bb550681a566a4f"}
Jan 20 18:47:58 crc kubenswrapper[4773]: I0120 18:47:58.187201 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-k2gpg" event={"ID":"aba9326a-e499-43a8-9f50-4dc29d62c960","Type":"ContainerStarted","Data":"ea4e2f85245dbb3b24a2d2a6ed359fd69004c830d495d4be0b7bfbd5a1629207"}
Jan 20 18:47:58 crc kubenswrapper[4773]: I0120 18:47:58.187282 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-86db49b7ff-k2gpg"
Jan 20 18:47:58 crc kubenswrapper[4773]: I0120 18:47:58.189603 4773 generic.go:334] "Generic (PLEG): container finished" podID="f7c93b98-cee9-4ca4-af53-0a939fece59b" containerID="3003e8f2a6f8e43256f25d0dfdf7a30ceb44aa655f3176c78cbaf7efb525a7fb" exitCode=0
Jan 20 18:47:58 crc kubenswrapper[4773]: I0120 18:47:58.189674 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-g2pwb" event={"ID":"f7c93b98-cee9-4ca4-af53-0a939fece59b","Type":"ContainerDied","Data":"3003e8f2a6f8e43256f25d0dfdf7a30ceb44aa655f3176c78cbaf7efb525a7fb"}
Jan 20 18:47:58 crc kubenswrapper[4773]: I0120 18:47:58.189734 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-g2pwb" event={"ID":"f7c93b98-cee9-4ca4-af53-0a939fece59b","Type":"ContainerStarted","Data":"d3cfa0f49088280d9d475652e75cba42683f348c0175eab4bc89b947fc53c9ea"}
Jan 20 18:47:58 crc kubenswrapper[4773]: I0120 18:47:58.190860 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7f896c8c65-g2pwb"
Jan 20 18:47:58 crc kubenswrapper[4773]: I0120 18:47:58.212119 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-86db49b7ff-k2gpg" podStartSLOduration=2.7823644610000002 podStartE2EDuration="3.212103228s" podCreationTimestamp="2026-01-20 18:47:55 +0000 UTC" firstStartedPulling="2026-01-20 18:47:56.556553436 +0000 UTC m=+1069.478366470" lastFinishedPulling="2026-01-20 18:47:56.986292223 +0000 UTC m=+1069.908105237" observedRunningTime="2026-01-20 18:47:58.209187938 +0000 UTC m=+1071.131000962" watchObservedRunningTime="2026-01-20 18:47:58.212103228 +0000 UTC m=+1071.133916242"
Jan 20 18:47:58 crc kubenswrapper[4773]: I0120 18:47:58.234597 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7f896c8c65-g2pwb" podStartSLOduration=2.605404284 podStartE2EDuration="3.234579149s" podCreationTimestamp="2026-01-20 18:47:55 +0000 UTC" firstStartedPulling="2026-01-20 18:47:56.291322162 +0000 UTC m=+1069.213135186" lastFinishedPulling="2026-01-20 18:47:56.920497027 +0000 UTC m=+1069.842310051" observedRunningTime="2026-01-20 18:47:58.231437584 +0000 UTC m=+1071.153250608" watchObservedRunningTime="2026-01-20 18:47:58.234579149 +0000 UTC m=+1071.156392173"
Jan 20 18:47:58 crc kubenswrapper[4773]: I0120 18:47:58.396184 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"]
Jan 20 18:47:59 crc kubenswrapper[4773]: I0120 18:47:59.197766 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"152ecb39-d580-4c8d-b572-e3a6bb070c7f","Type":"ContainerStarted","Data":"38ea9378adbfaecb468fa8a1b251573fd8651ef7cc4484900b8aebbd8dc7d654"}
Jan 20 18:47:59 crc kubenswrapper[4773]: I0120 18:47:59.476692 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0"
Jan 20 18:47:59 crc kubenswrapper[4773]: I0120 18:47:59.565558 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0"
Jan 20 18:48:00 crc kubenswrapper[4773]: I0120 18:48:00.214691 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"d3923c4a-e72e-4b8e-bdd8-c8f2b7c443ae","Type":"ContainerStarted","Data":"0bdff56d2b0b1edc8aba0e714c028d481d32cddd81d130ee599836ec70c75382"}
Jan 20 18:48:00 crc kubenswrapper[4773]: I0120 18:48:00.215252 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0"
Jan 20 18:48:00 crc kubenswrapper[4773]: I0120 18:48:00.216073 4773 generic.go:334] "Generic (PLEG): container finished" podID="bfe9133c-0d58-4877-97ee-5b0abeee1a95" containerID="bb1a743b0011c8852a174782d50865209e340613e556f18abff865390cc51c2b" exitCode=0
Jan 20 18:48:00 crc kubenswrapper[4773]: I0120 18:48:00.216144 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"bfe9133c-0d58-4877-97ee-5b0abeee1a95","Type":"ContainerDied","Data":"bb1a743b0011c8852a174782d50865209e340613e556f18abff865390cc51c2b"}
Jan 20 18:48:00 crc kubenswrapper[4773]: I0120 18:48:00.220742 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"152ecb39-d580-4c8d-b572-e3a6bb070c7f","Type":"ContainerStarted","Data":"e1fe5e0dd3e927154800bb7ab92dd110d34928df22cd240619a05e378f41b981"}
Jan 20 18:48:00 crc kubenswrapper[4773]: I0120 18:48:00.220768 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0"
Jan 20 18:48:00 crc kubenswrapper[4773]: I0120 18:48:00.220779 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"152ecb39-d580-4c8d-b572-e3a6bb070c7f","Type":"ContainerStarted","Data":"a9882d23a0b277ed94eaad4fc602a61a222cefd16c506c6f53e6a6e88ed8b7d3"}
Jan 20 18:48:00 crc kubenswrapper[4773]: I0120 18:48:00.236568 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.622350898 podStartE2EDuration="44.236553832s" podCreationTimestamp="2026-01-20 18:47:16 +0000 UTC" firstStartedPulling="2026-01-20 18:47:17.580255231 +0000 UTC m=+1030.502068255" lastFinishedPulling="2026-01-20 18:47:59.194458165 +0000 UTC m=+1072.116271189" observedRunningTime="2026-01-20 18:48:00.233827796 +0000 UTC m=+1073.155640820" watchObservedRunningTime="2026-01-20 18:48:00.236553832 +0000 UTC m=+1073.158366846"
Jan 20 18:48:00 crc kubenswrapper[4773]: I0120 18:48:00.301888 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.035947434 podStartE2EDuration="3.301862896s" podCreationTimestamp="2026-01-20 18:47:57 +0000 UTC" firstStartedPulling="2026-01-20 18:47:58.483152341 +0000 UTC m=+1071.404965365" lastFinishedPulling="2026-01-20 18:47:59.749067803 +0000 UTC m=+1072.670880827" observedRunningTime="2026-01-20 18:48:00.287451348 +0000 UTC m=+1073.209264382" watchObservedRunningTime="2026-01-20 18:48:00.301862896 +0000 UTC m=+1073.223675920"
Jan 20 18:48:01 crc kubenswrapper[4773]: I0120 18:48:01.228852 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"bfe9133c-0d58-4877-97ee-5b0abeee1a95","Type":"ContainerStarted","Data":"5522656ebb75af39ada2f53370d115ae8f0a73bc77151e606425d9a05af78373"}
Jan 20 18:48:01 crc kubenswrapper[4773]: I0120 18:48:01.255576 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=-9223371988.599232 podStartE2EDuration="48.255544742s" podCreationTimestamp="2026-01-20 18:47:13 +0000 UTC" firstStartedPulling="2026-01-20 18:47:15.588351512 +0000 UTC m=+1028.510164536" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:48:01.246786081 +0000 UTC m=+1074.168599115" watchObservedRunningTime="2026-01-20 18:48:01.255544742 +0000 UTC m=+1074.177357786"
Jan 20 18:48:02 crc kubenswrapper[4773]: I0120 18:48:02.122878 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-dnds2"]
Jan 20 18:48:02 crc kubenswrapper[4773]: I0120 18:48:02.124209 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-dnds2"
Jan 20 18:48:02 crc kubenswrapper[4773]: I0120 18:48:02.129324 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret"
Jan 20 18:48:02 crc kubenswrapper[4773]: I0120 18:48:02.133778 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-dnds2"]
Jan 20 18:48:02 crc kubenswrapper[4773]: I0120 18:48:02.203347 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ce796025-4e2f-439c-9fab-20c8295a792c-operator-scripts\") pod \"root-account-create-update-dnds2\" (UID: \"ce796025-4e2f-439c-9fab-20c8295a792c\") " pod="openstack/root-account-create-update-dnds2"
Jan 20 18:48:02 crc kubenswrapper[4773]: I0120 18:48:02.203452 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5s2gq\" (UniqueName: \"kubernetes.io/projected/ce796025-4e2f-439c-9fab-20c8295a792c-kube-api-access-5s2gq\") pod \"root-account-create-update-dnds2\" (UID: \"ce796025-4e2f-439c-9fab-20c8295a792c\") " pod="openstack/root-account-create-update-dnds2"
Jan 20 18:48:02 crc kubenswrapper[4773]: I0120 18:48:02.304816 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ce796025-4e2f-439c-9fab-20c8295a792c-operator-scripts\") pod \"root-account-create-update-dnds2\" (UID: \"ce796025-4e2f-439c-9fab-20c8295a792c\") " pod="openstack/root-account-create-update-dnds2"
Jan 20 18:48:02 crc kubenswrapper[4773]: I0120 18:48:02.304916 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5s2gq\" (UniqueName: \"kubernetes.io/projected/ce796025-4e2f-439c-9fab-20c8295a792c-kube-api-access-5s2gq\") pod \"root-account-create-update-dnds2\" (UID: \"ce796025-4e2f-439c-9fab-20c8295a792c\") " pod="openstack/root-account-create-update-dnds2"
Jan 20 18:48:02 crc kubenswrapper[4773]: I0120 18:48:02.306099 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ce796025-4e2f-439c-9fab-20c8295a792c-operator-scripts\") pod \"root-account-create-update-dnds2\" (UID: \"ce796025-4e2f-439c-9fab-20c8295a792c\") " pod="openstack/root-account-create-update-dnds2"
Jan 20 18:48:02 crc kubenswrapper[4773]: I0120 18:48:02.324649 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5s2gq\" (UniqueName: \"kubernetes.io/projected/ce796025-4e2f-439c-9fab-20c8295a792c-kube-api-access-5s2gq\") pod \"root-account-create-update-dnds2\" (UID: \"ce796025-4e2f-439c-9fab-20c8295a792c\") " pod="openstack/root-account-create-update-dnds2"
Jan 20 18:48:02 crc kubenswrapper[4773]: I0120 18:48:02.444919 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-dnds2"
Jan 20 18:48:02 crc kubenswrapper[4773]: I0120 18:48:02.878573 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-dnds2"]
Jan 20 18:48:03 crc kubenswrapper[4773]: I0120 18:48:03.242515 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-dnds2" event={"ID":"ce796025-4e2f-439c-9fab-20c8295a792c","Type":"ContainerStarted","Data":"b80da81735f34173a89a48a7d319907472e0d78062e31cce94b57e8d8226a307"}
Jan 20 18:48:04 crc kubenswrapper[4773]: I0120 18:48:04.252469 4773 generic.go:334] "Generic (PLEG): container finished" podID="ce796025-4e2f-439c-9fab-20c8295a792c" containerID="fe986dbc9aa7abb1946cbbaf36610eba367f6b9655e2f6cf2645119cfbe827cd" exitCode=0
Jan 20 18:48:04 crc kubenswrapper[4773]: I0120 18:48:04.252532 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-dnds2" event={"ID":"ce796025-4e2f-439c-9fab-20c8295a792c","Type":"ContainerDied","Data":"fe986dbc9aa7abb1946cbbaf36610eba367f6b9655e2f6cf2645119cfbe827cd"}
Jan 20 18:48:04 crc kubenswrapper[4773]: I0120 18:48:04.827027 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-xjqwr"]
Jan 20 18:48:04 crc kubenswrapper[4773]: I0120 18:48:04.828536 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-xjqwr"
Jan 20 18:48:04 crc kubenswrapper[4773]: I0120 18:48:04.839510 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-xjqwr"]
Jan 20 18:48:04 crc kubenswrapper[4773]: I0120 18:48:04.868066 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0"
Jan 20 18:48:04 crc kubenswrapper[4773]: I0120 18:48:04.868122 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0"
Jan 20 18:48:04 crc kubenswrapper[4773]: I0120 18:48:04.926053 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-fb33-account-create-update-2nkdm"]
Jan 20 18:48:04 crc kubenswrapper[4773]: I0120 18:48:04.927196 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-fb33-account-create-update-2nkdm"
Jan 20 18:48:04 crc kubenswrapper[4773]: I0120 18:48:04.933592 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret"
Jan 20 18:48:04 crc kubenswrapper[4773]: I0120 18:48:04.934444 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-fb33-account-create-update-2nkdm"]
Jan 20 18:48:04 crc kubenswrapper[4773]: I0120 18:48:04.949681 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba7d65a8-afba-4e1a-a7e6-b9483c97fdcb-operator-scripts\") pod \"keystone-db-create-xjqwr\" (UID: \"ba7d65a8-afba-4e1a-a7e6-b9483c97fdcb\") " pod="openstack/keystone-db-create-xjqwr"
Jan 20 18:48:04 crc kubenswrapper[4773]: I0120 18:48:04.949738 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnkqw\" (UniqueName: \"kubernetes.io/projected/ba7d65a8-afba-4e1a-a7e6-b9483c97fdcb-kube-api-access-qnkqw\") pod \"keystone-db-create-xjqwr\" (UID: \"ba7d65a8-afba-4e1a-a7e6-b9483c97fdcb\") " pod="openstack/keystone-db-create-xjqwr"
Jan 20 18:48:04 crc kubenswrapper[4773]: I0120 18:48:04.969858 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0"
Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.051881 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c6e707f5-41a8-43c6-976a-7a9645c0b0ca-operator-scripts\") pod \"keystone-fb33-account-create-update-2nkdm\" (UID: \"c6e707f5-41a8-43c6-976a-7a9645c0b0ca\") " pod="openstack/keystone-fb33-account-create-update-2nkdm"
Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.052321 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba7d65a8-afba-4e1a-a7e6-b9483c97fdcb-operator-scripts\") pod \"keystone-db-create-xjqwr\" (UID: \"ba7d65a8-afba-4e1a-a7e6-b9483c97fdcb\") " pod="openstack/keystone-db-create-xjqwr"
Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.052381 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wl85\" (UniqueName: \"kubernetes.io/projected/c6e707f5-41a8-43c6-976a-7a9645c0b0ca-kube-api-access-7wl85\") pod \"keystone-fb33-account-create-update-2nkdm\" (UID: \"c6e707f5-41a8-43c6-976a-7a9645c0b0ca\") " pod="openstack/keystone-fb33-account-create-update-2nkdm"
Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.052459 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnkqw\" (UniqueName: \"kubernetes.io/projected/ba7d65a8-afba-4e1a-a7e6-b9483c97fdcb-kube-api-access-qnkqw\") pod \"keystone-db-create-xjqwr\" (UID: \"ba7d65a8-afba-4e1a-a7e6-b9483c97fdcb\") " pod="openstack/keystone-db-create-xjqwr"
Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.053206 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba7d65a8-afba-4e1a-a7e6-b9483c97fdcb-operator-scripts\") pod \"keystone-db-create-xjqwr\" (UID: \"ba7d65a8-afba-4e1a-a7e6-b9483c97fdcb\") " pod="openstack/keystone-db-create-xjqwr"
Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.070571 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnkqw\" (UniqueName: \"kubernetes.io/projected/ba7d65a8-afba-4e1a-a7e6-b9483c97fdcb-kube-api-access-qnkqw\") pod \"keystone-db-create-xjqwr\" (UID: \"ba7d65a8-afba-4e1a-a7e6-b9483c97fdcb\") " pod="openstack/keystone-db-create-xjqwr"
Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.143984 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-xjqwr"
Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.154269 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wl85\" (UniqueName: \"kubernetes.io/projected/c6e707f5-41a8-43c6-976a-7a9645c0b0ca-kube-api-access-7wl85\") pod \"keystone-fb33-account-create-update-2nkdm\" (UID: \"c6e707f5-41a8-43c6-976a-7a9645c0b0ca\") " pod="openstack/keystone-fb33-account-create-update-2nkdm"
Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.154358 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c6e707f5-41a8-43c6-976a-7a9645c0b0ca-operator-scripts\") pod \"keystone-fb33-account-create-update-2nkdm\" (UID: \"c6e707f5-41a8-43c6-976a-7a9645c0b0ca\") " pod="openstack/keystone-fb33-account-create-update-2nkdm"
Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.155063 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c6e707f5-41a8-43c6-976a-7a9645c0b0ca-operator-scripts\") pod \"keystone-fb33-account-create-update-2nkdm\" (UID: \"c6e707f5-41a8-43c6-976a-7a9645c0b0ca\") " pod="openstack/keystone-fb33-account-create-update-2nkdm"
Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.167302 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-8pd22"]
Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.168302 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-8pd22"
Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.173868 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wl85\" (UniqueName: \"kubernetes.io/projected/c6e707f5-41a8-43c6-976a-7a9645c0b0ca-kube-api-access-7wl85\") pod \"keystone-fb33-account-create-update-2nkdm\" (UID: \"c6e707f5-41a8-43c6-976a-7a9645c0b0ca\") " pod="openstack/keystone-fb33-account-create-update-2nkdm"
Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.185845 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-8pd22"]
Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.243080 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-fb33-account-create-update-2nkdm"
Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.287237 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-f756-account-create-update-tlxkm"]
Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.288241 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-f756-account-create-update-tlxkm"
Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.292225 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret"
Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.299321 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-f756-account-create-update-tlxkm"]
Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.303108 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0"
Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.357254 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqzds\" (UniqueName: \"kubernetes.io/projected/0c64cf4d-562e-4a78-a22b-d682436d5db3-kube-api-access-nqzds\") pod \"placement-db-create-8pd22\" (UID: \"0c64cf4d-562e-4a78-a22b-d682436d5db3\") " pod="openstack/placement-db-create-8pd22"
Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.357384 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c64cf4d-562e-4a78-a22b-d682436d5db3-operator-scripts\") pod \"placement-db-create-8pd22\" (UID: \"0c64cf4d-562e-4a78-a22b-d682436d5db3\") " pod="openstack/placement-db-create-8pd22"
Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.390964 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0"
Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.458288 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqzds\" (UniqueName: \"kubernetes.io/projected/0c64cf4d-562e-4a78-a22b-d682436d5db3-kube-api-access-nqzds\") pod \"placement-db-create-8pd22\" (UID: \"0c64cf4d-562e-4a78-a22b-d682436d5db3\") " pod="openstack/placement-db-create-8pd22"
Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.458439 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/484e46fc-ebda-496a-9884-295fcd065e9b-operator-scripts\") pod \"placement-f756-account-create-update-tlxkm\" (UID: \"484e46fc-ebda-496a-9884-295fcd065e9b\") " pod="openstack/placement-f756-account-create-update-tlxkm"
Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.458526 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9d9d\" (UniqueName: \"kubernetes.io/projected/484e46fc-ebda-496a-9884-295fcd065e9b-kube-api-access-p9d9d\") pod \"placement-f756-account-create-update-tlxkm\" (UID: \"484e46fc-ebda-496a-9884-295fcd065e9b\") " pod="openstack/placement-f756-account-create-update-tlxkm"
Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.458655 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c64cf4d-562e-4a78-a22b-d682436d5db3-operator-scripts\") pod \"placement-db-create-8pd22\" (UID: \"0c64cf4d-562e-4a78-a22b-d682436d5db3\") " pod="openstack/placement-db-create-8pd22"
Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.463354 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c64cf4d-562e-4a78-a22b-d682436d5db3-operator-scripts\") pod \"placement-db-create-8pd22\" (UID: \"0c64cf4d-562e-4a78-a22b-d682436d5db3\") " pod="openstack/placement-db-create-8pd22"
Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.477347 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqzds\" (UniqueName: \"kubernetes.io/projected/0c64cf4d-562e-4a78-a22b-d682436d5db3-kube-api-access-nqzds\") pod \"placement-db-create-8pd22\" (UID: \"0c64cf4d-562e-4a78-a22b-d682436d5db3\") " pod="openstack/placement-db-create-8pd22"
Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.527655 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-8bf57"]
Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.529882 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-8bf57"
Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.535141 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-8bf57"]
Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.568110 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/484e46fc-ebda-496a-9884-295fcd065e9b-operator-scripts\") pod \"placement-f756-account-create-update-tlxkm\" (UID: \"484e46fc-ebda-496a-9884-295fcd065e9b\") " pod="openstack/placement-f756-account-create-update-tlxkm"
Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.568171 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p9d9d\" (UniqueName: \"kubernetes.io/projected/484e46fc-ebda-496a-9884-295fcd065e9b-kube-api-access-p9d9d\") pod \"placement-f756-account-create-update-tlxkm\" (UID: \"484e46fc-ebda-496a-9884-295fcd065e9b\") " pod="openstack/placement-f756-account-create-update-tlxkm"
Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.569997 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/484e46fc-ebda-496a-9884-295fcd065e9b-operator-scripts\") pod \"placement-f756-account-create-update-tlxkm\" (UID: \"484e46fc-ebda-496a-9884-295fcd065e9b\") " pod="openstack/placement-f756-account-create-update-tlxkm"
Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.594810 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9d9d\" (UniqueName: 
\"kubernetes.io/projected/484e46fc-ebda-496a-9884-295fcd065e9b-kube-api-access-p9d9d\") pod \"placement-f756-account-create-update-tlxkm\" (UID: \"484e46fc-ebda-496a-9884-295fcd065e9b\") " pod="openstack/placement-f756-account-create-update-tlxkm" Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.597170 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-18ee-account-create-update-llcxn"] Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.598431 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-18ee-account-create-update-llcxn" Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.600890 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.611723 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-18ee-account-create-update-llcxn"] Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.643161 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-8pd22" Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.650860 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-xjqwr"] Jan 20 18:48:05 crc kubenswrapper[4773]: W0120 18:48:05.661806 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podba7d65a8_afba_4e1a_a7e6_b9483c97fdcb.slice/crio-a987ff03a347605bc29a2925c308ab5d8a186e07da55624ddd1c1c8879b773d9 WatchSource:0}: Error finding container a987ff03a347605bc29a2925c308ab5d8a186e07da55624ddd1c1c8879b773d9: Status 404 returned error can't find the container with id a987ff03a347605bc29a2925c308ab5d8a186e07da55624ddd1c1c8879b773d9 Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.661885 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-f756-account-create-update-tlxkm" Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.672391 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/30e17cf6-4b43-4ac5-bc0f-0b7850ad04ea-operator-scripts\") pod \"glance-db-create-8bf57\" (UID: \"30e17cf6-4b43-4ac5-bc0f-0b7850ad04ea\") " pod="openstack/glance-db-create-8bf57" Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.672453 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdgx7\" (UniqueName: \"kubernetes.io/projected/30e17cf6-4b43-4ac5-bc0f-0b7850ad04ea-kube-api-access-fdgx7\") pod \"glance-db-create-8bf57\" (UID: \"30e17cf6-4b43-4ac5-bc0f-0b7850ad04ea\") " pod="openstack/glance-db-create-8bf57" Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.694565 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-dnds2" Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.774059 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2ce8f955-26cb-4860-afc1-effceac1d7a4-operator-scripts\") pod \"glance-18ee-account-create-update-llcxn\" (UID: \"2ce8f955-26cb-4860-afc1-effceac1d7a4\") " pod="openstack/glance-18ee-account-create-update-llcxn" Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.774181 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/30e17cf6-4b43-4ac5-bc0f-0b7850ad04ea-operator-scripts\") pod \"glance-db-create-8bf57\" (UID: \"30e17cf6-4b43-4ac5-bc0f-0b7850ad04ea\") " pod="openstack/glance-db-create-8bf57" Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.774205 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdgx7\" (UniqueName: \"kubernetes.io/projected/30e17cf6-4b43-4ac5-bc0f-0b7850ad04ea-kube-api-access-fdgx7\") pod \"glance-db-create-8bf57\" (UID: \"30e17cf6-4b43-4ac5-bc0f-0b7850ad04ea\") " pod="openstack/glance-db-create-8bf57" Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.774222 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9cm6g\" (UniqueName: \"kubernetes.io/projected/2ce8f955-26cb-4860-afc1-effceac1d7a4-kube-api-access-9cm6g\") pod \"glance-18ee-account-create-update-llcxn\" (UID: \"2ce8f955-26cb-4860-afc1-effceac1d7a4\") " pod="openstack/glance-18ee-account-create-update-llcxn" Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.775001 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/30e17cf6-4b43-4ac5-bc0f-0b7850ad04ea-operator-scripts\") pod 
\"glance-db-create-8bf57\" (UID: \"30e17cf6-4b43-4ac5-bc0f-0b7850ad04ea\") " pod="openstack/glance-db-create-8bf57" Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.794828 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdgx7\" (UniqueName: \"kubernetes.io/projected/30e17cf6-4b43-4ac5-bc0f-0b7850ad04ea-kube-api-access-fdgx7\") pod \"glance-db-create-8bf57\" (UID: \"30e17cf6-4b43-4ac5-bc0f-0b7850ad04ea\") " pod="openstack/glance-db-create-8bf57" Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.811174 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7f896c8c65-g2pwb" Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.854604 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-fb33-account-create-update-2nkdm"] Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.857993 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-8bf57" Jan 20 18:48:05 crc kubenswrapper[4773]: W0120 18:48:05.873283 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc6e707f5_41a8_43c6_976a_7a9645c0b0ca.slice/crio-c35c88e6b8f82028f890aa0f52430d327b2c4c64fd2b524a32b59197f9562841 WatchSource:0}: Error finding container c35c88e6b8f82028f890aa0f52430d327b2c4c64fd2b524a32b59197f9562841: Status 404 returned error can't find the container with id c35c88e6b8f82028f890aa0f52430d327b2c4c64fd2b524a32b59197f9562841 Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.875485 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ce796025-4e2f-439c-9fab-20c8295a792c-operator-scripts\") pod \"ce796025-4e2f-439c-9fab-20c8295a792c\" (UID: \"ce796025-4e2f-439c-9fab-20c8295a792c\") " Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.875583 
4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5s2gq\" (UniqueName: \"kubernetes.io/projected/ce796025-4e2f-439c-9fab-20c8295a792c-kube-api-access-5s2gq\") pod \"ce796025-4e2f-439c-9fab-20c8295a792c\" (UID: \"ce796025-4e2f-439c-9fab-20c8295a792c\") " Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.875952 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9cm6g\" (UniqueName: \"kubernetes.io/projected/2ce8f955-26cb-4860-afc1-effceac1d7a4-kube-api-access-9cm6g\") pod \"glance-18ee-account-create-update-llcxn\" (UID: \"2ce8f955-26cb-4860-afc1-effceac1d7a4\") " pod="openstack/glance-18ee-account-create-update-llcxn" Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.876032 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2ce8f955-26cb-4860-afc1-effceac1d7a4-operator-scripts\") pod \"glance-18ee-account-create-update-llcxn\" (UID: \"2ce8f955-26cb-4860-afc1-effceac1d7a4\") " pod="openstack/glance-18ee-account-create-update-llcxn" Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.876366 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce796025-4e2f-439c-9fab-20c8295a792c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ce796025-4e2f-439c-9fab-20c8295a792c" (UID: "ce796025-4e2f-439c-9fab-20c8295a792c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.880692 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce796025-4e2f-439c-9fab-20c8295a792c-kube-api-access-5s2gq" (OuterVolumeSpecName: "kube-api-access-5s2gq") pod "ce796025-4e2f-439c-9fab-20c8295a792c" (UID: "ce796025-4e2f-439c-9fab-20c8295a792c"). InnerVolumeSpecName "kube-api-access-5s2gq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.880901 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2ce8f955-26cb-4860-afc1-effceac1d7a4-operator-scripts\") pod \"glance-18ee-account-create-update-llcxn\" (UID: \"2ce8f955-26cb-4860-afc1-effceac1d7a4\") " pod="openstack/glance-18ee-account-create-update-llcxn" Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.899506 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9cm6g\" (UniqueName: \"kubernetes.io/projected/2ce8f955-26cb-4860-afc1-effceac1d7a4-kube-api-access-9cm6g\") pod \"glance-18ee-account-create-update-llcxn\" (UID: \"2ce8f955-26cb-4860-afc1-effceac1d7a4\") " pod="openstack/glance-18ee-account-create-update-llcxn" Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.925197 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-18ee-account-create-update-llcxn" Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.977993 4773 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ce796025-4e2f-439c-9fab-20c8295a792c-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 18:48:05 crc kubenswrapper[4773]: I0120 18:48:05.978020 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5s2gq\" (UniqueName: \"kubernetes.io/projected/ce796025-4e2f-439c-9fab-20c8295a792c-kube-api-access-5s2gq\") on node \"crc\" DevicePath \"\"" Jan 20 18:48:06 crc kubenswrapper[4773]: I0120 18:48:06.039240 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-86db49b7ff-k2gpg" Jan 20 18:48:06 crc kubenswrapper[4773]: I0120 18:48:06.099827 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-g2pwb"] Jan 20 18:48:06 crc 
kubenswrapper[4773]: I0120 18:48:06.157218 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-8pd22"] Jan 20 18:48:06 crc kubenswrapper[4773]: W0120 18:48:06.168819 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0c64cf4d_562e_4a78_a22b_d682436d5db3.slice/crio-e1ec142f5c3ab6fc72698265a83e114f22a335151038c0a5d6cead32470d2827 WatchSource:0}: Error finding container e1ec142f5c3ab6fc72698265a83e114f22a335151038c0a5d6cead32470d2827: Status 404 returned error can't find the container with id e1ec142f5c3ab6fc72698265a83e114f22a335151038c0a5d6cead32470d2827 Jan 20 18:48:06 crc kubenswrapper[4773]: I0120 18:48:06.196048 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-f756-account-create-update-tlxkm"] Jan 20 18:48:06 crc kubenswrapper[4773]: I0120 18:48:06.285172 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-dnds2" event={"ID":"ce796025-4e2f-439c-9fab-20c8295a792c","Type":"ContainerDied","Data":"b80da81735f34173a89a48a7d319907472e0d78062e31cce94b57e8d8226a307"} Jan 20 18:48:06 crc kubenswrapper[4773]: I0120 18:48:06.285514 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b80da81735f34173a89a48a7d319907472e0d78062e31cce94b57e8d8226a307" Jan 20 18:48:06 crc kubenswrapper[4773]: I0120 18:48:06.285183 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-dnds2" Jan 20 18:48:06 crc kubenswrapper[4773]: I0120 18:48:06.290670 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-fb33-account-create-update-2nkdm" event={"ID":"c6e707f5-41a8-43c6-976a-7a9645c0b0ca","Type":"ContainerStarted","Data":"bf0e9d7311fa3cf82ace95c60a06b7e7384341e3a3deaf1802d90fe3b93f2f6a"} Jan 20 18:48:06 crc kubenswrapper[4773]: I0120 18:48:06.290719 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-fb33-account-create-update-2nkdm" event={"ID":"c6e707f5-41a8-43c6-976a-7a9645c0b0ca","Type":"ContainerStarted","Data":"c35c88e6b8f82028f890aa0f52430d327b2c4c64fd2b524a32b59197f9562841"} Jan 20 18:48:06 crc kubenswrapper[4773]: I0120 18:48:06.306645 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-fb33-account-create-update-2nkdm" podStartSLOduration=2.306625366 podStartE2EDuration="2.306625366s" podCreationTimestamp="2026-01-20 18:48:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:48:06.305913319 +0000 UTC m=+1079.227726343" watchObservedRunningTime="2026-01-20 18:48:06.306625366 +0000 UTC m=+1079.228438390" Jan 20 18:48:06 crc kubenswrapper[4773]: I0120 18:48:06.309263 4773 generic.go:334] "Generic (PLEG): container finished" podID="ba7d65a8-afba-4e1a-a7e6-b9483c97fdcb" containerID="e203a211f93f27ff720239411e192a2a8202e1fdc890ba783dd23386fddbb4d9" exitCode=0 Jan 20 18:48:06 crc kubenswrapper[4773]: I0120 18:48:06.309325 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-xjqwr" event={"ID":"ba7d65a8-afba-4e1a-a7e6-b9483c97fdcb","Type":"ContainerDied","Data":"e203a211f93f27ff720239411e192a2a8202e1fdc890ba783dd23386fddbb4d9"} Jan 20 18:48:06 crc kubenswrapper[4773]: I0120 18:48:06.309350 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/keystone-db-create-xjqwr" event={"ID":"ba7d65a8-afba-4e1a-a7e6-b9483c97fdcb","Type":"ContainerStarted","Data":"a987ff03a347605bc29a2925c308ab5d8a186e07da55624ddd1c1c8879b773d9"} Jan 20 18:48:06 crc kubenswrapper[4773]: I0120 18:48:06.310667 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-8pd22" event={"ID":"0c64cf4d-562e-4a78-a22b-d682436d5db3","Type":"ContainerStarted","Data":"e1ec142f5c3ab6fc72698265a83e114f22a335151038c0a5d6cead32470d2827"} Jan 20 18:48:06 crc kubenswrapper[4773]: I0120 18:48:06.311750 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7f896c8c65-g2pwb" podUID="f7c93b98-cee9-4ca4-af53-0a939fece59b" containerName="dnsmasq-dns" containerID="cri-o://d3cfa0f49088280d9d475652e75cba42683f348c0175eab4bc89b947fc53c9ea" gracePeriod=10 Jan 20 18:48:06 crc kubenswrapper[4773]: I0120 18:48:06.311947 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-f756-account-create-update-tlxkm" event={"ID":"484e46fc-ebda-496a-9884-295fcd065e9b","Type":"ContainerStarted","Data":"6c0ddb1490bd747f1bba344e9687839219cf6cfc969f7344d057f8811585e3e5"} Jan 20 18:48:06 crc kubenswrapper[4773]: I0120 18:48:06.389253 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-8bf57"] Jan 20 18:48:06 crc kubenswrapper[4773]: I0120 18:48:06.450834 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-18ee-account-create-update-llcxn"] Jan 20 18:48:06 crc kubenswrapper[4773]: I0120 18:48:06.727871 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-g2pwb" Jan 20 18:48:06 crc kubenswrapper[4773]: I0120 18:48:06.891396 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-588tk\" (UniqueName: \"kubernetes.io/projected/f7c93b98-cee9-4ca4-af53-0a939fece59b-kube-api-access-588tk\") pod \"f7c93b98-cee9-4ca4-af53-0a939fece59b\" (UID: \"f7c93b98-cee9-4ca4-af53-0a939fece59b\") " Jan 20 18:48:06 crc kubenswrapper[4773]: I0120 18:48:06.891673 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7c93b98-cee9-4ca4-af53-0a939fece59b-config\") pod \"f7c93b98-cee9-4ca4-af53-0a939fece59b\" (UID: \"f7c93b98-cee9-4ca4-af53-0a939fece59b\") " Jan 20 18:48:06 crc kubenswrapper[4773]: I0120 18:48:06.891719 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f7c93b98-cee9-4ca4-af53-0a939fece59b-dns-svc\") pod \"f7c93b98-cee9-4ca4-af53-0a939fece59b\" (UID: \"f7c93b98-cee9-4ca4-af53-0a939fece59b\") " Jan 20 18:48:06 crc kubenswrapper[4773]: I0120 18:48:06.891808 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f7c93b98-cee9-4ca4-af53-0a939fece59b-ovsdbserver-sb\") pod \"f7c93b98-cee9-4ca4-af53-0a939fece59b\" (UID: \"f7c93b98-cee9-4ca4-af53-0a939fece59b\") " Jan 20 18:48:06 crc kubenswrapper[4773]: I0120 18:48:06.897053 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7c93b98-cee9-4ca4-af53-0a939fece59b-kube-api-access-588tk" (OuterVolumeSpecName: "kube-api-access-588tk") pod "f7c93b98-cee9-4ca4-af53-0a939fece59b" (UID: "f7c93b98-cee9-4ca4-af53-0a939fece59b"). InnerVolumeSpecName "kube-api-access-588tk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:48:06 crc kubenswrapper[4773]: I0120 18:48:06.930235 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7c93b98-cee9-4ca4-af53-0a939fece59b-config" (OuterVolumeSpecName: "config") pod "f7c93b98-cee9-4ca4-af53-0a939fece59b" (UID: "f7c93b98-cee9-4ca4-af53-0a939fece59b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:48:06 crc kubenswrapper[4773]: I0120 18:48:06.932501 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7c93b98-cee9-4ca4-af53-0a939fece59b-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f7c93b98-cee9-4ca4-af53-0a939fece59b" (UID: "f7c93b98-cee9-4ca4-af53-0a939fece59b"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:48:06 crc kubenswrapper[4773]: I0120 18:48:06.938591 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7c93b98-cee9-4ca4-af53-0a939fece59b-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f7c93b98-cee9-4ca4-af53-0a939fece59b" (UID: "f7c93b98-cee9-4ca4-af53-0a939fece59b"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:48:06 crc kubenswrapper[4773]: I0120 18:48:06.944211 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Jan 20 18:48:06 crc kubenswrapper[4773]: I0120 18:48:06.994037 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-588tk\" (UniqueName: \"kubernetes.io/projected/f7c93b98-cee9-4ca4-af53-0a939fece59b-kube-api-access-588tk\") on node \"crc\" DevicePath \"\"" Jan 20 18:48:06 crc kubenswrapper[4773]: I0120 18:48:06.994080 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7c93b98-cee9-4ca4-af53-0a939fece59b-config\") on node \"crc\" DevicePath \"\"" Jan 20 18:48:06 crc kubenswrapper[4773]: I0120 18:48:06.994089 4773 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f7c93b98-cee9-4ca4-af53-0a939fece59b-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 20 18:48:06 crc kubenswrapper[4773]: I0120 18:48:06.994100 4773 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f7c93b98-cee9-4ca4-af53-0a939fece59b-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 20 18:48:07 crc kubenswrapper[4773]: I0120 18:48:07.321031 4773 generic.go:334] "Generic (PLEG): container finished" podID="2ce8f955-26cb-4860-afc1-effceac1d7a4" containerID="7780978bfbf7070fdc6e2326036f3be82707f66d864695cb582db2f78e403bd9" exitCode=0 Jan 20 18:48:07 crc kubenswrapper[4773]: I0120 18:48:07.321098 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-18ee-account-create-update-llcxn" event={"ID":"2ce8f955-26cb-4860-afc1-effceac1d7a4","Type":"ContainerDied","Data":"7780978bfbf7070fdc6e2326036f3be82707f66d864695cb582db2f78e403bd9"} Jan 20 18:48:07 crc kubenswrapper[4773]: I0120 18:48:07.321125 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-18ee-account-create-update-llcxn" event={"ID":"2ce8f955-26cb-4860-afc1-effceac1d7a4","Type":"ContainerStarted","Data":"203c8b5e2b96079b41f58f9363695e47c90260d8892717c1920fb47fa147685e"} Jan 20 18:48:07 crc kubenswrapper[4773]: I0120 18:48:07.323032 4773 generic.go:334] "Generic (PLEG): container finished" podID="c6e707f5-41a8-43c6-976a-7a9645c0b0ca" containerID="bf0e9d7311fa3cf82ace95c60a06b7e7384341e3a3deaf1802d90fe3b93f2f6a" exitCode=0 Jan 20 18:48:07 crc kubenswrapper[4773]: I0120 18:48:07.323214 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-fb33-account-create-update-2nkdm" event={"ID":"c6e707f5-41a8-43c6-976a-7a9645c0b0ca","Type":"ContainerDied","Data":"bf0e9d7311fa3cf82ace95c60a06b7e7384341e3a3deaf1802d90fe3b93f2f6a"} Jan 20 18:48:07 crc kubenswrapper[4773]: I0120 18:48:07.324996 4773 generic.go:334] "Generic (PLEG): container finished" podID="f7c93b98-cee9-4ca4-af53-0a939fece59b" containerID="d3cfa0f49088280d9d475652e75cba42683f348c0175eab4bc89b947fc53c9ea" exitCode=0 Jan 20 18:48:07 crc kubenswrapper[4773]: I0120 18:48:07.325052 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-g2pwb" event={"ID":"f7c93b98-cee9-4ca4-af53-0a939fece59b","Type":"ContainerDied","Data":"d3cfa0f49088280d9d475652e75cba42683f348c0175eab4bc89b947fc53c9ea"} Jan 20 18:48:07 crc kubenswrapper[4773]: I0120 18:48:07.325077 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f896c8c65-g2pwb" event={"ID":"f7c93b98-cee9-4ca4-af53-0a939fece59b","Type":"ContainerDied","Data":"0ac79f3c39c06ac9f58156f40c2aaee8557c1dbe4bc8966a0b68c584c793d8d2"} Jan 20 18:48:07 crc kubenswrapper[4773]: I0120 18:48:07.325097 4773 scope.go:117] "RemoveContainer" containerID="d3cfa0f49088280d9d475652e75cba42683f348c0175eab4bc89b947fc53c9ea" Jan 20 18:48:07 crc kubenswrapper[4773]: I0120 18:48:07.325235 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f896c8c65-g2pwb" Jan 20 18:48:07 crc kubenswrapper[4773]: I0120 18:48:07.328200 4773 generic.go:334] "Generic (PLEG): container finished" podID="484e46fc-ebda-496a-9884-295fcd065e9b" containerID="c06ffe1452b6a2d6f74722f0b7b71c4f2ce4d5613dc36070b3f5358e09162f2f" exitCode=0 Jan 20 18:48:07 crc kubenswrapper[4773]: I0120 18:48:07.328252 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-f756-account-create-update-tlxkm" event={"ID":"484e46fc-ebda-496a-9884-295fcd065e9b","Type":"ContainerDied","Data":"c06ffe1452b6a2d6f74722f0b7b71c4f2ce4d5613dc36070b3f5358e09162f2f"} Jan 20 18:48:07 crc kubenswrapper[4773]: I0120 18:48:07.330414 4773 generic.go:334] "Generic (PLEG): container finished" podID="0c64cf4d-562e-4a78-a22b-d682436d5db3" containerID="100cf16d899578784656d12cf1cb2bef2afdb76869ab58de6601f1ccfb0932d7" exitCode=0 Jan 20 18:48:07 crc kubenswrapper[4773]: I0120 18:48:07.330466 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-8pd22" event={"ID":"0c64cf4d-562e-4a78-a22b-d682436d5db3","Type":"ContainerDied","Data":"100cf16d899578784656d12cf1cb2bef2afdb76869ab58de6601f1ccfb0932d7"} Jan 20 18:48:07 crc kubenswrapper[4773]: I0120 18:48:07.333924 4773 generic.go:334] "Generic (PLEG): container finished" podID="30e17cf6-4b43-4ac5-bc0f-0b7850ad04ea" containerID="2f81f4ca58be86ce8c8a188542774e148445a3fd02682f00bd51696f895c5fe9" exitCode=0 Jan 20 18:48:07 crc kubenswrapper[4773]: I0120 18:48:07.333965 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-8bf57" event={"ID":"30e17cf6-4b43-4ac5-bc0f-0b7850ad04ea","Type":"ContainerDied","Data":"2f81f4ca58be86ce8c8a188542774e148445a3fd02682f00bd51696f895c5fe9"} Jan 20 18:48:07 crc kubenswrapper[4773]: I0120 18:48:07.334002 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-8bf57" 
event={"ID":"30e17cf6-4b43-4ac5-bc0f-0b7850ad04ea","Type":"ContainerStarted","Data":"cbcc7f45bbe486d6aa7469123e73614cc19e376d6478837580567a98981cf11b"} Jan 20 18:48:07 crc kubenswrapper[4773]: I0120 18:48:07.352259 4773 scope.go:117] "RemoveContainer" containerID="3003e8f2a6f8e43256f25d0dfdf7a30ceb44aa655f3176c78cbaf7efb525a7fb" Jan 20 18:48:07 crc kubenswrapper[4773]: I0120 18:48:07.385246 4773 scope.go:117] "RemoveContainer" containerID="d3cfa0f49088280d9d475652e75cba42683f348c0175eab4bc89b947fc53c9ea" Jan 20 18:48:07 crc kubenswrapper[4773]: E0120 18:48:07.385894 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3cfa0f49088280d9d475652e75cba42683f348c0175eab4bc89b947fc53c9ea\": container with ID starting with d3cfa0f49088280d9d475652e75cba42683f348c0175eab4bc89b947fc53c9ea not found: ID does not exist" containerID="d3cfa0f49088280d9d475652e75cba42683f348c0175eab4bc89b947fc53c9ea" Jan 20 18:48:07 crc kubenswrapper[4773]: I0120 18:48:07.386005 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3cfa0f49088280d9d475652e75cba42683f348c0175eab4bc89b947fc53c9ea"} err="failed to get container status \"d3cfa0f49088280d9d475652e75cba42683f348c0175eab4bc89b947fc53c9ea\": rpc error: code = NotFound desc = could not find container \"d3cfa0f49088280d9d475652e75cba42683f348c0175eab4bc89b947fc53c9ea\": container with ID starting with d3cfa0f49088280d9d475652e75cba42683f348c0175eab4bc89b947fc53c9ea not found: ID does not exist" Jan 20 18:48:07 crc kubenswrapper[4773]: I0120 18:48:07.386072 4773 scope.go:117] "RemoveContainer" containerID="3003e8f2a6f8e43256f25d0dfdf7a30ceb44aa655f3176c78cbaf7efb525a7fb" Jan 20 18:48:07 crc kubenswrapper[4773]: E0120 18:48:07.386620 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"3003e8f2a6f8e43256f25d0dfdf7a30ceb44aa655f3176c78cbaf7efb525a7fb\": container with ID starting with 3003e8f2a6f8e43256f25d0dfdf7a30ceb44aa655f3176c78cbaf7efb525a7fb not found: ID does not exist" containerID="3003e8f2a6f8e43256f25d0dfdf7a30ceb44aa655f3176c78cbaf7efb525a7fb" Jan 20 18:48:07 crc kubenswrapper[4773]: I0120 18:48:07.386647 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3003e8f2a6f8e43256f25d0dfdf7a30ceb44aa655f3176c78cbaf7efb525a7fb"} err="failed to get container status \"3003e8f2a6f8e43256f25d0dfdf7a30ceb44aa655f3176c78cbaf7efb525a7fb\": rpc error: code = NotFound desc = could not find container \"3003e8f2a6f8e43256f25d0dfdf7a30ceb44aa655f3176c78cbaf7efb525a7fb\": container with ID starting with 3003e8f2a6f8e43256f25d0dfdf7a30ceb44aa655f3176c78cbaf7efb525a7fb not found: ID does not exist" Jan 20 18:48:07 crc kubenswrapper[4773]: I0120 18:48:07.422813 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-g2pwb"] Jan 20 18:48:07 crc kubenswrapper[4773]: I0120 18:48:07.429517 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7f896c8c65-g2pwb"] Jan 20 18:48:07 crc kubenswrapper[4773]: I0120 18:48:07.461496 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7c93b98-cee9-4ca4-af53-0a939fece59b" path="/var/lib/kubelet/pods/f7c93b98-cee9-4ca4-af53-0a939fece59b/volumes" Jan 20 18:48:07 crc kubenswrapper[4773]: I0120 18:48:07.636215 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-xjqwr" Jan 20 18:48:07 crc kubenswrapper[4773]: I0120 18:48:07.808370 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qnkqw\" (UniqueName: \"kubernetes.io/projected/ba7d65a8-afba-4e1a-a7e6-b9483c97fdcb-kube-api-access-qnkqw\") pod \"ba7d65a8-afba-4e1a-a7e6-b9483c97fdcb\" (UID: \"ba7d65a8-afba-4e1a-a7e6-b9483c97fdcb\") " Jan 20 18:48:07 crc kubenswrapper[4773]: I0120 18:48:07.808720 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba7d65a8-afba-4e1a-a7e6-b9483c97fdcb-operator-scripts\") pod \"ba7d65a8-afba-4e1a-a7e6-b9483c97fdcb\" (UID: \"ba7d65a8-afba-4e1a-a7e6-b9483c97fdcb\") " Jan 20 18:48:07 crc kubenswrapper[4773]: I0120 18:48:07.809324 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba7d65a8-afba-4e1a-a7e6-b9483c97fdcb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ba7d65a8-afba-4e1a-a7e6-b9483c97fdcb" (UID: "ba7d65a8-afba-4e1a-a7e6-b9483c97fdcb"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:48:07 crc kubenswrapper[4773]: I0120 18:48:07.812187 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba7d65a8-afba-4e1a-a7e6-b9483c97fdcb-kube-api-access-qnkqw" (OuterVolumeSpecName: "kube-api-access-qnkqw") pod "ba7d65a8-afba-4e1a-a7e6-b9483c97fdcb" (UID: "ba7d65a8-afba-4e1a-a7e6-b9483c97fdcb"). InnerVolumeSpecName "kube-api-access-qnkqw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:48:07 crc kubenswrapper[4773]: I0120 18:48:07.910170 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qnkqw\" (UniqueName: \"kubernetes.io/projected/ba7d65a8-afba-4e1a-a7e6-b9483c97fdcb-kube-api-access-qnkqw\") on node \"crc\" DevicePath \"\"" Jan 20 18:48:07 crc kubenswrapper[4773]: I0120 18:48:07.910206 4773 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ba7d65a8-afba-4e1a-a7e6-b9483c97fdcb-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 18:48:08 crc kubenswrapper[4773]: I0120 18:48:08.346833 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-xjqwr" Jan 20 18:48:08 crc kubenswrapper[4773]: I0120 18:48:08.346836 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-xjqwr" event={"ID":"ba7d65a8-afba-4e1a-a7e6-b9483c97fdcb","Type":"ContainerDied","Data":"a987ff03a347605bc29a2925c308ab5d8a186e07da55624ddd1c1c8879b773d9"} Jan 20 18:48:08 crc kubenswrapper[4773]: I0120 18:48:08.346892 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a987ff03a347605bc29a2925c308ab5d8a186e07da55624ddd1c1c8879b773d9" Jan 20 18:48:08 crc kubenswrapper[4773]: I0120 18:48:08.698954 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-18ee-account-create-update-llcxn" Jan 20 18:48:08 crc kubenswrapper[4773]: I0120 18:48:08.826732 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2ce8f955-26cb-4860-afc1-effceac1d7a4-operator-scripts\") pod \"2ce8f955-26cb-4860-afc1-effceac1d7a4\" (UID: \"2ce8f955-26cb-4860-afc1-effceac1d7a4\") " Jan 20 18:48:08 crc kubenswrapper[4773]: I0120 18:48:08.826791 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9cm6g\" (UniqueName: \"kubernetes.io/projected/2ce8f955-26cb-4860-afc1-effceac1d7a4-kube-api-access-9cm6g\") pod \"2ce8f955-26cb-4860-afc1-effceac1d7a4\" (UID: \"2ce8f955-26cb-4860-afc1-effceac1d7a4\") " Jan 20 18:48:08 crc kubenswrapper[4773]: I0120 18:48:08.828088 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ce8f955-26cb-4860-afc1-effceac1d7a4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2ce8f955-26cb-4860-afc1-effceac1d7a4" (UID: "2ce8f955-26cb-4860-afc1-effceac1d7a4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:48:08 crc kubenswrapper[4773]: I0120 18:48:08.832190 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ce8f955-26cb-4860-afc1-effceac1d7a4-kube-api-access-9cm6g" (OuterVolumeSpecName: "kube-api-access-9cm6g") pod "2ce8f955-26cb-4860-afc1-effceac1d7a4" (UID: "2ce8f955-26cb-4860-afc1-effceac1d7a4"). InnerVolumeSpecName "kube-api-access-9cm6g". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:48:08 crc kubenswrapper[4773]: I0120 18:48:08.885952 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-fb33-account-create-update-2nkdm" Jan 20 18:48:08 crc kubenswrapper[4773]: I0120 18:48:08.891235 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-8bf57" Jan 20 18:48:08 crc kubenswrapper[4773]: I0120 18:48:08.907877 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-f756-account-create-update-tlxkm" Jan 20 18:48:08 crc kubenswrapper[4773]: I0120 18:48:08.915614 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-8pd22" Jan 20 18:48:08 crc kubenswrapper[4773]: I0120 18:48:08.929127 4773 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2ce8f955-26cb-4860-afc1-effceac1d7a4-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 18:48:08 crc kubenswrapper[4773]: I0120 18:48:08.929158 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9cm6g\" (UniqueName: \"kubernetes.io/projected/2ce8f955-26cb-4860-afc1-effceac1d7a4-kube-api-access-9cm6g\") on node \"crc\" DevicePath \"\"" Jan 20 18:48:09 crc kubenswrapper[4773]: I0120 18:48:09.029858 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7wl85\" (UniqueName: \"kubernetes.io/projected/c6e707f5-41a8-43c6-976a-7a9645c0b0ca-kube-api-access-7wl85\") pod \"c6e707f5-41a8-43c6-976a-7a9645c0b0ca\" (UID: \"c6e707f5-41a8-43c6-976a-7a9645c0b0ca\") " Jan 20 18:48:09 crc kubenswrapper[4773]: I0120 18:48:09.029922 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/30e17cf6-4b43-4ac5-bc0f-0b7850ad04ea-operator-scripts\") pod \"30e17cf6-4b43-4ac5-bc0f-0b7850ad04ea\" (UID: \"30e17cf6-4b43-4ac5-bc0f-0b7850ad04ea\") " Jan 20 18:48:09 crc kubenswrapper[4773]: I0120 
18:48:09.029983 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fdgx7\" (UniqueName: \"kubernetes.io/projected/30e17cf6-4b43-4ac5-bc0f-0b7850ad04ea-kube-api-access-fdgx7\") pod \"30e17cf6-4b43-4ac5-bc0f-0b7850ad04ea\" (UID: \"30e17cf6-4b43-4ac5-bc0f-0b7850ad04ea\") " Jan 20 18:48:09 crc kubenswrapper[4773]: I0120 18:48:09.030057 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nqzds\" (UniqueName: \"kubernetes.io/projected/0c64cf4d-562e-4a78-a22b-d682436d5db3-kube-api-access-nqzds\") pod \"0c64cf4d-562e-4a78-a22b-d682436d5db3\" (UID: \"0c64cf4d-562e-4a78-a22b-d682436d5db3\") " Jan 20 18:48:09 crc kubenswrapper[4773]: I0120 18:48:09.030092 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/484e46fc-ebda-496a-9884-295fcd065e9b-operator-scripts\") pod \"484e46fc-ebda-496a-9884-295fcd065e9b\" (UID: \"484e46fc-ebda-496a-9884-295fcd065e9b\") " Jan 20 18:48:09 crc kubenswrapper[4773]: I0120 18:48:09.030124 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c64cf4d-562e-4a78-a22b-d682436d5db3-operator-scripts\") pod \"0c64cf4d-562e-4a78-a22b-d682436d5db3\" (UID: \"0c64cf4d-562e-4a78-a22b-d682436d5db3\") " Jan 20 18:48:09 crc kubenswrapper[4773]: I0120 18:48:09.030144 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c6e707f5-41a8-43c6-976a-7a9645c0b0ca-operator-scripts\") pod \"c6e707f5-41a8-43c6-976a-7a9645c0b0ca\" (UID: \"c6e707f5-41a8-43c6-976a-7a9645c0b0ca\") " Jan 20 18:48:09 crc kubenswrapper[4773]: I0120 18:48:09.030186 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p9d9d\" (UniqueName: 
\"kubernetes.io/projected/484e46fc-ebda-496a-9884-295fcd065e9b-kube-api-access-p9d9d\") pod \"484e46fc-ebda-496a-9884-295fcd065e9b\" (UID: \"484e46fc-ebda-496a-9884-295fcd065e9b\") " Jan 20 18:48:09 crc kubenswrapper[4773]: I0120 18:48:09.030826 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/484e46fc-ebda-496a-9884-295fcd065e9b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "484e46fc-ebda-496a-9884-295fcd065e9b" (UID: "484e46fc-ebda-496a-9884-295fcd065e9b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:48:09 crc kubenswrapper[4773]: I0120 18:48:09.030844 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/30e17cf6-4b43-4ac5-bc0f-0b7850ad04ea-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "30e17cf6-4b43-4ac5-bc0f-0b7850ad04ea" (UID: "30e17cf6-4b43-4ac5-bc0f-0b7850ad04ea"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:48:09 crc kubenswrapper[4773]: I0120 18:48:09.031081 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c64cf4d-562e-4a78-a22b-d682436d5db3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0c64cf4d-562e-4a78-a22b-d682436d5db3" (UID: "0c64cf4d-562e-4a78-a22b-d682436d5db3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:48:09 crc kubenswrapper[4773]: I0120 18:48:09.031217 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6e707f5-41a8-43c6-976a-7a9645c0b0ca-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c6e707f5-41a8-43c6-976a-7a9645c0b0ca" (UID: "c6e707f5-41a8-43c6-976a-7a9645c0b0ca"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:48:09 crc kubenswrapper[4773]: I0120 18:48:09.033593 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c64cf4d-562e-4a78-a22b-d682436d5db3-kube-api-access-nqzds" (OuterVolumeSpecName: "kube-api-access-nqzds") pod "0c64cf4d-562e-4a78-a22b-d682436d5db3" (UID: "0c64cf4d-562e-4a78-a22b-d682436d5db3"). InnerVolumeSpecName "kube-api-access-nqzds". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:48:09 crc kubenswrapper[4773]: I0120 18:48:09.033816 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6e707f5-41a8-43c6-976a-7a9645c0b0ca-kube-api-access-7wl85" (OuterVolumeSpecName: "kube-api-access-7wl85") pod "c6e707f5-41a8-43c6-976a-7a9645c0b0ca" (UID: "c6e707f5-41a8-43c6-976a-7a9645c0b0ca"). InnerVolumeSpecName "kube-api-access-7wl85". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:48:09 crc kubenswrapper[4773]: I0120 18:48:09.034056 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30e17cf6-4b43-4ac5-bc0f-0b7850ad04ea-kube-api-access-fdgx7" (OuterVolumeSpecName: "kube-api-access-fdgx7") pod "30e17cf6-4b43-4ac5-bc0f-0b7850ad04ea" (UID: "30e17cf6-4b43-4ac5-bc0f-0b7850ad04ea"). InnerVolumeSpecName "kube-api-access-fdgx7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:48:09 crc kubenswrapper[4773]: I0120 18:48:09.035714 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/484e46fc-ebda-496a-9884-295fcd065e9b-kube-api-access-p9d9d" (OuterVolumeSpecName: "kube-api-access-p9d9d") pod "484e46fc-ebda-496a-9884-295fcd065e9b" (UID: "484e46fc-ebda-496a-9884-295fcd065e9b"). InnerVolumeSpecName "kube-api-access-p9d9d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:48:09 crc kubenswrapper[4773]: I0120 18:48:09.132722 4773 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0c64cf4d-562e-4a78-a22b-d682436d5db3-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 18:48:09 crc kubenswrapper[4773]: I0120 18:48:09.132800 4773 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c6e707f5-41a8-43c6-976a-7a9645c0b0ca-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 18:48:09 crc kubenswrapper[4773]: I0120 18:48:09.132822 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p9d9d\" (UniqueName: \"kubernetes.io/projected/484e46fc-ebda-496a-9884-295fcd065e9b-kube-api-access-p9d9d\") on node \"crc\" DevicePath \"\"" Jan 20 18:48:09 crc kubenswrapper[4773]: I0120 18:48:09.132843 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7wl85\" (UniqueName: \"kubernetes.io/projected/c6e707f5-41a8-43c6-976a-7a9645c0b0ca-kube-api-access-7wl85\") on node \"crc\" DevicePath \"\"" Jan 20 18:48:09 crc kubenswrapper[4773]: I0120 18:48:09.132907 4773 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/30e17cf6-4b43-4ac5-bc0f-0b7850ad04ea-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 18:48:09 crc kubenswrapper[4773]: I0120 18:48:09.132952 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fdgx7\" (UniqueName: \"kubernetes.io/projected/30e17cf6-4b43-4ac5-bc0f-0b7850ad04ea-kube-api-access-fdgx7\") on node \"crc\" DevicePath \"\"" Jan 20 18:48:09 crc kubenswrapper[4773]: I0120 18:48:09.132970 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nqzds\" (UniqueName: \"kubernetes.io/projected/0c64cf4d-562e-4a78-a22b-d682436d5db3-kube-api-access-nqzds\") on node \"crc\" DevicePath \"\"" 
Jan 20 18:48:09 crc kubenswrapper[4773]: I0120 18:48:09.132986 4773 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/484e46fc-ebda-496a-9884-295fcd065e9b-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 18:48:09 crc kubenswrapper[4773]: I0120 18:48:09.359096 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-fb33-account-create-update-2nkdm" Jan 20 18:48:09 crc kubenswrapper[4773]: I0120 18:48:09.359097 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-fb33-account-create-update-2nkdm" event={"ID":"c6e707f5-41a8-43c6-976a-7a9645c0b0ca","Type":"ContainerDied","Data":"c35c88e6b8f82028f890aa0f52430d327b2c4c64fd2b524a32b59197f9562841"} Jan 20 18:48:09 crc kubenswrapper[4773]: I0120 18:48:09.359379 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c35c88e6b8f82028f890aa0f52430d327b2c4c64fd2b524a32b59197f9562841" Jan 20 18:48:09 crc kubenswrapper[4773]: I0120 18:48:09.361119 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-f756-account-create-update-tlxkm" event={"ID":"484e46fc-ebda-496a-9884-295fcd065e9b","Type":"ContainerDied","Data":"6c0ddb1490bd747f1bba344e9687839219cf6cfc969f7344d057f8811585e3e5"} Jan 20 18:48:09 crc kubenswrapper[4773]: I0120 18:48:09.361170 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6c0ddb1490bd747f1bba344e9687839219cf6cfc969f7344d057f8811585e3e5" Jan 20 18:48:09 crc kubenswrapper[4773]: I0120 18:48:09.361327 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-f756-account-create-update-tlxkm" Jan 20 18:48:09 crc kubenswrapper[4773]: I0120 18:48:09.365685 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-8pd22" event={"ID":"0c64cf4d-562e-4a78-a22b-d682436d5db3","Type":"ContainerDied","Data":"e1ec142f5c3ab6fc72698265a83e114f22a335151038c0a5d6cead32470d2827"} Jan 20 18:48:09 crc kubenswrapper[4773]: I0120 18:48:09.365718 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e1ec142f5c3ab6fc72698265a83e114f22a335151038c0a5d6cead32470d2827" Jan 20 18:48:09 crc kubenswrapper[4773]: I0120 18:48:09.365772 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-8pd22" Jan 20 18:48:09 crc kubenswrapper[4773]: I0120 18:48:09.375001 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-8bf57" event={"ID":"30e17cf6-4b43-4ac5-bc0f-0b7850ad04ea","Type":"ContainerDied","Data":"cbcc7f45bbe486d6aa7469123e73614cc19e376d6478837580567a98981cf11b"} Jan 20 18:48:09 crc kubenswrapper[4773]: I0120 18:48:09.375041 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cbcc7f45bbe486d6aa7469123e73614cc19e376d6478837580567a98981cf11b" Jan 20 18:48:09 crc kubenswrapper[4773]: I0120 18:48:09.375047 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-8bf57" Jan 20 18:48:09 crc kubenswrapper[4773]: I0120 18:48:09.376261 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-18ee-account-create-update-llcxn" event={"ID":"2ce8f955-26cb-4860-afc1-effceac1d7a4","Type":"ContainerDied","Data":"203c8b5e2b96079b41f58f9363695e47c90260d8892717c1920fb47fa147685e"} Jan 20 18:48:09 crc kubenswrapper[4773]: I0120 18:48:09.376296 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="203c8b5e2b96079b41f58f9363695e47c90260d8892717c1920fb47fa147685e" Jan 20 18:48:09 crc kubenswrapper[4773]: I0120 18:48:09.376357 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-18ee-account-create-update-llcxn" Jan 20 18:48:10 crc kubenswrapper[4773]: I0120 18:48:10.860473 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-29z4h"] Jan 20 18:48:10 crc kubenswrapper[4773]: E0120 18:48:10.861114 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7c93b98-cee9-4ca4-af53-0a939fece59b" containerName="init" Jan 20 18:48:10 crc kubenswrapper[4773]: I0120 18:48:10.861128 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7c93b98-cee9-4ca4-af53-0a939fece59b" containerName="init" Jan 20 18:48:10 crc kubenswrapper[4773]: E0120 18:48:10.861137 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="30e17cf6-4b43-4ac5-bc0f-0b7850ad04ea" containerName="mariadb-database-create" Jan 20 18:48:10 crc kubenswrapper[4773]: I0120 18:48:10.861144 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="30e17cf6-4b43-4ac5-bc0f-0b7850ad04ea" containerName="mariadb-database-create" Jan 20 18:48:10 crc kubenswrapper[4773]: E0120 18:48:10.861168 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ce796025-4e2f-439c-9fab-20c8295a792c" containerName="mariadb-account-create-update" Jan 20 18:48:10 crc 
kubenswrapper[4773]: I0120 18:48:10.861175 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce796025-4e2f-439c-9fab-20c8295a792c" containerName="mariadb-account-create-update" Jan 20 18:48:10 crc kubenswrapper[4773]: E0120 18:48:10.861199 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba7d65a8-afba-4e1a-a7e6-b9483c97fdcb" containerName="mariadb-database-create" Jan 20 18:48:10 crc kubenswrapper[4773]: I0120 18:48:10.861205 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba7d65a8-afba-4e1a-a7e6-b9483c97fdcb" containerName="mariadb-database-create" Jan 20 18:48:10 crc kubenswrapper[4773]: E0120 18:48:10.861215 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="484e46fc-ebda-496a-9884-295fcd065e9b" containerName="mariadb-account-create-update" Jan 20 18:48:10 crc kubenswrapper[4773]: I0120 18:48:10.861221 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="484e46fc-ebda-496a-9884-295fcd065e9b" containerName="mariadb-account-create-update" Jan 20 18:48:10 crc kubenswrapper[4773]: E0120 18:48:10.861228 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ce8f955-26cb-4860-afc1-effceac1d7a4" containerName="mariadb-account-create-update" Jan 20 18:48:10 crc kubenswrapper[4773]: I0120 18:48:10.861234 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ce8f955-26cb-4860-afc1-effceac1d7a4" containerName="mariadb-account-create-update" Jan 20 18:48:10 crc kubenswrapper[4773]: E0120 18:48:10.861249 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6e707f5-41a8-43c6-976a-7a9645c0b0ca" containerName="mariadb-account-create-update" Jan 20 18:48:10 crc kubenswrapper[4773]: I0120 18:48:10.861255 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6e707f5-41a8-43c6-976a-7a9645c0b0ca" containerName="mariadb-account-create-update" Jan 20 18:48:10 crc kubenswrapper[4773]: E0120 18:48:10.861270 4773 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="f7c93b98-cee9-4ca4-af53-0a939fece59b" containerName="dnsmasq-dns" Jan 20 18:48:10 crc kubenswrapper[4773]: I0120 18:48:10.861275 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7c93b98-cee9-4ca4-af53-0a939fece59b" containerName="dnsmasq-dns" Jan 20 18:48:10 crc kubenswrapper[4773]: E0120 18:48:10.861285 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0c64cf4d-562e-4a78-a22b-d682436d5db3" containerName="mariadb-database-create" Jan 20 18:48:10 crc kubenswrapper[4773]: I0120 18:48:10.861290 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c64cf4d-562e-4a78-a22b-d682436d5db3" containerName="mariadb-database-create" Jan 20 18:48:10 crc kubenswrapper[4773]: I0120 18:48:10.861433 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7c93b98-cee9-4ca4-af53-0a939fece59b" containerName="dnsmasq-dns" Jan 20 18:48:10 crc kubenswrapper[4773]: I0120 18:48:10.861448 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ce8f955-26cb-4860-afc1-effceac1d7a4" containerName="mariadb-account-create-update" Jan 20 18:48:10 crc kubenswrapper[4773]: I0120 18:48:10.861458 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="484e46fc-ebda-496a-9884-295fcd065e9b" containerName="mariadb-account-create-update" Jan 20 18:48:10 crc kubenswrapper[4773]: I0120 18:48:10.861466 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="30e17cf6-4b43-4ac5-bc0f-0b7850ad04ea" containerName="mariadb-database-create" Jan 20 18:48:10 crc kubenswrapper[4773]: I0120 18:48:10.861474 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6e707f5-41a8-43c6-976a-7a9645c0b0ca" containerName="mariadb-account-create-update" Jan 20 18:48:10 crc kubenswrapper[4773]: I0120 18:48:10.861482 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="ce796025-4e2f-439c-9fab-20c8295a792c" containerName="mariadb-account-create-update" Jan 20 18:48:10 crc 
kubenswrapper[4773]: I0120 18:48:10.861492 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba7d65a8-afba-4e1a-a7e6-b9483c97fdcb" containerName="mariadb-database-create" Jan 20 18:48:10 crc kubenswrapper[4773]: I0120 18:48:10.861504 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="0c64cf4d-562e-4a78-a22b-d682436d5db3" containerName="mariadb-database-create" Jan 20 18:48:10 crc kubenswrapper[4773]: I0120 18:48:10.862034 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-29z4h" Jan 20 18:48:10 crc kubenswrapper[4773]: I0120 18:48:10.864313 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-9vtkh" Jan 20 18:48:10 crc kubenswrapper[4773]: I0120 18:48:10.866049 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Jan 20 18:48:10 crc kubenswrapper[4773]: I0120 18:48:10.869272 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-29z4h"] Jan 20 18:48:10 crc kubenswrapper[4773]: I0120 18:48:10.959551 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa7530e2-53e5-4891-9a0e-ff23ee1c61bc-combined-ca-bundle\") pod \"glance-db-sync-29z4h\" (UID: \"aa7530e2-53e5-4891-9a0e-ff23ee1c61bc\") " pod="openstack/glance-db-sync-29z4h" Jan 20 18:48:10 crc kubenswrapper[4773]: I0120 18:48:10.959606 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa7530e2-53e5-4891-9a0e-ff23ee1c61bc-config-data\") pod \"glance-db-sync-29z4h\" (UID: \"aa7530e2-53e5-4891-9a0e-ff23ee1c61bc\") " pod="openstack/glance-db-sync-29z4h" Jan 20 18:48:10 crc kubenswrapper[4773]: I0120 18:48:10.959625 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-nxckd\" (UniqueName: \"kubernetes.io/projected/aa7530e2-53e5-4891-9a0e-ff23ee1c61bc-kube-api-access-nxckd\") pod \"glance-db-sync-29z4h\" (UID: \"aa7530e2-53e5-4891-9a0e-ff23ee1c61bc\") " pod="openstack/glance-db-sync-29z4h" Jan 20 18:48:10 crc kubenswrapper[4773]: I0120 18:48:10.960010 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/aa7530e2-53e5-4891-9a0e-ff23ee1c61bc-db-sync-config-data\") pod \"glance-db-sync-29z4h\" (UID: \"aa7530e2-53e5-4891-9a0e-ff23ee1c61bc\") " pod="openstack/glance-db-sync-29z4h" Jan 20 18:48:11 crc kubenswrapper[4773]: I0120 18:48:11.061561 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa7530e2-53e5-4891-9a0e-ff23ee1c61bc-config-data\") pod \"glance-db-sync-29z4h\" (UID: \"aa7530e2-53e5-4891-9a0e-ff23ee1c61bc\") " pod="openstack/glance-db-sync-29z4h" Jan 20 18:48:11 crc kubenswrapper[4773]: I0120 18:48:11.061650 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxckd\" (UniqueName: \"kubernetes.io/projected/aa7530e2-53e5-4891-9a0e-ff23ee1c61bc-kube-api-access-nxckd\") pod \"glance-db-sync-29z4h\" (UID: \"aa7530e2-53e5-4891-9a0e-ff23ee1c61bc\") " pod="openstack/glance-db-sync-29z4h" Jan 20 18:48:11 crc kubenswrapper[4773]: I0120 18:48:11.061833 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/aa7530e2-53e5-4891-9a0e-ff23ee1c61bc-db-sync-config-data\") pod \"glance-db-sync-29z4h\" (UID: \"aa7530e2-53e5-4891-9a0e-ff23ee1c61bc\") " pod="openstack/glance-db-sync-29z4h" Jan 20 18:48:11 crc kubenswrapper[4773]: I0120 18:48:11.062194 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/aa7530e2-53e5-4891-9a0e-ff23ee1c61bc-combined-ca-bundle\") pod \"glance-db-sync-29z4h\" (UID: \"aa7530e2-53e5-4891-9a0e-ff23ee1c61bc\") " pod="openstack/glance-db-sync-29z4h" Jan 20 18:48:11 crc kubenswrapper[4773]: I0120 18:48:11.065881 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/aa7530e2-53e5-4891-9a0e-ff23ee1c61bc-db-sync-config-data\") pod \"glance-db-sync-29z4h\" (UID: \"aa7530e2-53e5-4891-9a0e-ff23ee1c61bc\") " pod="openstack/glance-db-sync-29z4h" Jan 20 18:48:11 crc kubenswrapper[4773]: I0120 18:48:11.068498 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa7530e2-53e5-4891-9a0e-ff23ee1c61bc-config-data\") pod \"glance-db-sync-29z4h\" (UID: \"aa7530e2-53e5-4891-9a0e-ff23ee1c61bc\") " pod="openstack/glance-db-sync-29z4h" Jan 20 18:48:11 crc kubenswrapper[4773]: I0120 18:48:11.078782 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa7530e2-53e5-4891-9a0e-ff23ee1c61bc-combined-ca-bundle\") pod \"glance-db-sync-29z4h\" (UID: \"aa7530e2-53e5-4891-9a0e-ff23ee1c61bc\") " pod="openstack/glance-db-sync-29z4h" Jan 20 18:48:11 crc kubenswrapper[4773]: I0120 18:48:11.083573 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxckd\" (UniqueName: \"kubernetes.io/projected/aa7530e2-53e5-4891-9a0e-ff23ee1c61bc-kube-api-access-nxckd\") pod \"glance-db-sync-29z4h\" (UID: \"aa7530e2-53e5-4891-9a0e-ff23ee1c61bc\") " pod="openstack/glance-db-sync-29z4h" Jan 20 18:48:11 crc kubenswrapper[4773]: I0120 18:48:11.180295 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-29z4h" Jan 20 18:48:11 crc kubenswrapper[4773]: I0120 18:48:11.658940 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-29z4h"] Jan 20 18:48:11 crc kubenswrapper[4773]: W0120 18:48:11.672233 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaa7530e2_53e5_4891_9a0e_ff23ee1c61bc.slice/crio-dbc468a45d9d73356dc4552388c461ed7acca337b6f3cf6b0fd22751fd3315c5 WatchSource:0}: Error finding container dbc468a45d9d73356dc4552388c461ed7acca337b6f3cf6b0fd22751fd3315c5: Status 404 returned error can't find the container with id dbc468a45d9d73356dc4552388c461ed7acca337b6f3cf6b0fd22751fd3315c5 Jan 20 18:48:12 crc kubenswrapper[4773]: I0120 18:48:12.396989 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-29z4h" event={"ID":"aa7530e2-53e5-4891-9a0e-ff23ee1c61bc","Type":"ContainerStarted","Data":"dbc468a45d9d73356dc4552388c461ed7acca337b6f3cf6b0fd22751fd3315c5"} Jan 20 18:48:12 crc kubenswrapper[4773]: I0120 18:48:12.927997 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Jan 20 18:48:13 crc kubenswrapper[4773]: I0120 18:48:13.534373 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-dnds2"] Jan 20 18:48:13 crc kubenswrapper[4773]: I0120 18:48:13.540746 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-dnds2"] Jan 20 18:48:13 crc kubenswrapper[4773]: I0120 18:48:13.613041 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-j46db"] Jan 20 18:48:13 crc kubenswrapper[4773]: I0120 18:48:13.614886 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-j46db" Jan 20 18:48:13 crc kubenswrapper[4773]: I0120 18:48:13.618175 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Jan 20 18:48:13 crc kubenswrapper[4773]: I0120 18:48:13.622048 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-j46db"] Jan 20 18:48:13 crc kubenswrapper[4773]: I0120 18:48:13.712146 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f5455e9-7072-4154-b881-75a1da2c0466-operator-scripts\") pod \"root-account-create-update-j46db\" (UID: \"7f5455e9-7072-4154-b881-75a1da2c0466\") " pod="openstack/root-account-create-update-j46db" Jan 20 18:48:13 crc kubenswrapper[4773]: I0120 18:48:13.712221 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fq7x\" (UniqueName: \"kubernetes.io/projected/7f5455e9-7072-4154-b881-75a1da2c0466-kube-api-access-6fq7x\") pod \"root-account-create-update-j46db\" (UID: \"7f5455e9-7072-4154-b881-75a1da2c0466\") " pod="openstack/root-account-create-update-j46db" Jan 20 18:48:13 crc kubenswrapper[4773]: I0120 18:48:13.814092 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f5455e9-7072-4154-b881-75a1da2c0466-operator-scripts\") pod \"root-account-create-update-j46db\" (UID: \"7f5455e9-7072-4154-b881-75a1da2c0466\") " pod="openstack/root-account-create-update-j46db" Jan 20 18:48:13 crc kubenswrapper[4773]: I0120 18:48:13.814173 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6fq7x\" (UniqueName: \"kubernetes.io/projected/7f5455e9-7072-4154-b881-75a1da2c0466-kube-api-access-6fq7x\") pod \"root-account-create-update-j46db\" (UID: 
\"7f5455e9-7072-4154-b881-75a1da2c0466\") " pod="openstack/root-account-create-update-j46db" Jan 20 18:48:13 crc kubenswrapper[4773]: I0120 18:48:13.814974 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f5455e9-7072-4154-b881-75a1da2c0466-operator-scripts\") pod \"root-account-create-update-j46db\" (UID: \"7f5455e9-7072-4154-b881-75a1da2c0466\") " pod="openstack/root-account-create-update-j46db" Jan 20 18:48:13 crc kubenswrapper[4773]: I0120 18:48:13.832743 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fq7x\" (UniqueName: \"kubernetes.io/projected/7f5455e9-7072-4154-b881-75a1da2c0466-kube-api-access-6fq7x\") pod \"root-account-create-update-j46db\" (UID: \"7f5455e9-7072-4154-b881-75a1da2c0466\") " pod="openstack/root-account-create-update-j46db" Jan 20 18:48:13 crc kubenswrapper[4773]: I0120 18:48:13.933598 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-j46db" Jan 20 18:48:14 crc kubenswrapper[4773]: I0120 18:48:14.342006 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-j46db"] Jan 20 18:48:14 crc kubenswrapper[4773]: I0120 18:48:14.414531 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-j46db" event={"ID":"7f5455e9-7072-4154-b881-75a1da2c0466","Type":"ContainerStarted","Data":"428521eea85e77dc9f56997d37b1956f83674b7ba962c44955ed01dddfe8cfbe"} Jan 20 18:48:15 crc kubenswrapper[4773]: I0120 18:48:15.421801 4773 generic.go:334] "Generic (PLEG): container finished" podID="7f5455e9-7072-4154-b881-75a1da2c0466" containerID="7ac50ea7174c7d5687e310f246288e88881d9a99f2dc7333211966358f9e13de" exitCode=0 Jan 20 18:48:15 crc kubenswrapper[4773]: I0120 18:48:15.421969 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-j46db" event={"ID":"7f5455e9-7072-4154-b881-75a1da2c0466","Type":"ContainerDied","Data":"7ac50ea7174c7d5687e310f246288e88881d9a99f2dc7333211966358f9e13de"} Jan 20 18:48:15 crc kubenswrapper[4773]: I0120 18:48:15.455738 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce796025-4e2f-439c-9fab-20c8295a792c" path="/var/lib/kubelet/pods/ce796025-4e2f-439c-9fab-20c8295a792c/volumes" Jan 20 18:48:21 crc kubenswrapper[4773]: I0120 18:48:21.187100 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-t5h8j" podUID="2fce4eb9-f614-4050-a099-0a743695dcd9" containerName="ovn-controller" probeResult="failure" output=< Jan 20 18:48:21 crc kubenswrapper[4773]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Jan 20 18:48:21 crc kubenswrapper[4773]: > Jan 20 18:48:21 crc kubenswrapper[4773]: I0120 18:48:21.487464 4773 generic.go:334] "Generic (PLEG): container finished" podID="d4dfff97-df7d-498f-9203-9c2cb0d84667" 
containerID="1248d781571a617de802bfa819cacf5f6c074177291583d629258a80d6ae6c5f" exitCode=0 Jan 20 18:48:21 crc kubenswrapper[4773]: I0120 18:48:21.487509 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d4dfff97-df7d-498f-9203-9c2cb0d84667","Type":"ContainerDied","Data":"1248d781571a617de802bfa819cacf5f6c074177291583d629258a80d6ae6c5f"} Jan 20 18:48:21 crc kubenswrapper[4773]: I0120 18:48:21.492134 4773 generic.go:334] "Generic (PLEG): container finished" podID="b357137a-6e30-4ed9-a440-c9f3e90f75d8" containerID="582308e76cf42821e4ae7402e4d5fe864dee8caf26b6d7f6b99985263eaa82fb" exitCode=0 Jan 20 18:48:21 crc kubenswrapper[4773]: I0120 18:48:21.492179 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b357137a-6e30-4ed9-a440-c9f3e90f75d8","Type":"ContainerDied","Data":"582308e76cf42821e4ae7402e4d5fe864dee8caf26b6d7f6b99985263eaa82fb"} Jan 20 18:48:22 crc kubenswrapper[4773]: I0120 18:48:22.360710 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-j46db" Jan 20 18:48:22 crc kubenswrapper[4773]: I0120 18:48:22.458773 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f5455e9-7072-4154-b881-75a1da2c0466-operator-scripts\") pod \"7f5455e9-7072-4154-b881-75a1da2c0466\" (UID: \"7f5455e9-7072-4154-b881-75a1da2c0466\") " Jan 20 18:48:22 crc kubenswrapper[4773]: I0120 18:48:22.458840 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6fq7x\" (UniqueName: \"kubernetes.io/projected/7f5455e9-7072-4154-b881-75a1da2c0466-kube-api-access-6fq7x\") pod \"7f5455e9-7072-4154-b881-75a1da2c0466\" (UID: \"7f5455e9-7072-4154-b881-75a1da2c0466\") " Jan 20 18:48:22 crc kubenswrapper[4773]: I0120 18:48:22.459568 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f5455e9-7072-4154-b881-75a1da2c0466-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7f5455e9-7072-4154-b881-75a1da2c0466" (UID: "7f5455e9-7072-4154-b881-75a1da2c0466"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:48:22 crc kubenswrapper[4773]: I0120 18:48:22.462474 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f5455e9-7072-4154-b881-75a1da2c0466-kube-api-access-6fq7x" (OuterVolumeSpecName: "kube-api-access-6fq7x") pod "7f5455e9-7072-4154-b881-75a1da2c0466" (UID: "7f5455e9-7072-4154-b881-75a1da2c0466"). InnerVolumeSpecName "kube-api-access-6fq7x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:48:22 crc kubenswrapper[4773]: I0120 18:48:22.499799 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-j46db" event={"ID":"7f5455e9-7072-4154-b881-75a1da2c0466","Type":"ContainerDied","Data":"428521eea85e77dc9f56997d37b1956f83674b7ba962c44955ed01dddfe8cfbe"} Jan 20 18:48:22 crc kubenswrapper[4773]: I0120 18:48:22.499836 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="428521eea85e77dc9f56997d37b1956f83674b7ba962c44955ed01dddfe8cfbe" Jan 20 18:48:22 crc kubenswrapper[4773]: I0120 18:48:22.499890 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-j46db" Jan 20 18:48:22 crc kubenswrapper[4773]: I0120 18:48:22.560609 4773 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f5455e9-7072-4154-b881-75a1da2c0466-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 18:48:22 crc kubenswrapper[4773]: I0120 18:48:22.560644 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6fq7x\" (UniqueName: \"kubernetes.io/projected/7f5455e9-7072-4154-b881-75a1da2c0466-kube-api-access-6fq7x\") on node \"crc\" DevicePath \"\"" Jan 20 18:48:23 crc kubenswrapper[4773]: I0120 18:48:23.508493 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-29z4h" event={"ID":"aa7530e2-53e5-4891-9a0e-ff23ee1c61bc","Type":"ContainerStarted","Data":"80a1ac29f0f0bb537005b053ab8c7c220780e4401eea32b234572cee7902d616"} Jan 20 18:48:23 crc kubenswrapper[4773]: I0120 18:48:23.512341 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d4dfff97-df7d-498f-9203-9c2cb0d84667","Type":"ContainerStarted","Data":"c766f55dc2323fb20d95a48fb3c2f0f4589d63fc9f828e80df4feaa4aee53647"} Jan 20 18:48:23 crc kubenswrapper[4773]: I0120 
18:48:23.512613 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Jan 20 18:48:23 crc kubenswrapper[4773]: I0120 18:48:23.514902 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b357137a-6e30-4ed9-a440-c9f3e90f75d8","Type":"ContainerStarted","Data":"7ff480cee767ffce07ac3630dc103e93b87e75d3f6036d288b00e3da958c2897"} Jan 20 18:48:23 crc kubenswrapper[4773]: I0120 18:48:23.515196 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:48:23 crc kubenswrapper[4773]: I0120 18:48:23.527758 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-29z4h" podStartSLOduration=2.776939087 podStartE2EDuration="13.527738697s" podCreationTimestamp="2026-01-20 18:48:10 +0000 UTC" firstStartedPulling="2026-01-20 18:48:11.673777718 +0000 UTC m=+1084.595590752" lastFinishedPulling="2026-01-20 18:48:22.424577338 +0000 UTC m=+1095.346390362" observedRunningTime="2026-01-20 18:48:23.524633813 +0000 UTC m=+1096.446446847" watchObservedRunningTime="2026-01-20 18:48:23.527738697 +0000 UTC m=+1096.449551721" Jan 20 18:48:23 crc kubenswrapper[4773]: I0120 18:48:23.562725 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=40.483973342 podStartE2EDuration="1m13.5627014s" podCreationTimestamp="2026-01-20 18:47:10 +0000 UTC" firstStartedPulling="2026-01-20 18:47:13.34618256 +0000 UTC m=+1026.267995584" lastFinishedPulling="2026-01-20 18:47:46.424910618 +0000 UTC m=+1059.346723642" observedRunningTime="2026-01-20 18:48:23.549705186 +0000 UTC m=+1096.471518211" watchObservedRunningTime="2026-01-20 18:48:23.5627014 +0000 UTC m=+1096.484514434" Jan 20 18:48:23 crc kubenswrapper[4773]: I0120 18:48:23.591209 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=40.986218316 podStartE2EDuration="1m13.591193206s" podCreationTimestamp="2026-01-20 18:47:10 +0000 UTC" firstStartedPulling="2026-01-20 18:47:13.001888801 +0000 UTC m=+1025.923701825" lastFinishedPulling="2026-01-20 18:47:45.606863691 +0000 UTC m=+1058.528676715" observedRunningTime="2026-01-20 18:48:23.574225537 +0000 UTC m=+1096.496038581" watchObservedRunningTime="2026-01-20 18:48:23.591193206 +0000 UTC m=+1096.513006230" Jan 20 18:48:26 crc kubenswrapper[4773]: I0120 18:48:26.188611 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-t5h8j" podUID="2fce4eb9-f614-4050-a099-0a743695dcd9" containerName="ovn-controller" probeResult="failure" output=< Jan 20 18:48:26 crc kubenswrapper[4773]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Jan 20 18:48:26 crc kubenswrapper[4773]: > Jan 20 18:48:26 crc kubenswrapper[4773]: I0120 18:48:26.209151 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-5gcvm" Jan 20 18:48:26 crc kubenswrapper[4773]: I0120 18:48:26.210792 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-5gcvm" Jan 20 18:48:26 crc kubenswrapper[4773]: I0120 18:48:26.453441 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-t5h8j-config-47pnf"] Jan 20 18:48:26 crc kubenswrapper[4773]: E0120 18:48:26.453847 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f5455e9-7072-4154-b881-75a1da2c0466" containerName="mariadb-account-create-update" Jan 20 18:48:26 crc kubenswrapper[4773]: I0120 18:48:26.453873 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f5455e9-7072-4154-b881-75a1da2c0466" containerName="mariadb-account-create-update" Jan 20 18:48:26 crc kubenswrapper[4773]: I0120 18:48:26.454082 4773 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="7f5455e9-7072-4154-b881-75a1da2c0466" containerName="mariadb-account-create-update" Jan 20 18:48:26 crc kubenswrapper[4773]: I0120 18:48:26.454663 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-t5h8j-config-47pnf" Jan 20 18:48:26 crc kubenswrapper[4773]: I0120 18:48:26.456612 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Jan 20 18:48:26 crc kubenswrapper[4773]: I0120 18:48:26.469393 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-t5h8j-config-47pnf"] Jan 20 18:48:26 crc kubenswrapper[4773]: I0120 18:48:26.536300 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/53283907-9a5d-4568-9db6-bce4357ad6a4-var-log-ovn\") pod \"ovn-controller-t5h8j-config-47pnf\" (UID: \"53283907-9a5d-4568-9db6-bce4357ad6a4\") " pod="openstack/ovn-controller-t5h8j-config-47pnf" Jan 20 18:48:26 crc kubenswrapper[4773]: I0120 18:48:26.536409 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/53283907-9a5d-4568-9db6-bce4357ad6a4-var-run-ovn\") pod \"ovn-controller-t5h8j-config-47pnf\" (UID: \"53283907-9a5d-4568-9db6-bce4357ad6a4\") " pod="openstack/ovn-controller-t5h8j-config-47pnf" Jan 20 18:48:26 crc kubenswrapper[4773]: I0120 18:48:26.536440 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/53283907-9a5d-4568-9db6-bce4357ad6a4-scripts\") pod \"ovn-controller-t5h8j-config-47pnf\" (UID: \"53283907-9a5d-4568-9db6-bce4357ad6a4\") " pod="openstack/ovn-controller-t5h8j-config-47pnf" Jan 20 18:48:26 crc kubenswrapper[4773]: I0120 18:48:26.536465 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-5bs7l\" (UniqueName: \"kubernetes.io/projected/53283907-9a5d-4568-9db6-bce4357ad6a4-kube-api-access-5bs7l\") pod \"ovn-controller-t5h8j-config-47pnf\" (UID: \"53283907-9a5d-4568-9db6-bce4357ad6a4\") " pod="openstack/ovn-controller-t5h8j-config-47pnf" Jan 20 18:48:26 crc kubenswrapper[4773]: I0120 18:48:26.536592 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/53283907-9a5d-4568-9db6-bce4357ad6a4-var-run\") pod \"ovn-controller-t5h8j-config-47pnf\" (UID: \"53283907-9a5d-4568-9db6-bce4357ad6a4\") " pod="openstack/ovn-controller-t5h8j-config-47pnf" Jan 20 18:48:26 crc kubenswrapper[4773]: I0120 18:48:26.536616 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/53283907-9a5d-4568-9db6-bce4357ad6a4-additional-scripts\") pod \"ovn-controller-t5h8j-config-47pnf\" (UID: \"53283907-9a5d-4568-9db6-bce4357ad6a4\") " pod="openstack/ovn-controller-t5h8j-config-47pnf" Jan 20 18:48:26 crc kubenswrapper[4773]: I0120 18:48:26.638524 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/53283907-9a5d-4568-9db6-bce4357ad6a4-var-run-ovn\") pod \"ovn-controller-t5h8j-config-47pnf\" (UID: \"53283907-9a5d-4568-9db6-bce4357ad6a4\") " pod="openstack/ovn-controller-t5h8j-config-47pnf" Jan 20 18:48:26 crc kubenswrapper[4773]: I0120 18:48:26.638572 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/53283907-9a5d-4568-9db6-bce4357ad6a4-scripts\") pod \"ovn-controller-t5h8j-config-47pnf\" (UID: \"53283907-9a5d-4568-9db6-bce4357ad6a4\") " pod="openstack/ovn-controller-t5h8j-config-47pnf" Jan 20 18:48:26 crc kubenswrapper[4773]: I0120 18:48:26.638591 4773 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-5bs7l\" (UniqueName: \"kubernetes.io/projected/53283907-9a5d-4568-9db6-bce4357ad6a4-kube-api-access-5bs7l\") pod \"ovn-controller-t5h8j-config-47pnf\" (UID: \"53283907-9a5d-4568-9db6-bce4357ad6a4\") " pod="openstack/ovn-controller-t5h8j-config-47pnf" Jan 20 18:48:26 crc kubenswrapper[4773]: I0120 18:48:26.638635 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/53283907-9a5d-4568-9db6-bce4357ad6a4-var-run\") pod \"ovn-controller-t5h8j-config-47pnf\" (UID: \"53283907-9a5d-4568-9db6-bce4357ad6a4\") " pod="openstack/ovn-controller-t5h8j-config-47pnf" Jan 20 18:48:26 crc kubenswrapper[4773]: I0120 18:48:26.638653 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/53283907-9a5d-4568-9db6-bce4357ad6a4-additional-scripts\") pod \"ovn-controller-t5h8j-config-47pnf\" (UID: \"53283907-9a5d-4568-9db6-bce4357ad6a4\") " pod="openstack/ovn-controller-t5h8j-config-47pnf" Jan 20 18:48:26 crc kubenswrapper[4773]: I0120 18:48:26.638694 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/53283907-9a5d-4568-9db6-bce4357ad6a4-var-log-ovn\") pod \"ovn-controller-t5h8j-config-47pnf\" (UID: \"53283907-9a5d-4568-9db6-bce4357ad6a4\") " pod="openstack/ovn-controller-t5h8j-config-47pnf" Jan 20 18:48:26 crc kubenswrapper[4773]: I0120 18:48:26.638984 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/53283907-9a5d-4568-9db6-bce4357ad6a4-var-log-ovn\") pod \"ovn-controller-t5h8j-config-47pnf\" (UID: \"53283907-9a5d-4568-9db6-bce4357ad6a4\") " pod="openstack/ovn-controller-t5h8j-config-47pnf" Jan 20 18:48:26 crc kubenswrapper[4773]: I0120 18:48:26.639043 4773 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/53283907-9a5d-4568-9db6-bce4357ad6a4-var-run-ovn\") pod \"ovn-controller-t5h8j-config-47pnf\" (UID: \"53283907-9a5d-4568-9db6-bce4357ad6a4\") " pod="openstack/ovn-controller-t5h8j-config-47pnf" Jan 20 18:48:26 crc kubenswrapper[4773]: I0120 18:48:26.639077 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/53283907-9a5d-4568-9db6-bce4357ad6a4-var-run\") pod \"ovn-controller-t5h8j-config-47pnf\" (UID: \"53283907-9a5d-4568-9db6-bce4357ad6a4\") " pod="openstack/ovn-controller-t5h8j-config-47pnf" Jan 20 18:48:26 crc kubenswrapper[4773]: I0120 18:48:26.640026 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/53283907-9a5d-4568-9db6-bce4357ad6a4-additional-scripts\") pod \"ovn-controller-t5h8j-config-47pnf\" (UID: \"53283907-9a5d-4568-9db6-bce4357ad6a4\") " pod="openstack/ovn-controller-t5h8j-config-47pnf" Jan 20 18:48:26 crc kubenswrapper[4773]: I0120 18:48:26.641029 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/53283907-9a5d-4568-9db6-bce4357ad6a4-scripts\") pod \"ovn-controller-t5h8j-config-47pnf\" (UID: \"53283907-9a5d-4568-9db6-bce4357ad6a4\") " pod="openstack/ovn-controller-t5h8j-config-47pnf" Jan 20 18:48:26 crc kubenswrapper[4773]: I0120 18:48:26.659811 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bs7l\" (UniqueName: \"kubernetes.io/projected/53283907-9a5d-4568-9db6-bce4357ad6a4-kube-api-access-5bs7l\") pod \"ovn-controller-t5h8j-config-47pnf\" (UID: \"53283907-9a5d-4568-9db6-bce4357ad6a4\") " pod="openstack/ovn-controller-t5h8j-config-47pnf" Jan 20 18:48:26 crc kubenswrapper[4773]: I0120 18:48:26.774828 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-t5h8j-config-47pnf" Jan 20 18:48:27 crc kubenswrapper[4773]: W0120 18:48:27.020225 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod53283907_9a5d_4568_9db6_bce4357ad6a4.slice/crio-bedf19bea214a6e33f9e7283840fe30769ac34ca9b55fac39be5a973b29e2379 WatchSource:0}: Error finding container bedf19bea214a6e33f9e7283840fe30769ac34ca9b55fac39be5a973b29e2379: Status 404 returned error can't find the container with id bedf19bea214a6e33f9e7283840fe30769ac34ca9b55fac39be5a973b29e2379 Jan 20 18:48:27 crc kubenswrapper[4773]: I0120 18:48:27.030538 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-t5h8j-config-47pnf"] Jan 20 18:48:27 crc kubenswrapper[4773]: I0120 18:48:27.545888 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-t5h8j-config-47pnf" event={"ID":"53283907-9a5d-4568-9db6-bce4357ad6a4","Type":"ContainerStarted","Data":"ad751c184e2d23886c2618122268c91712c2e78962210fabf377095ff3332826"} Jan 20 18:48:27 crc kubenswrapper[4773]: I0120 18:48:27.547070 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-t5h8j-config-47pnf" event={"ID":"53283907-9a5d-4568-9db6-bce4357ad6a4","Type":"ContainerStarted","Data":"bedf19bea214a6e33f9e7283840fe30769ac34ca9b55fac39be5a973b29e2379"} Jan 20 18:48:27 crc kubenswrapper[4773]: I0120 18:48:27.563856 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-t5h8j-config-47pnf" podStartSLOduration=1.563834946 podStartE2EDuration="1.563834946s" podCreationTimestamp="2026-01-20 18:48:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:48:27.558429697 +0000 UTC m=+1100.480242721" watchObservedRunningTime="2026-01-20 18:48:27.563834946 +0000 UTC m=+1100.485647970" 
Jan 20 18:48:28 crc kubenswrapper[4773]: I0120 18:48:28.554217 4773 generic.go:334] "Generic (PLEG): container finished" podID="53283907-9a5d-4568-9db6-bce4357ad6a4" containerID="ad751c184e2d23886c2618122268c91712c2e78962210fabf377095ff3332826" exitCode=0 Jan 20 18:48:28 crc kubenswrapper[4773]: I0120 18:48:28.554312 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-t5h8j-config-47pnf" event={"ID":"53283907-9a5d-4568-9db6-bce4357ad6a4","Type":"ContainerDied","Data":"ad751c184e2d23886c2618122268c91712c2e78962210fabf377095ff3332826"} Jan 20 18:48:29 crc kubenswrapper[4773]: I0120 18:48:29.864165 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-t5h8j-config-47pnf" Jan 20 18:48:29 crc kubenswrapper[4773]: I0120 18:48:29.990132 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/53283907-9a5d-4568-9db6-bce4357ad6a4-var-run-ovn\") pod \"53283907-9a5d-4568-9db6-bce4357ad6a4\" (UID: \"53283907-9a5d-4568-9db6-bce4357ad6a4\") " Jan 20 18:48:29 crc kubenswrapper[4773]: I0120 18:48:29.990239 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/53283907-9a5d-4568-9db6-bce4357ad6a4-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "53283907-9a5d-4568-9db6-bce4357ad6a4" (UID: "53283907-9a5d-4568-9db6-bce4357ad6a4"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 18:48:29 crc kubenswrapper[4773]: I0120 18:48:29.990267 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/53283907-9a5d-4568-9db6-bce4357ad6a4-var-run\") pod \"53283907-9a5d-4568-9db6-bce4357ad6a4\" (UID: \"53283907-9a5d-4568-9db6-bce4357ad6a4\") " Jan 20 18:48:29 crc kubenswrapper[4773]: I0120 18:48:29.990318 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/53283907-9a5d-4568-9db6-bce4357ad6a4-var-run" (OuterVolumeSpecName: "var-run") pod "53283907-9a5d-4568-9db6-bce4357ad6a4" (UID: "53283907-9a5d-4568-9db6-bce4357ad6a4"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 18:48:29 crc kubenswrapper[4773]: I0120 18:48:29.990424 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/53283907-9a5d-4568-9db6-bce4357ad6a4-additional-scripts\") pod \"53283907-9a5d-4568-9db6-bce4357ad6a4\" (UID: \"53283907-9a5d-4568-9db6-bce4357ad6a4\") " Jan 20 18:48:29 crc kubenswrapper[4773]: I0120 18:48:29.990472 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/53283907-9a5d-4568-9db6-bce4357ad6a4-var-log-ovn\") pod \"53283907-9a5d-4568-9db6-bce4357ad6a4\" (UID: \"53283907-9a5d-4568-9db6-bce4357ad6a4\") " Jan 20 18:48:29 crc kubenswrapper[4773]: I0120 18:48:29.990562 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/53283907-9a5d-4568-9db6-bce4357ad6a4-scripts\") pod \"53283907-9a5d-4568-9db6-bce4357ad6a4\" (UID: \"53283907-9a5d-4568-9db6-bce4357ad6a4\") " Jan 20 18:48:29 crc kubenswrapper[4773]: I0120 18:48:29.990592 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"kube-api-access-5bs7l\" (UniqueName: \"kubernetes.io/projected/53283907-9a5d-4568-9db6-bce4357ad6a4-kube-api-access-5bs7l\") pod \"53283907-9a5d-4568-9db6-bce4357ad6a4\" (UID: \"53283907-9a5d-4568-9db6-bce4357ad6a4\") " Jan 20 18:48:29 crc kubenswrapper[4773]: I0120 18:48:29.990593 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/53283907-9a5d-4568-9db6-bce4357ad6a4-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "53283907-9a5d-4568-9db6-bce4357ad6a4" (UID: "53283907-9a5d-4568-9db6-bce4357ad6a4"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 18:48:29 crc kubenswrapper[4773]: I0120 18:48:29.991266 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53283907-9a5d-4568-9db6-bce4357ad6a4-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "53283907-9a5d-4568-9db6-bce4357ad6a4" (UID: "53283907-9a5d-4568-9db6-bce4357ad6a4"). InnerVolumeSpecName "additional-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:48:29 crc kubenswrapper[4773]: I0120 18:48:29.991361 4773 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/53283907-9a5d-4568-9db6-bce4357ad6a4-var-run\") on node \"crc\" DevicePath \"\"" Jan 20 18:48:29 crc kubenswrapper[4773]: I0120 18:48:29.991383 4773 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/53283907-9a5d-4568-9db6-bce4357ad6a4-additional-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 18:48:29 crc kubenswrapper[4773]: I0120 18:48:29.991398 4773 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/53283907-9a5d-4568-9db6-bce4357ad6a4-var-log-ovn\") on node \"crc\" DevicePath \"\"" Jan 20 18:48:29 crc kubenswrapper[4773]: I0120 18:48:29.991408 4773 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/53283907-9a5d-4568-9db6-bce4357ad6a4-var-run-ovn\") on node \"crc\" DevicePath \"\"" Jan 20 18:48:29 crc kubenswrapper[4773]: I0120 18:48:29.991478 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53283907-9a5d-4568-9db6-bce4357ad6a4-scripts" (OuterVolumeSpecName: "scripts") pod "53283907-9a5d-4568-9db6-bce4357ad6a4" (UID: "53283907-9a5d-4568-9db6-bce4357ad6a4"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:48:29 crc kubenswrapper[4773]: I0120 18:48:29.998299 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53283907-9a5d-4568-9db6-bce4357ad6a4-kube-api-access-5bs7l" (OuterVolumeSpecName: "kube-api-access-5bs7l") pod "53283907-9a5d-4568-9db6-bce4357ad6a4" (UID: "53283907-9a5d-4568-9db6-bce4357ad6a4"). InnerVolumeSpecName "kube-api-access-5bs7l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:48:30 crc kubenswrapper[4773]: I0120 18:48:30.093030 4773 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/53283907-9a5d-4568-9db6-bce4357ad6a4-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 18:48:30 crc kubenswrapper[4773]: I0120 18:48:30.093063 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5bs7l\" (UniqueName: \"kubernetes.io/projected/53283907-9a5d-4568-9db6-bce4357ad6a4-kube-api-access-5bs7l\") on node \"crc\" DevicePath \"\"" Jan 20 18:48:30 crc kubenswrapper[4773]: I0120 18:48:30.569711 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-t5h8j-config-47pnf" event={"ID":"53283907-9a5d-4568-9db6-bce4357ad6a4","Type":"ContainerDied","Data":"bedf19bea214a6e33f9e7283840fe30769ac34ca9b55fac39be5a973b29e2379"} Jan 20 18:48:30 crc kubenswrapper[4773]: I0120 18:48:30.569759 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bedf19bea214a6e33f9e7283840fe30769ac34ca9b55fac39be5a973b29e2379" Jan 20 18:48:30 crc kubenswrapper[4773]: I0120 18:48:30.569788 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-t5h8j-config-47pnf" Jan 20 18:48:30 crc kubenswrapper[4773]: I0120 18:48:30.594724 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-t5h8j-config-47pnf"] Jan 20 18:48:30 crc kubenswrapper[4773]: I0120 18:48:30.602059 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-t5h8j-config-47pnf"] Jan 20 18:48:30 crc kubenswrapper[4773]: I0120 18:48:30.693574 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-t5h8j-config-5522t"] Jan 20 18:48:30 crc kubenswrapper[4773]: E0120 18:48:30.693867 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53283907-9a5d-4568-9db6-bce4357ad6a4" containerName="ovn-config" Jan 20 18:48:30 crc kubenswrapper[4773]: I0120 18:48:30.693884 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="53283907-9a5d-4568-9db6-bce4357ad6a4" containerName="ovn-config" Jan 20 18:48:30 crc kubenswrapper[4773]: I0120 18:48:30.694073 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="53283907-9a5d-4568-9db6-bce4357ad6a4" containerName="ovn-config" Jan 20 18:48:30 crc kubenswrapper[4773]: I0120 18:48:30.695083 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-t5h8j-config-5522t" Jan 20 18:48:30 crc kubenswrapper[4773]: I0120 18:48:30.697963 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Jan 20 18:48:30 crc kubenswrapper[4773]: I0120 18:48:30.709323 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-t5h8j-config-5522t"] Jan 20 18:48:30 crc kubenswrapper[4773]: I0120 18:48:30.801400 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/5bb055c2-7ce4-425c-8e65-df8438bde346-additional-scripts\") pod \"ovn-controller-t5h8j-config-5522t\" (UID: \"5bb055c2-7ce4-425c-8e65-df8438bde346\") " pod="openstack/ovn-controller-t5h8j-config-5522t" Jan 20 18:48:30 crc kubenswrapper[4773]: I0120 18:48:30.801515 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwfbs\" (UniqueName: \"kubernetes.io/projected/5bb055c2-7ce4-425c-8e65-df8438bde346-kube-api-access-vwfbs\") pod \"ovn-controller-t5h8j-config-5522t\" (UID: \"5bb055c2-7ce4-425c-8e65-df8438bde346\") " pod="openstack/ovn-controller-t5h8j-config-5522t" Jan 20 18:48:30 crc kubenswrapper[4773]: I0120 18:48:30.801546 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5bb055c2-7ce4-425c-8e65-df8438bde346-scripts\") pod \"ovn-controller-t5h8j-config-5522t\" (UID: \"5bb055c2-7ce4-425c-8e65-df8438bde346\") " pod="openstack/ovn-controller-t5h8j-config-5522t" Jan 20 18:48:30 crc kubenswrapper[4773]: I0120 18:48:30.801607 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5bb055c2-7ce4-425c-8e65-df8438bde346-var-run-ovn\") pod \"ovn-controller-t5h8j-config-5522t\" (UID: 
\"5bb055c2-7ce4-425c-8e65-df8438bde346\") " pod="openstack/ovn-controller-t5h8j-config-5522t" Jan 20 18:48:30 crc kubenswrapper[4773]: I0120 18:48:30.801636 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5bb055c2-7ce4-425c-8e65-df8438bde346-var-run\") pod \"ovn-controller-t5h8j-config-5522t\" (UID: \"5bb055c2-7ce4-425c-8e65-df8438bde346\") " pod="openstack/ovn-controller-t5h8j-config-5522t" Jan 20 18:48:30 crc kubenswrapper[4773]: I0120 18:48:30.801661 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5bb055c2-7ce4-425c-8e65-df8438bde346-var-log-ovn\") pod \"ovn-controller-t5h8j-config-5522t\" (UID: \"5bb055c2-7ce4-425c-8e65-df8438bde346\") " pod="openstack/ovn-controller-t5h8j-config-5522t" Jan 20 18:48:30 crc kubenswrapper[4773]: I0120 18:48:30.903032 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5bb055c2-7ce4-425c-8e65-df8438bde346-var-run\") pod \"ovn-controller-t5h8j-config-5522t\" (UID: \"5bb055c2-7ce4-425c-8e65-df8438bde346\") " pod="openstack/ovn-controller-t5h8j-config-5522t" Jan 20 18:48:30 crc kubenswrapper[4773]: I0120 18:48:30.903092 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5bb055c2-7ce4-425c-8e65-df8438bde346-var-log-ovn\") pod \"ovn-controller-t5h8j-config-5522t\" (UID: \"5bb055c2-7ce4-425c-8e65-df8438bde346\") " pod="openstack/ovn-controller-t5h8j-config-5522t" Jan 20 18:48:30 crc kubenswrapper[4773]: I0120 18:48:30.903141 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/5bb055c2-7ce4-425c-8e65-df8438bde346-additional-scripts\") pod \"ovn-controller-t5h8j-config-5522t\" 
(UID: \"5bb055c2-7ce4-425c-8e65-df8438bde346\") " pod="openstack/ovn-controller-t5h8j-config-5522t" Jan 20 18:48:30 crc kubenswrapper[4773]: I0120 18:48:30.903195 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwfbs\" (UniqueName: \"kubernetes.io/projected/5bb055c2-7ce4-425c-8e65-df8438bde346-kube-api-access-vwfbs\") pod \"ovn-controller-t5h8j-config-5522t\" (UID: \"5bb055c2-7ce4-425c-8e65-df8438bde346\") " pod="openstack/ovn-controller-t5h8j-config-5522t" Jan 20 18:48:30 crc kubenswrapper[4773]: I0120 18:48:30.903218 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5bb055c2-7ce4-425c-8e65-df8438bde346-scripts\") pod \"ovn-controller-t5h8j-config-5522t\" (UID: \"5bb055c2-7ce4-425c-8e65-df8438bde346\") " pod="openstack/ovn-controller-t5h8j-config-5522t" Jan 20 18:48:30 crc kubenswrapper[4773]: I0120 18:48:30.903263 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5bb055c2-7ce4-425c-8e65-df8438bde346-var-run-ovn\") pod \"ovn-controller-t5h8j-config-5522t\" (UID: \"5bb055c2-7ce4-425c-8e65-df8438bde346\") " pod="openstack/ovn-controller-t5h8j-config-5522t" Jan 20 18:48:30 crc kubenswrapper[4773]: I0120 18:48:30.903468 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5bb055c2-7ce4-425c-8e65-df8438bde346-var-run\") pod \"ovn-controller-t5h8j-config-5522t\" (UID: \"5bb055c2-7ce4-425c-8e65-df8438bde346\") " pod="openstack/ovn-controller-t5h8j-config-5522t" Jan 20 18:48:30 crc kubenswrapper[4773]: I0120 18:48:30.903529 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5bb055c2-7ce4-425c-8e65-df8438bde346-var-run-ovn\") pod \"ovn-controller-t5h8j-config-5522t\" (UID: 
\"5bb055c2-7ce4-425c-8e65-df8438bde346\") " pod="openstack/ovn-controller-t5h8j-config-5522t" Jan 20 18:48:30 crc kubenswrapper[4773]: I0120 18:48:30.904372 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/5bb055c2-7ce4-425c-8e65-df8438bde346-additional-scripts\") pod \"ovn-controller-t5h8j-config-5522t\" (UID: \"5bb055c2-7ce4-425c-8e65-df8438bde346\") " pod="openstack/ovn-controller-t5h8j-config-5522t" Jan 20 18:48:30 crc kubenswrapper[4773]: I0120 18:48:30.904442 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5bb055c2-7ce4-425c-8e65-df8438bde346-var-log-ovn\") pod \"ovn-controller-t5h8j-config-5522t\" (UID: \"5bb055c2-7ce4-425c-8e65-df8438bde346\") " pod="openstack/ovn-controller-t5h8j-config-5522t" Jan 20 18:48:30 crc kubenswrapper[4773]: I0120 18:48:30.905650 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5bb055c2-7ce4-425c-8e65-df8438bde346-scripts\") pod \"ovn-controller-t5h8j-config-5522t\" (UID: \"5bb055c2-7ce4-425c-8e65-df8438bde346\") " pod="openstack/ovn-controller-t5h8j-config-5522t" Jan 20 18:48:30 crc kubenswrapper[4773]: I0120 18:48:30.921186 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwfbs\" (UniqueName: \"kubernetes.io/projected/5bb055c2-7ce4-425c-8e65-df8438bde346-kube-api-access-vwfbs\") pod \"ovn-controller-t5h8j-config-5522t\" (UID: \"5bb055c2-7ce4-425c-8e65-df8438bde346\") " pod="openstack/ovn-controller-t5h8j-config-5522t" Jan 20 18:48:31 crc kubenswrapper[4773]: I0120 18:48:31.010293 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-t5h8j-config-5522t" Jan 20 18:48:31 crc kubenswrapper[4773]: I0120 18:48:31.205619 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-t5h8j" Jan 20 18:48:31 crc kubenswrapper[4773]: I0120 18:48:31.465134 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53283907-9a5d-4568-9db6-bce4357ad6a4" path="/var/lib/kubelet/pods/53283907-9a5d-4568-9db6-bce4357ad6a4/volumes" Jan 20 18:48:31 crc kubenswrapper[4773]: I0120 18:48:31.483216 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-t5h8j-config-5522t"] Jan 20 18:48:31 crc kubenswrapper[4773]: I0120 18:48:31.577195 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-t5h8j-config-5522t" event={"ID":"5bb055c2-7ce4-425c-8e65-df8438bde346","Type":"ContainerStarted","Data":"abad885cb9fdc1000de82a7a346e1b71acb34383bdf5e8369f0fed08971ac5b2"} Jan 20 18:48:32 crc kubenswrapper[4773]: I0120 18:48:32.584588 4773 generic.go:334] "Generic (PLEG): container finished" podID="5bb055c2-7ce4-425c-8e65-df8438bde346" containerID="fba20ba934791c753f7de5893c3aaef399510fc1a1206ee1163905e05a43e6b4" exitCode=0 Jan 20 18:48:32 crc kubenswrapper[4773]: I0120 18:48:32.584633 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-t5h8j-config-5522t" event={"ID":"5bb055c2-7ce4-425c-8e65-df8438bde346","Type":"ContainerDied","Data":"fba20ba934791c753f7de5893c3aaef399510fc1a1206ee1163905e05a43e6b4"} Jan 20 18:48:32 crc kubenswrapper[4773]: I0120 18:48:32.619361 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:48:32 crc kubenswrapper[4773]: I0120 18:48:32.629095 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Jan 20 18:48:33 crc kubenswrapper[4773]: I0120 18:48:33.592663 4773 generic.go:334] 
"Generic (PLEG): container finished" podID="aa7530e2-53e5-4891-9a0e-ff23ee1c61bc" containerID="80a1ac29f0f0bb537005b053ab8c7c220780e4401eea32b234572cee7902d616" exitCode=0 Jan 20 18:48:33 crc kubenswrapper[4773]: I0120 18:48:33.592760 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-29z4h" event={"ID":"aa7530e2-53e5-4891-9a0e-ff23ee1c61bc","Type":"ContainerDied","Data":"80a1ac29f0f0bb537005b053ab8c7c220780e4401eea32b234572cee7902d616"} Jan 20 18:48:33 crc kubenswrapper[4773]: I0120 18:48:33.901968 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-t5h8j-config-5522t" Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.072055 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5bb055c2-7ce4-425c-8e65-df8438bde346-var-run\") pod \"5bb055c2-7ce4-425c-8e65-df8438bde346\" (UID: \"5bb055c2-7ce4-425c-8e65-df8438bde346\") " Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.072145 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5bb055c2-7ce4-425c-8e65-df8438bde346-var-run-ovn\") pod \"5bb055c2-7ce4-425c-8e65-df8438bde346\" (UID: \"5bb055c2-7ce4-425c-8e65-df8438bde346\") " Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.072176 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5bb055c2-7ce4-425c-8e65-df8438bde346-var-run" (OuterVolumeSpecName: "var-run") pod "5bb055c2-7ce4-425c-8e65-df8438bde346" (UID: "5bb055c2-7ce4-425c-8e65-df8438bde346"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.072240 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vwfbs\" (UniqueName: \"kubernetes.io/projected/5bb055c2-7ce4-425c-8e65-df8438bde346-kube-api-access-vwfbs\") pod \"5bb055c2-7ce4-425c-8e65-df8438bde346\" (UID: \"5bb055c2-7ce4-425c-8e65-df8438bde346\") " Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.072282 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5bb055c2-7ce4-425c-8e65-df8438bde346-var-log-ovn\") pod \"5bb055c2-7ce4-425c-8e65-df8438bde346\" (UID: \"5bb055c2-7ce4-425c-8e65-df8438bde346\") " Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.072315 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5bb055c2-7ce4-425c-8e65-df8438bde346-scripts\") pod \"5bb055c2-7ce4-425c-8e65-df8438bde346\" (UID: \"5bb055c2-7ce4-425c-8e65-df8438bde346\") " Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.072337 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5bb055c2-7ce4-425c-8e65-df8438bde346-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "5bb055c2-7ce4-425c-8e65-df8438bde346" (UID: "5bb055c2-7ce4-425c-8e65-df8438bde346"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.072364 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/5bb055c2-7ce4-425c-8e65-df8438bde346-additional-scripts\") pod \"5bb055c2-7ce4-425c-8e65-df8438bde346\" (UID: \"5bb055c2-7ce4-425c-8e65-df8438bde346\") " Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.072403 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5bb055c2-7ce4-425c-8e65-df8438bde346-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "5bb055c2-7ce4-425c-8e65-df8438bde346" (UID: "5bb055c2-7ce4-425c-8e65-df8438bde346"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.072760 4773 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5bb055c2-7ce4-425c-8e65-df8438bde346-var-run\") on node \"crc\" DevicePath \"\"" Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.072783 4773 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5bb055c2-7ce4-425c-8e65-df8438bde346-var-run-ovn\") on node \"crc\" DevicePath \"\"" Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.072794 4773 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5bb055c2-7ce4-425c-8e65-df8438bde346-var-log-ovn\") on node \"crc\" DevicePath \"\"" Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.073250 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5bb055c2-7ce4-425c-8e65-df8438bde346-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "5bb055c2-7ce4-425c-8e65-df8438bde346" (UID: "5bb055c2-7ce4-425c-8e65-df8438bde346"). 
InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.073811 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5bb055c2-7ce4-425c-8e65-df8438bde346-scripts" (OuterVolumeSpecName: "scripts") pod "5bb055c2-7ce4-425c-8e65-df8438bde346" (UID: "5bb055c2-7ce4-425c-8e65-df8438bde346"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.080059 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5bb055c2-7ce4-425c-8e65-df8438bde346-kube-api-access-vwfbs" (OuterVolumeSpecName: "kube-api-access-vwfbs") pod "5bb055c2-7ce4-425c-8e65-df8438bde346" (UID: "5bb055c2-7ce4-425c-8e65-df8438bde346"). InnerVolumeSpecName "kube-api-access-vwfbs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.174063 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vwfbs\" (UniqueName: \"kubernetes.io/projected/5bb055c2-7ce4-425c-8e65-df8438bde346-kube-api-access-vwfbs\") on node \"crc\" DevicePath \"\"" Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.174375 4773 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5bb055c2-7ce4-425c-8e65-df8438bde346-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.174385 4773 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/5bb055c2-7ce4-425c-8e65-df8438bde346-additional-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.381179 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-jqhz4"] Jan 20 18:48:34 crc kubenswrapper[4773]: E0120 
18:48:34.381586 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5bb055c2-7ce4-425c-8e65-df8438bde346" containerName="ovn-config" Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.381609 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bb055c2-7ce4-425c-8e65-df8438bde346" containerName="ovn-config" Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.381813 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="5bb055c2-7ce4-425c-8e65-df8438bde346" containerName="ovn-config" Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.382470 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-jqhz4" Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.391814 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-jqhz4"] Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.478865 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be215ecb-8014-4db1-8eac-59f0d3dee870-operator-scripts\") pod \"cinder-db-create-jqhz4\" (UID: \"be215ecb-8014-4db1-8eac-59f0d3dee870\") " pod="openstack/cinder-db-create-jqhz4" Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.479184 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcrjq\" (UniqueName: \"kubernetes.io/projected/be215ecb-8014-4db1-8eac-59f0d3dee870-kube-api-access-hcrjq\") pod \"cinder-db-create-jqhz4\" (UID: \"be215ecb-8014-4db1-8eac-59f0d3dee870\") " pod="openstack/cinder-db-create-jqhz4" Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.494864 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-fldlp"] Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.495850 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-fldlp" Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.518960 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-b6ae-account-create-update-xdwz2"] Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.520246 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-b6ae-account-create-update-xdwz2" Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.525615 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret" Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.529792 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-fldlp"] Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.555663 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-b6ae-account-create-update-xdwz2"] Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.580805 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hcrjq\" (UniqueName: \"kubernetes.io/projected/be215ecb-8014-4db1-8eac-59f0d3dee870-kube-api-access-hcrjq\") pod \"cinder-db-create-jqhz4\" (UID: \"be215ecb-8014-4db1-8eac-59f0d3dee870\") " pod="openstack/cinder-db-create-jqhz4" Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.580887 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be215ecb-8014-4db1-8eac-59f0d3dee870-operator-scripts\") pod \"cinder-db-create-jqhz4\" (UID: \"be215ecb-8014-4db1-8eac-59f0d3dee870\") " pod="openstack/cinder-db-create-jqhz4" Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.580922 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d813dade-efd1-404d-ae3f-ecea71ffb5ee-operator-scripts\") pod 
\"barbican-db-create-fldlp\" (UID: \"d813dade-efd1-404d-ae3f-ecea71ffb5ee\") " pod="openstack/barbican-db-create-fldlp" Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.580975 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lwph\" (UniqueName: \"kubernetes.io/projected/d813dade-efd1-404d-ae3f-ecea71ffb5ee-kube-api-access-2lwph\") pod \"barbican-db-create-fldlp\" (UID: \"d813dade-efd1-404d-ae3f-ecea71ffb5ee\") " pod="openstack/barbican-db-create-fldlp" Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.582029 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be215ecb-8014-4db1-8eac-59f0d3dee870-operator-scripts\") pod \"cinder-db-create-jqhz4\" (UID: \"be215ecb-8014-4db1-8eac-59f0d3dee870\") " pod="openstack/cinder-db-create-jqhz4" Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.616022 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hcrjq\" (UniqueName: \"kubernetes.io/projected/be215ecb-8014-4db1-8eac-59f0d3dee870-kube-api-access-hcrjq\") pod \"cinder-db-create-jqhz4\" (UID: \"be215ecb-8014-4db1-8eac-59f0d3dee870\") " pod="openstack/cinder-db-create-jqhz4" Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.649569 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-58b0-account-create-update-n4bl6"] Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.650641 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-58b0-account-create-update-n4bl6" Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.656224 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret" Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.658046 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-t5h8j-config-5522t" Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.658097 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-t5h8j-config-5522t" event={"ID":"5bb055c2-7ce4-425c-8e65-df8438bde346","Type":"ContainerDied","Data":"abad885cb9fdc1000de82a7a346e1b71acb34383bdf5e8369f0fed08971ac5b2"} Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.658134 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="abad885cb9fdc1000de82a7a346e1b71acb34383bdf5e8369f0fed08971ac5b2" Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.665592 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-58b0-account-create-update-n4bl6"] Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.682697 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b313ef44-3ec0-4e2e-bc88-0187cce26783-operator-scripts\") pod \"barbican-b6ae-account-create-update-xdwz2\" (UID: \"b313ef44-3ec0-4e2e-bc88-0187cce26783\") " pod="openstack/barbican-b6ae-account-create-update-xdwz2" Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.682758 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d813dade-efd1-404d-ae3f-ecea71ffb5ee-operator-scripts\") pod \"barbican-db-create-fldlp\" (UID: \"d813dade-efd1-404d-ae3f-ecea71ffb5ee\") " pod="openstack/barbican-db-create-fldlp" Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.682790 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdx2d\" (UniqueName: \"kubernetes.io/projected/b313ef44-3ec0-4e2e-bc88-0187cce26783-kube-api-access-tdx2d\") pod \"barbican-b6ae-account-create-update-xdwz2\" (UID: \"b313ef44-3ec0-4e2e-bc88-0187cce26783\") " 
pod="openstack/barbican-b6ae-account-create-update-xdwz2" Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.682812 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2lwph\" (UniqueName: \"kubernetes.io/projected/d813dade-efd1-404d-ae3f-ecea71ffb5ee-kube-api-access-2lwph\") pod \"barbican-db-create-fldlp\" (UID: \"d813dade-efd1-404d-ae3f-ecea71ffb5ee\") " pod="openstack/barbican-db-create-fldlp" Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.683669 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d813dade-efd1-404d-ae3f-ecea71ffb5ee-operator-scripts\") pod \"barbican-db-create-fldlp\" (UID: \"d813dade-efd1-404d-ae3f-ecea71ffb5ee\") " pod="openstack/barbican-db-create-fldlp" Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.698906 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-jqhz4" Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.723744 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lwph\" (UniqueName: \"kubernetes.io/projected/d813dade-efd1-404d-ae3f-ecea71ffb5ee-kube-api-access-2lwph\") pod \"barbican-db-create-fldlp\" (UID: \"d813dade-efd1-404d-ae3f-ecea71ffb5ee\") " pod="openstack/barbican-db-create-fldlp" Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.784340 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b313ef44-3ec0-4e2e-bc88-0187cce26783-operator-scripts\") pod \"barbican-b6ae-account-create-update-xdwz2\" (UID: \"b313ef44-3ec0-4e2e-bc88-0187cce26783\") " pod="openstack/barbican-b6ae-account-create-update-xdwz2" Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.784673 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/181581ac-d6d3-4700-bfb7-7179a262a27c-operator-scripts\") pod \"cinder-58b0-account-create-update-n4bl6\" (UID: \"181581ac-d6d3-4700-bfb7-7179a262a27c\") " pod="openstack/cinder-58b0-account-create-update-n4bl6" Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.784733 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdx2d\" (UniqueName: \"kubernetes.io/projected/b313ef44-3ec0-4e2e-bc88-0187cce26783-kube-api-access-tdx2d\") pod \"barbican-b6ae-account-create-update-xdwz2\" (UID: \"b313ef44-3ec0-4e2e-bc88-0187cce26783\") " pod="openstack/barbican-b6ae-account-create-update-xdwz2" Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.784777 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqfjv\" (UniqueName: \"kubernetes.io/projected/181581ac-d6d3-4700-bfb7-7179a262a27c-kube-api-access-jqfjv\") pod \"cinder-58b0-account-create-update-n4bl6\" (UID: \"181581ac-d6d3-4700-bfb7-7179a262a27c\") " pod="openstack/cinder-58b0-account-create-update-n4bl6" Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.785440 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b313ef44-3ec0-4e2e-bc88-0187cce26783-operator-scripts\") pod \"barbican-b6ae-account-create-update-xdwz2\" (UID: \"b313ef44-3ec0-4e2e-bc88-0187cce26783\") " pod="openstack/barbican-b6ae-account-create-update-xdwz2" Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.800908 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-kmlg7"] Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.801900 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-kmlg7" Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.805818 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.806442 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.806553 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-24qqg" Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.806749 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.806761 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-kmlg7"] Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.818219 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-fldlp" Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.824391 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdx2d\" (UniqueName: \"kubernetes.io/projected/b313ef44-3ec0-4e2e-bc88-0187cce26783-kube-api-access-tdx2d\") pod \"barbican-b6ae-account-create-update-xdwz2\" (UID: \"b313ef44-3ec0-4e2e-bc88-0187cce26783\") " pod="openstack/barbican-b6ae-account-create-update-xdwz2" Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.858326 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-b6ae-account-create-update-xdwz2" Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.888401 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4p5kk\" (UniqueName: \"kubernetes.io/projected/49d41a48-da79-4b93-bf84-ab8b94fed1c1-kube-api-access-4p5kk\") pod \"keystone-db-sync-kmlg7\" (UID: \"49d41a48-da79-4b93-bf84-ab8b94fed1c1\") " pod="openstack/keystone-db-sync-kmlg7" Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.888672 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49d41a48-da79-4b93-bf84-ab8b94fed1c1-combined-ca-bundle\") pod \"keystone-db-sync-kmlg7\" (UID: \"49d41a48-da79-4b93-bf84-ab8b94fed1c1\") " pod="openstack/keystone-db-sync-kmlg7" Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.888726 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqfjv\" (UniqueName: \"kubernetes.io/projected/181581ac-d6d3-4700-bfb7-7179a262a27c-kube-api-access-jqfjv\") pod \"cinder-58b0-account-create-update-n4bl6\" (UID: \"181581ac-d6d3-4700-bfb7-7179a262a27c\") " pod="openstack/cinder-58b0-account-create-update-n4bl6" Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.888746 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49d41a48-da79-4b93-bf84-ab8b94fed1c1-config-data\") pod \"keystone-db-sync-kmlg7\" (UID: \"49d41a48-da79-4b93-bf84-ab8b94fed1c1\") " pod="openstack/keystone-db-sync-kmlg7" Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.888816 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/181581ac-d6d3-4700-bfb7-7179a262a27c-operator-scripts\") pod 
\"cinder-58b0-account-create-update-n4bl6\" (UID: \"181581ac-d6d3-4700-bfb7-7179a262a27c\") " pod="openstack/cinder-58b0-account-create-update-n4bl6" Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.889622 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/181581ac-d6d3-4700-bfb7-7179a262a27c-operator-scripts\") pod \"cinder-58b0-account-create-update-n4bl6\" (UID: \"181581ac-d6d3-4700-bfb7-7179a262a27c\") " pod="openstack/cinder-58b0-account-create-update-n4bl6" Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.912576 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-sg7w8"] Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.916337 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-sg7w8" Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.934506 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqfjv\" (UniqueName: \"kubernetes.io/projected/181581ac-d6d3-4700-bfb7-7179a262a27c-kube-api-access-jqfjv\") pod \"cinder-58b0-account-create-update-n4bl6\" (UID: \"181581ac-d6d3-4700-bfb7-7179a262a27c\") " pod="openstack/cinder-58b0-account-create-update-n4bl6" Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.942066 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-sg7w8"] Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.957223 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-5450-account-create-update-m7kr7"] Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.958249 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-5450-account-create-update-m7kr7" Jan 20 18:48:34 crc kubenswrapper[4773]: I0120 18:48:34.962349 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Jan 20 18:48:35 crc kubenswrapper[4773]: I0120 18:48:34.995192 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4p5kk\" (UniqueName: \"kubernetes.io/projected/49d41a48-da79-4b93-bf84-ab8b94fed1c1-kube-api-access-4p5kk\") pod \"keystone-db-sync-kmlg7\" (UID: \"49d41a48-da79-4b93-bf84-ab8b94fed1c1\") " pod="openstack/keystone-db-sync-kmlg7" Jan 20 18:48:35 crc kubenswrapper[4773]: I0120 18:48:34.995565 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49d41a48-da79-4b93-bf84-ab8b94fed1c1-combined-ca-bundle\") pod \"keystone-db-sync-kmlg7\" (UID: \"49d41a48-da79-4b93-bf84-ab8b94fed1c1\") " pod="openstack/keystone-db-sync-kmlg7" Jan 20 18:48:35 crc kubenswrapper[4773]: I0120 18:48:34.995621 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49d41a48-da79-4b93-bf84-ab8b94fed1c1-config-data\") pod \"keystone-db-sync-kmlg7\" (UID: \"49d41a48-da79-4b93-bf84-ab8b94fed1c1\") " pod="openstack/keystone-db-sync-kmlg7" Jan 20 18:48:35 crc kubenswrapper[4773]: I0120 18:48:34.995653 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c9p9r\" (UniqueName: \"kubernetes.io/projected/b742ea09-e1ce-4311-a9bf-7736d3ab235c-kube-api-access-c9p9r\") pod \"neutron-db-create-sg7w8\" (UID: \"b742ea09-e1ce-4311-a9bf-7736d3ab235c\") " pod="openstack/neutron-db-create-sg7w8" Jan 20 18:48:35 crc kubenswrapper[4773]: I0120 18:48:34.995691 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/b742ea09-e1ce-4311-a9bf-7736d3ab235c-operator-scripts\") pod \"neutron-db-create-sg7w8\" (UID: \"b742ea09-e1ce-4311-a9bf-7736d3ab235c\") " pod="openstack/neutron-db-create-sg7w8" Jan 20 18:48:35 crc kubenswrapper[4773]: I0120 18:48:35.020856 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49d41a48-da79-4b93-bf84-ab8b94fed1c1-combined-ca-bundle\") pod \"keystone-db-sync-kmlg7\" (UID: \"49d41a48-da79-4b93-bf84-ab8b94fed1c1\") " pod="openstack/keystone-db-sync-kmlg7" Jan 20 18:48:35 crc kubenswrapper[4773]: I0120 18:48:35.021569 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4p5kk\" (UniqueName: \"kubernetes.io/projected/49d41a48-da79-4b93-bf84-ab8b94fed1c1-kube-api-access-4p5kk\") pod \"keystone-db-sync-kmlg7\" (UID: \"49d41a48-da79-4b93-bf84-ab8b94fed1c1\") " pod="openstack/keystone-db-sync-kmlg7" Jan 20 18:48:35 crc kubenswrapper[4773]: I0120 18:48:35.029388 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5450-account-create-update-m7kr7"] Jan 20 18:48:35 crc kubenswrapper[4773]: I0120 18:48:35.031244 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49d41a48-da79-4b93-bf84-ab8b94fed1c1-config-data\") pod \"keystone-db-sync-kmlg7\" (UID: \"49d41a48-da79-4b93-bf84-ab8b94fed1c1\") " pod="openstack/keystone-db-sync-kmlg7" Jan 20 18:48:35 crc kubenswrapper[4773]: I0120 18:48:35.055295 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-58b0-account-create-update-n4bl6" Jan 20 18:48:35 crc kubenswrapper[4773]: I0120 18:48:35.075543 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-t5h8j-config-5522t"] Jan 20 18:48:35 crc kubenswrapper[4773]: I0120 18:48:35.088341 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-t5h8j-config-5522t"] Jan 20 18:48:35 crc kubenswrapper[4773]: I0120 18:48:35.096748 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c9p9r\" (UniqueName: \"kubernetes.io/projected/b742ea09-e1ce-4311-a9bf-7736d3ab235c-kube-api-access-c9p9r\") pod \"neutron-db-create-sg7w8\" (UID: \"b742ea09-e1ce-4311-a9bf-7736d3ab235c\") " pod="openstack/neutron-db-create-sg7w8" Jan 20 18:48:35 crc kubenswrapper[4773]: I0120 18:48:35.096806 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b2544d2a-4467-4356-9aee-21a75f6efedc-operator-scripts\") pod \"neutron-5450-account-create-update-m7kr7\" (UID: \"b2544d2a-4467-4356-9aee-21a75f6efedc\") " pod="openstack/neutron-5450-account-create-update-m7kr7" Jan 20 18:48:35 crc kubenswrapper[4773]: I0120 18:48:35.096850 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b742ea09-e1ce-4311-a9bf-7736d3ab235c-operator-scripts\") pod \"neutron-db-create-sg7w8\" (UID: \"b742ea09-e1ce-4311-a9bf-7736d3ab235c\") " pod="openstack/neutron-db-create-sg7w8" Jan 20 18:48:35 crc kubenswrapper[4773]: I0120 18:48:35.096979 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmthz\" (UniqueName: \"kubernetes.io/projected/b2544d2a-4467-4356-9aee-21a75f6efedc-kube-api-access-lmthz\") pod \"neutron-5450-account-create-update-m7kr7\" (UID: 
\"b2544d2a-4467-4356-9aee-21a75f6efedc\") " pod="openstack/neutron-5450-account-create-update-m7kr7" Jan 20 18:48:35 crc kubenswrapper[4773]: I0120 18:48:35.097754 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b742ea09-e1ce-4311-a9bf-7736d3ab235c-operator-scripts\") pod \"neutron-db-create-sg7w8\" (UID: \"b742ea09-e1ce-4311-a9bf-7736d3ab235c\") " pod="openstack/neutron-db-create-sg7w8" Jan 20 18:48:35 crc kubenswrapper[4773]: I0120 18:48:35.115751 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c9p9r\" (UniqueName: \"kubernetes.io/projected/b742ea09-e1ce-4311-a9bf-7736d3ab235c-kube-api-access-c9p9r\") pod \"neutron-db-create-sg7w8\" (UID: \"b742ea09-e1ce-4311-a9bf-7736d3ab235c\") " pod="openstack/neutron-db-create-sg7w8" Jan 20 18:48:35 crc kubenswrapper[4773]: I0120 18:48:35.121281 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-kmlg7" Jan 20 18:48:35 crc kubenswrapper[4773]: I0120 18:48:35.198924 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmthz\" (UniqueName: \"kubernetes.io/projected/b2544d2a-4467-4356-9aee-21a75f6efedc-kube-api-access-lmthz\") pod \"neutron-5450-account-create-update-m7kr7\" (UID: \"b2544d2a-4467-4356-9aee-21a75f6efedc\") " pod="openstack/neutron-5450-account-create-update-m7kr7" Jan 20 18:48:35 crc kubenswrapper[4773]: I0120 18:48:35.199370 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b2544d2a-4467-4356-9aee-21a75f6efedc-operator-scripts\") pod \"neutron-5450-account-create-update-m7kr7\" (UID: \"b2544d2a-4467-4356-9aee-21a75f6efedc\") " pod="openstack/neutron-5450-account-create-update-m7kr7" Jan 20 18:48:35 crc kubenswrapper[4773]: I0120 18:48:35.200247 4773 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b2544d2a-4467-4356-9aee-21a75f6efedc-operator-scripts\") pod \"neutron-5450-account-create-update-m7kr7\" (UID: \"b2544d2a-4467-4356-9aee-21a75f6efedc\") " pod="openstack/neutron-5450-account-create-update-m7kr7" Jan 20 18:48:35 crc kubenswrapper[4773]: I0120 18:48:35.227313 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-29z4h" Jan 20 18:48:35 crc kubenswrapper[4773]: I0120 18:48:35.230510 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmthz\" (UniqueName: \"kubernetes.io/projected/b2544d2a-4467-4356-9aee-21a75f6efedc-kube-api-access-lmthz\") pod \"neutron-5450-account-create-update-m7kr7\" (UID: \"b2544d2a-4467-4356-9aee-21a75f6efedc\") " pod="openstack/neutron-5450-account-create-update-m7kr7" Jan 20 18:48:35 crc kubenswrapper[4773]: I0120 18:48:35.300169 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa7530e2-53e5-4891-9a0e-ff23ee1c61bc-config-data\") pod \"aa7530e2-53e5-4891-9a0e-ff23ee1c61bc\" (UID: \"aa7530e2-53e5-4891-9a0e-ff23ee1c61bc\") " Jan 20 18:48:35 crc kubenswrapper[4773]: I0120 18:48:35.300283 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/aa7530e2-53e5-4891-9a0e-ff23ee1c61bc-db-sync-config-data\") pod \"aa7530e2-53e5-4891-9a0e-ff23ee1c61bc\" (UID: \"aa7530e2-53e5-4891-9a0e-ff23ee1c61bc\") " Jan 20 18:48:35 crc kubenswrapper[4773]: I0120 18:48:35.300309 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nxckd\" (UniqueName: \"kubernetes.io/projected/aa7530e2-53e5-4891-9a0e-ff23ee1c61bc-kube-api-access-nxckd\") pod \"aa7530e2-53e5-4891-9a0e-ff23ee1c61bc\" (UID: \"aa7530e2-53e5-4891-9a0e-ff23ee1c61bc\") " Jan 20 18:48:35 crc 
kubenswrapper[4773]: I0120 18:48:35.300404 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa7530e2-53e5-4891-9a0e-ff23ee1c61bc-combined-ca-bundle\") pod \"aa7530e2-53e5-4891-9a0e-ff23ee1c61bc\" (UID: \"aa7530e2-53e5-4891-9a0e-ff23ee1c61bc\") " Jan 20 18:48:35 crc kubenswrapper[4773]: I0120 18:48:35.303079 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-sg7w8" Jan 20 18:48:35 crc kubenswrapper[4773]: I0120 18:48:35.304326 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa7530e2-53e5-4891-9a0e-ff23ee1c61bc-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "aa7530e2-53e5-4891-9a0e-ff23ee1c61bc" (UID: "aa7530e2-53e5-4891-9a0e-ff23ee1c61bc"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:48:35 crc kubenswrapper[4773]: I0120 18:48:35.305659 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aa7530e2-53e5-4891-9a0e-ff23ee1c61bc-kube-api-access-nxckd" (OuterVolumeSpecName: "kube-api-access-nxckd") pod "aa7530e2-53e5-4891-9a0e-ff23ee1c61bc" (UID: "aa7530e2-53e5-4891-9a0e-ff23ee1c61bc"). InnerVolumeSpecName "kube-api-access-nxckd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:48:35 crc kubenswrapper[4773]: I0120 18:48:35.326143 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa7530e2-53e5-4891-9a0e-ff23ee1c61bc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "aa7530e2-53e5-4891-9a0e-ff23ee1c61bc" (UID: "aa7530e2-53e5-4891-9a0e-ff23ee1c61bc"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:48:35 crc kubenswrapper[4773]: I0120 18:48:35.346207 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/aa7530e2-53e5-4891-9a0e-ff23ee1c61bc-config-data" (OuterVolumeSpecName: "config-data") pod "aa7530e2-53e5-4891-9a0e-ff23ee1c61bc" (UID: "aa7530e2-53e5-4891-9a0e-ff23ee1c61bc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:48:35 crc kubenswrapper[4773]: I0120 18:48:35.352604 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5450-account-create-update-m7kr7" Jan 20 18:48:35 crc kubenswrapper[4773]: I0120 18:48:35.368407 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-fldlp"] Jan 20 18:48:35 crc kubenswrapper[4773]: I0120 18:48:35.402107 4773 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/aa7530e2-53e5-4891-9a0e-ff23ee1c61bc-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 18:48:35 crc kubenswrapper[4773]: I0120 18:48:35.402134 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nxckd\" (UniqueName: \"kubernetes.io/projected/aa7530e2-53e5-4891-9a0e-ff23ee1c61bc-kube-api-access-nxckd\") on node \"crc\" DevicePath \"\"" Jan 20 18:48:35 crc kubenswrapper[4773]: I0120 18:48:35.402145 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/aa7530e2-53e5-4891-9a0e-ff23ee1c61bc-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 18:48:35 crc kubenswrapper[4773]: I0120 18:48:35.402154 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/aa7530e2-53e5-4891-9a0e-ff23ee1c61bc-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 18:48:35 crc kubenswrapper[4773]: I0120 18:48:35.424545 4773 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-jqhz4"] Jan 20 18:48:35 crc kubenswrapper[4773]: I0120 18:48:35.474187 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5bb055c2-7ce4-425c-8e65-df8438bde346" path="/var/lib/kubelet/pods/5bb055c2-7ce4-425c-8e65-df8438bde346/volumes" Jan 20 18:48:35 crc kubenswrapper[4773]: I0120 18:48:35.552358 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-b6ae-account-create-update-xdwz2"] Jan 20 18:48:35 crc kubenswrapper[4773]: W0120 18:48:35.564208 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb313ef44_3ec0_4e2e_bc88_0187cce26783.slice/crio-b684ca83a32a3c7b5158c8beaf630afad626d0df20c559544d0c45b56deb8a8c WatchSource:0}: Error finding container b684ca83a32a3c7b5158c8beaf630afad626d0df20c559544d0c45b56deb8a8c: Status 404 returned error can't find the container with id b684ca83a32a3c7b5158c8beaf630afad626d0df20c559544d0c45b56deb8a8c Jan 20 18:48:35 crc kubenswrapper[4773]: I0120 18:48:35.632130 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-58b0-account-create-update-n4bl6"] Jan 20 18:48:35 crc kubenswrapper[4773]: I0120 18:48:35.689003 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-29z4h" Jan 20 18:48:35 crc kubenswrapper[4773]: I0120 18:48:35.689494 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-29z4h" event={"ID":"aa7530e2-53e5-4891-9a0e-ff23ee1c61bc","Type":"ContainerDied","Data":"dbc468a45d9d73356dc4552388c461ed7acca337b6f3cf6b0fd22751fd3315c5"} Jan 20 18:48:35 crc kubenswrapper[4773]: I0120 18:48:35.689524 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dbc468a45d9d73356dc4552388c461ed7acca337b6f3cf6b0fd22751fd3315c5" Jan 20 18:48:35 crc kubenswrapper[4773]: I0120 18:48:35.695468 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-jqhz4" event={"ID":"be215ecb-8014-4db1-8eac-59f0d3dee870","Type":"ContainerStarted","Data":"a38683b5fdc91e39029015853b4b8d0f6b2a61a23721dce3500ed6e1d8bf2c84"} Jan 20 18:48:35 crc kubenswrapper[4773]: I0120 18:48:35.695533 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-jqhz4" event={"ID":"be215ecb-8014-4db1-8eac-59f0d3dee870","Type":"ContainerStarted","Data":"a955e92c98631ae57ff0855cb259b662175e93be87674e4678a228c77bf3a1ff"} Jan 20 18:48:35 crc kubenswrapper[4773]: I0120 18:48:35.698995 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-fldlp" event={"ID":"d813dade-efd1-404d-ae3f-ecea71ffb5ee","Type":"ContainerStarted","Data":"c1e147f31c77c26d0387a0f9416a73c8299ce902514749792047df9c2fed6c5d"} Jan 20 18:48:35 crc kubenswrapper[4773]: I0120 18:48:35.699037 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-fldlp" event={"ID":"d813dade-efd1-404d-ae3f-ecea71ffb5ee","Type":"ContainerStarted","Data":"d4b834509066ebd18893c1c72ca8fe3d7ef9e80cb324adea5f6be02adb6f303d"} Jan 20 18:48:35 crc kubenswrapper[4773]: I0120 18:48:35.701035 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-b6ae-account-create-update-xdwz2" 
event={"ID":"b313ef44-3ec0-4e2e-bc88-0187cce26783","Type":"ContainerStarted","Data":"b684ca83a32a3c7b5158c8beaf630afad626d0df20c559544d0c45b56deb8a8c"} Jan 20 18:48:35 crc kubenswrapper[4773]: I0120 18:48:35.731852 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-5450-account-create-update-m7kr7"] Jan 20 18:48:35 crc kubenswrapper[4773]: I0120 18:48:35.741252 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-create-fldlp" podStartSLOduration=1.7412296710000001 podStartE2EDuration="1.741229671s" podCreationTimestamp="2026-01-20 18:48:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:48:35.718143878 +0000 UTC m=+1108.639956902" watchObservedRunningTime="2026-01-20 18:48:35.741229671 +0000 UTC m=+1108.663042695" Jan 20 18:48:35 crc kubenswrapper[4773]: W0120 18:48:35.750579 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb2544d2a_4467_4356_9aee_21a75f6efedc.slice/crio-d551d18c8d8a96ad1f885be8d937a306c668c2ef005f0cd8b1c8c67a9b64b6f7 WatchSource:0}: Error finding container d551d18c8d8a96ad1f885be8d937a306c668c2ef005f0cd8b1c8c67a9b64b6f7: Status 404 returned error can't find the container with id d551d18c8d8a96ad1f885be8d937a306c668c2ef005f0cd8b1c8c67a9b64b6f7 Jan 20 18:48:35 crc kubenswrapper[4773]: W0120 18:48:35.751237 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod49d41a48_da79_4b93_bf84_ab8b94fed1c1.slice/crio-25c360ea8bbdacc06b094e0aea0702473fdf5418da4b85ceb4b7f209dcf64994 WatchSource:0}: Error finding container 25c360ea8bbdacc06b094e0aea0702473fdf5418da4b85ceb4b7f209dcf64994: Status 404 returned error can't find the container with id 25c360ea8bbdacc06b094e0aea0702473fdf5418da4b85ceb4b7f209dcf64994 Jan 20 18:48:35 
crc kubenswrapper[4773]: I0120 18:48:35.752953 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-jqhz4" podStartSLOduration=1.75291328 podStartE2EDuration="1.75291328s" podCreationTimestamp="2026-01-20 18:48:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:48:35.732695267 +0000 UTC m=+1108.654508291" watchObservedRunningTime="2026-01-20 18:48:35.75291328 +0000 UTC m=+1108.674726304" Jan 20 18:48:35 crc kubenswrapper[4773]: I0120 18:48:35.766181 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-kmlg7"] Jan 20 18:48:35 crc kubenswrapper[4773]: I0120 18:48:35.816374 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-sg7w8"] Jan 20 18:48:36 crc kubenswrapper[4773]: I0120 18:48:36.015839 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-54f9b7b8d9-cgbwn"] Jan 20 18:48:36 crc kubenswrapper[4773]: E0120 18:48:36.016433 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aa7530e2-53e5-4891-9a0e-ff23ee1c61bc" containerName="glance-db-sync" Jan 20 18:48:36 crc kubenswrapper[4773]: I0120 18:48:36.016449 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="aa7530e2-53e5-4891-9a0e-ff23ee1c61bc" containerName="glance-db-sync" Jan 20 18:48:36 crc kubenswrapper[4773]: I0120 18:48:36.016670 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="aa7530e2-53e5-4891-9a0e-ff23ee1c61bc" containerName="glance-db-sync" Jan 20 18:48:36 crc kubenswrapper[4773]: I0120 18:48:36.017437 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-54f9b7b8d9-cgbwn" Jan 20 18:48:36 crc kubenswrapper[4773]: I0120 18:48:36.076817 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54f9b7b8d9-cgbwn"] Jan 20 18:48:36 crc kubenswrapper[4773]: I0120 18:48:36.118260 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d326b299-f619-4c76-9a10-045d77fa9bae-ovsdbserver-sb\") pod \"dnsmasq-dns-54f9b7b8d9-cgbwn\" (UID: \"d326b299-f619-4c76-9a10-045d77fa9bae\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-cgbwn" Jan 20 18:48:36 crc kubenswrapper[4773]: I0120 18:48:36.118326 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d326b299-f619-4c76-9a10-045d77fa9bae-ovsdbserver-nb\") pod \"dnsmasq-dns-54f9b7b8d9-cgbwn\" (UID: \"d326b299-f619-4c76-9a10-045d77fa9bae\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-cgbwn" Jan 20 18:48:36 crc kubenswrapper[4773]: I0120 18:48:36.118374 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d326b299-f619-4c76-9a10-045d77fa9bae-dns-svc\") pod \"dnsmasq-dns-54f9b7b8d9-cgbwn\" (UID: \"d326b299-f619-4c76-9a10-045d77fa9bae\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-cgbwn" Jan 20 18:48:36 crc kubenswrapper[4773]: I0120 18:48:36.118432 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d326b299-f619-4c76-9a10-045d77fa9bae-config\") pod \"dnsmasq-dns-54f9b7b8d9-cgbwn\" (UID: \"d326b299-f619-4c76-9a10-045d77fa9bae\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-cgbwn" Jan 20 18:48:36 crc kubenswrapper[4773]: I0120 18:48:36.118916 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-g5gts\" (UniqueName: \"kubernetes.io/projected/d326b299-f619-4c76-9a10-045d77fa9bae-kube-api-access-g5gts\") pod \"dnsmasq-dns-54f9b7b8d9-cgbwn\" (UID: \"d326b299-f619-4c76-9a10-045d77fa9bae\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-cgbwn" Jan 20 18:48:36 crc kubenswrapper[4773]: I0120 18:48:36.221012 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d326b299-f619-4c76-9a10-045d77fa9bae-ovsdbserver-sb\") pod \"dnsmasq-dns-54f9b7b8d9-cgbwn\" (UID: \"d326b299-f619-4c76-9a10-045d77fa9bae\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-cgbwn" Jan 20 18:48:36 crc kubenswrapper[4773]: I0120 18:48:36.221072 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d326b299-f619-4c76-9a10-045d77fa9bae-ovsdbserver-nb\") pod \"dnsmasq-dns-54f9b7b8d9-cgbwn\" (UID: \"d326b299-f619-4c76-9a10-045d77fa9bae\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-cgbwn" Jan 20 18:48:36 crc kubenswrapper[4773]: I0120 18:48:36.221126 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d326b299-f619-4c76-9a10-045d77fa9bae-dns-svc\") pod \"dnsmasq-dns-54f9b7b8d9-cgbwn\" (UID: \"d326b299-f619-4c76-9a10-045d77fa9bae\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-cgbwn" Jan 20 18:48:36 crc kubenswrapper[4773]: I0120 18:48:36.221162 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d326b299-f619-4c76-9a10-045d77fa9bae-config\") pod \"dnsmasq-dns-54f9b7b8d9-cgbwn\" (UID: \"d326b299-f619-4c76-9a10-045d77fa9bae\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-cgbwn" Jan 20 18:48:36 crc kubenswrapper[4773]: I0120 18:48:36.221227 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5gts\" (UniqueName: 
\"kubernetes.io/projected/d326b299-f619-4c76-9a10-045d77fa9bae-kube-api-access-g5gts\") pod \"dnsmasq-dns-54f9b7b8d9-cgbwn\" (UID: \"d326b299-f619-4c76-9a10-045d77fa9bae\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-cgbwn" Jan 20 18:48:36 crc kubenswrapper[4773]: I0120 18:48:36.222535 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d326b299-f619-4c76-9a10-045d77fa9bae-ovsdbserver-sb\") pod \"dnsmasq-dns-54f9b7b8d9-cgbwn\" (UID: \"d326b299-f619-4c76-9a10-045d77fa9bae\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-cgbwn" Jan 20 18:48:36 crc kubenswrapper[4773]: I0120 18:48:36.222672 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d326b299-f619-4c76-9a10-045d77fa9bae-config\") pod \"dnsmasq-dns-54f9b7b8d9-cgbwn\" (UID: \"d326b299-f619-4c76-9a10-045d77fa9bae\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-cgbwn" Jan 20 18:48:36 crc kubenswrapper[4773]: I0120 18:48:36.223046 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d326b299-f619-4c76-9a10-045d77fa9bae-dns-svc\") pod \"dnsmasq-dns-54f9b7b8d9-cgbwn\" (UID: \"d326b299-f619-4c76-9a10-045d77fa9bae\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-cgbwn" Jan 20 18:48:36 crc kubenswrapper[4773]: I0120 18:48:36.223101 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d326b299-f619-4c76-9a10-045d77fa9bae-ovsdbserver-nb\") pod \"dnsmasq-dns-54f9b7b8d9-cgbwn\" (UID: \"d326b299-f619-4c76-9a10-045d77fa9bae\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-cgbwn" Jan 20 18:48:36 crc kubenswrapper[4773]: I0120 18:48:36.239173 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5gts\" (UniqueName: \"kubernetes.io/projected/d326b299-f619-4c76-9a10-045d77fa9bae-kube-api-access-g5gts\") pod 
\"dnsmasq-dns-54f9b7b8d9-cgbwn\" (UID: \"d326b299-f619-4c76-9a10-045d77fa9bae\") " pod="openstack/dnsmasq-dns-54f9b7b8d9-cgbwn" Jan 20 18:48:36 crc kubenswrapper[4773]: I0120 18:48:36.346756 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54f9b7b8d9-cgbwn" Jan 20 18:48:36 crc kubenswrapper[4773]: I0120 18:48:36.599223 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-54f9b7b8d9-cgbwn"] Jan 20 18:48:36 crc kubenswrapper[4773]: W0120 18:48:36.607012 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd326b299_f619_4c76_9a10_045d77fa9bae.slice/crio-4c54b24834508cb1572ff85a3b08c2169655a4dc02c347b28d18c8e1711adbee WatchSource:0}: Error finding container 4c54b24834508cb1572ff85a3b08c2169655a4dc02c347b28d18c8e1711adbee: Status 404 returned error can't find the container with id 4c54b24834508cb1572ff85a3b08c2169655a4dc02c347b28d18c8e1711adbee Jan 20 18:48:36 crc kubenswrapper[4773]: I0120 18:48:36.717363 4773 generic.go:334] "Generic (PLEG): container finished" podID="181581ac-d6d3-4700-bfb7-7179a262a27c" containerID="59d2c461099d25c608c6562b9d212406fcc710ae864054a0764c29095622613a" exitCode=0 Jan 20 18:48:36 crc kubenswrapper[4773]: I0120 18:48:36.717802 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-58b0-account-create-update-n4bl6" event={"ID":"181581ac-d6d3-4700-bfb7-7179a262a27c","Type":"ContainerDied","Data":"59d2c461099d25c608c6562b9d212406fcc710ae864054a0764c29095622613a"} Jan 20 18:48:36 crc kubenswrapper[4773]: I0120 18:48:36.717828 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-58b0-account-create-update-n4bl6" event={"ID":"181581ac-d6d3-4700-bfb7-7179a262a27c","Type":"ContainerStarted","Data":"4e7e0970af3231a70982a3028b68071ffa3e49fb1a044ec38e898ff030b9ff54"} Jan 20 18:48:36 crc kubenswrapper[4773]: I0120 18:48:36.719228 4773 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-kmlg7" event={"ID":"49d41a48-da79-4b93-bf84-ab8b94fed1c1","Type":"ContainerStarted","Data":"25c360ea8bbdacc06b094e0aea0702473fdf5418da4b85ceb4b7f209dcf64994"} Jan 20 18:48:36 crc kubenswrapper[4773]: I0120 18:48:36.720838 4773 generic.go:334] "Generic (PLEG): container finished" podID="d813dade-efd1-404d-ae3f-ecea71ffb5ee" containerID="c1e147f31c77c26d0387a0f9416a73c8299ce902514749792047df9c2fed6c5d" exitCode=0 Jan 20 18:48:36 crc kubenswrapper[4773]: I0120 18:48:36.720898 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-fldlp" event={"ID":"d813dade-efd1-404d-ae3f-ecea71ffb5ee","Type":"ContainerDied","Data":"c1e147f31c77c26d0387a0f9416a73c8299ce902514749792047df9c2fed6c5d"} Jan 20 18:48:36 crc kubenswrapper[4773]: I0120 18:48:36.723251 4773 generic.go:334] "Generic (PLEG): container finished" podID="b2544d2a-4467-4356-9aee-21a75f6efedc" containerID="17b8b7c8cb845ba0251348f763aac9652f97d99f1d3fb0947416ad8e58f06104" exitCode=0 Jan 20 18:48:36 crc kubenswrapper[4773]: I0120 18:48:36.723294 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5450-account-create-update-m7kr7" event={"ID":"b2544d2a-4467-4356-9aee-21a75f6efedc","Type":"ContainerDied","Data":"17b8b7c8cb845ba0251348f763aac9652f97d99f1d3fb0947416ad8e58f06104"} Jan 20 18:48:36 crc kubenswrapper[4773]: I0120 18:48:36.723349 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5450-account-create-update-m7kr7" event={"ID":"b2544d2a-4467-4356-9aee-21a75f6efedc","Type":"ContainerStarted","Data":"d551d18c8d8a96ad1f885be8d937a306c668c2ef005f0cd8b1c8c67a9b64b6f7"} Jan 20 18:48:36 crc kubenswrapper[4773]: I0120 18:48:36.725089 4773 generic.go:334] "Generic (PLEG): container finished" podID="b313ef44-3ec0-4e2e-bc88-0187cce26783" containerID="3fe035cd5db85387fabc74e84605df50a425cce1a8ad3c3850fcd55fb4b1eaa6" exitCode=0 Jan 20 18:48:36 crc 
kubenswrapper[4773]: I0120 18:48:36.725736 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-b6ae-account-create-update-xdwz2" event={"ID":"b313ef44-3ec0-4e2e-bc88-0187cce26783","Type":"ContainerDied","Data":"3fe035cd5db85387fabc74e84605df50a425cce1a8ad3c3850fcd55fb4b1eaa6"} Jan 20 18:48:36 crc kubenswrapper[4773]: I0120 18:48:36.731134 4773 generic.go:334] "Generic (PLEG): container finished" podID="b742ea09-e1ce-4311-a9bf-7736d3ab235c" containerID="19b2a1461c2e62cae82675b27637cd9300c36d95cb1554d31376193faaa94e3d" exitCode=0 Jan 20 18:48:36 crc kubenswrapper[4773]: I0120 18:48:36.731201 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-sg7w8" event={"ID":"b742ea09-e1ce-4311-a9bf-7736d3ab235c","Type":"ContainerDied","Data":"19b2a1461c2e62cae82675b27637cd9300c36d95cb1554d31376193faaa94e3d"} Jan 20 18:48:36 crc kubenswrapper[4773]: I0120 18:48:36.731230 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-sg7w8" event={"ID":"b742ea09-e1ce-4311-a9bf-7736d3ab235c","Type":"ContainerStarted","Data":"2dcef42d2cba924a7584b09119514eaa3ae2b310ea90a1ebbf7d1a3429ad34aa"} Jan 20 18:48:36 crc kubenswrapper[4773]: I0120 18:48:36.735544 4773 generic.go:334] "Generic (PLEG): container finished" podID="be215ecb-8014-4db1-8eac-59f0d3dee870" containerID="a38683b5fdc91e39029015853b4b8d0f6b2a61a23721dce3500ed6e1d8bf2c84" exitCode=0 Jan 20 18:48:36 crc kubenswrapper[4773]: I0120 18:48:36.735609 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-jqhz4" event={"ID":"be215ecb-8014-4db1-8eac-59f0d3dee870","Type":"ContainerDied","Data":"a38683b5fdc91e39029015853b4b8d0f6b2a61a23721dce3500ed6e1d8bf2c84"} Jan 20 18:48:36 crc kubenswrapper[4773]: I0120 18:48:36.737414 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54f9b7b8d9-cgbwn" 
event={"ID":"d326b299-f619-4c76-9a10-045d77fa9bae","Type":"ContainerStarted","Data":"4c54b24834508cb1572ff85a3b08c2169655a4dc02c347b28d18c8e1711adbee"} Jan 20 18:48:37 crc kubenswrapper[4773]: I0120 18:48:37.750255 4773 generic.go:334] "Generic (PLEG): container finished" podID="d326b299-f619-4c76-9a10-045d77fa9bae" containerID="526b5b595c5084d4d46dcd86dc9c0555e27f3a5f70cfb4f516507eaf64968118" exitCode=0 Jan 20 18:48:37 crc kubenswrapper[4773]: I0120 18:48:37.750355 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54f9b7b8d9-cgbwn" event={"ID":"d326b299-f619-4c76-9a10-045d77fa9bae","Type":"ContainerDied","Data":"526b5b595c5084d4d46dcd86dc9c0555e27f3a5f70cfb4f516507eaf64968118"} Jan 20 18:48:40 crc kubenswrapper[4773]: I0120 18:48:40.617146 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5450-account-create-update-m7kr7" Jan 20 18:48:40 crc kubenswrapper[4773]: I0120 18:48:40.688330 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-sg7w8" Jan 20 18:48:40 crc kubenswrapper[4773]: I0120 18:48:40.695637 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b2544d2a-4467-4356-9aee-21a75f6efedc-operator-scripts\") pod \"b2544d2a-4467-4356-9aee-21a75f6efedc\" (UID: \"b2544d2a-4467-4356-9aee-21a75f6efedc\") " Jan 20 18:48:40 crc kubenswrapper[4773]: I0120 18:48:40.695741 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lmthz\" (UniqueName: \"kubernetes.io/projected/b2544d2a-4467-4356-9aee-21a75f6efedc-kube-api-access-lmthz\") pod \"b2544d2a-4467-4356-9aee-21a75f6efedc\" (UID: \"b2544d2a-4467-4356-9aee-21a75f6efedc\") " Jan 20 18:48:40 crc kubenswrapper[4773]: I0120 18:48:40.699331 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2544d2a-4467-4356-9aee-21a75f6efedc-kube-api-access-lmthz" (OuterVolumeSpecName: "kube-api-access-lmthz") pod "b2544d2a-4467-4356-9aee-21a75f6efedc" (UID: "b2544d2a-4467-4356-9aee-21a75f6efedc"). InnerVolumeSpecName "kube-api-access-lmthz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:48:40 crc kubenswrapper[4773]: I0120 18:48:40.701328 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b2544d2a-4467-4356-9aee-21a75f6efedc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b2544d2a-4467-4356-9aee-21a75f6efedc" (UID: "b2544d2a-4467-4356-9aee-21a75f6efedc"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:48:40 crc kubenswrapper[4773]: I0120 18:48:40.712055 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-fldlp" Jan 20 18:48:40 crc kubenswrapper[4773]: I0120 18:48:40.760688 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-jqhz4" Jan 20 18:48:40 crc kubenswrapper[4773]: I0120 18:48:40.770254 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-b6ae-account-create-update-xdwz2" Jan 20 18:48:40 crc kubenswrapper[4773]: I0120 18:48:40.777872 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-58b0-account-create-update-n4bl6" Jan 20 18:48:40 crc kubenswrapper[4773]: I0120 18:48:40.782012 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-58b0-account-create-update-n4bl6" Jan 20 18:48:40 crc kubenswrapper[4773]: I0120 18:48:40.782154 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-58b0-account-create-update-n4bl6" event={"ID":"181581ac-d6d3-4700-bfb7-7179a262a27c","Type":"ContainerDied","Data":"4e7e0970af3231a70982a3028b68071ffa3e49fb1a044ec38e898ff030b9ff54"} Jan 20 18:48:40 crc kubenswrapper[4773]: I0120 18:48:40.782200 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4e7e0970af3231a70982a3028b68071ffa3e49fb1a044ec38e898ff030b9ff54" Jan 20 18:48:40 crc kubenswrapper[4773]: I0120 18:48:40.783449 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-fldlp" event={"ID":"d813dade-efd1-404d-ae3f-ecea71ffb5ee","Type":"ContainerDied","Data":"d4b834509066ebd18893c1c72ca8fe3d7ef9e80cb324adea5f6be02adb6f303d"} Jan 20 18:48:40 crc kubenswrapper[4773]: I0120 18:48:40.783518 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d4b834509066ebd18893c1c72ca8fe3d7ef9e80cb324adea5f6be02adb6f303d" Jan 20 18:48:40 crc kubenswrapper[4773]: I0120 18:48:40.783492 4773 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-fldlp" Jan 20 18:48:40 crc kubenswrapper[4773]: I0120 18:48:40.784452 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-5450-account-create-update-m7kr7" event={"ID":"b2544d2a-4467-4356-9aee-21a75f6efedc","Type":"ContainerDied","Data":"d551d18c8d8a96ad1f885be8d937a306c668c2ef005f0cd8b1c8c67a9b64b6f7"} Jan 20 18:48:40 crc kubenswrapper[4773]: I0120 18:48:40.784484 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d551d18c8d8a96ad1f885be8d937a306c668c2ef005f0cd8b1c8c67a9b64b6f7" Jan 20 18:48:40 crc kubenswrapper[4773]: I0120 18:48:40.784531 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-5450-account-create-update-m7kr7" Jan 20 18:48:40 crc kubenswrapper[4773]: I0120 18:48:40.786195 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-b6ae-account-create-update-xdwz2" Jan 20 18:48:40 crc kubenswrapper[4773]: I0120 18:48:40.786191 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-b6ae-account-create-update-xdwz2" event={"ID":"b313ef44-3ec0-4e2e-bc88-0187cce26783","Type":"ContainerDied","Data":"b684ca83a32a3c7b5158c8beaf630afad626d0df20c559544d0c45b56deb8a8c"} Jan 20 18:48:40 crc kubenswrapper[4773]: I0120 18:48:40.786257 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b684ca83a32a3c7b5158c8beaf630afad626d0df20c559544d0c45b56deb8a8c" Jan 20 18:48:40 crc kubenswrapper[4773]: I0120 18:48:40.787348 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-sg7w8" event={"ID":"b742ea09-e1ce-4311-a9bf-7736d3ab235c","Type":"ContainerDied","Data":"2dcef42d2cba924a7584b09119514eaa3ae2b310ea90a1ebbf7d1a3429ad34aa"} Jan 20 18:48:40 crc kubenswrapper[4773]: I0120 18:48:40.787384 4773 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2dcef42d2cba924a7584b09119514eaa3ae2b310ea90a1ebbf7d1a3429ad34aa" Jan 20 18:48:40 crc kubenswrapper[4773]: I0120 18:48:40.787392 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-sg7w8" Jan 20 18:48:40 crc kubenswrapper[4773]: I0120 18:48:40.788369 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-jqhz4" event={"ID":"be215ecb-8014-4db1-8eac-59f0d3dee870","Type":"ContainerDied","Data":"a955e92c98631ae57ff0855cb259b662175e93be87674e4678a228c77bf3a1ff"} Jan 20 18:48:40 crc kubenswrapper[4773]: I0120 18:48:40.788388 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-jqhz4" Jan 20 18:48:40 crc kubenswrapper[4773]: I0120 18:48:40.788390 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a955e92c98631ae57ff0855cb259b662175e93be87674e4678a228c77bf3a1ff" Jan 20 18:48:40 crc kubenswrapper[4773]: I0120 18:48:40.801329 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2lwph\" (UniqueName: \"kubernetes.io/projected/d813dade-efd1-404d-ae3f-ecea71ffb5ee-kube-api-access-2lwph\") pod \"d813dade-efd1-404d-ae3f-ecea71ffb5ee\" (UID: \"d813dade-efd1-404d-ae3f-ecea71ffb5ee\") " Jan 20 18:48:40 crc kubenswrapper[4773]: I0120 18:48:40.801412 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be215ecb-8014-4db1-8eac-59f0d3dee870-operator-scripts\") pod \"be215ecb-8014-4db1-8eac-59f0d3dee870\" (UID: \"be215ecb-8014-4db1-8eac-59f0d3dee870\") " Jan 20 18:48:40 crc kubenswrapper[4773]: I0120 18:48:40.801486 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hcrjq\" (UniqueName: 
\"kubernetes.io/projected/be215ecb-8014-4db1-8eac-59f0d3dee870-kube-api-access-hcrjq\") pod \"be215ecb-8014-4db1-8eac-59f0d3dee870\" (UID: \"be215ecb-8014-4db1-8eac-59f0d3dee870\") " Jan 20 18:48:40 crc kubenswrapper[4773]: I0120 18:48:40.801587 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b742ea09-e1ce-4311-a9bf-7736d3ab235c-operator-scripts\") pod \"b742ea09-e1ce-4311-a9bf-7736d3ab235c\" (UID: \"b742ea09-e1ce-4311-a9bf-7736d3ab235c\") " Jan 20 18:48:40 crc kubenswrapper[4773]: I0120 18:48:40.801625 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d813dade-efd1-404d-ae3f-ecea71ffb5ee-operator-scripts\") pod \"d813dade-efd1-404d-ae3f-ecea71ffb5ee\" (UID: \"d813dade-efd1-404d-ae3f-ecea71ffb5ee\") " Jan 20 18:48:40 crc kubenswrapper[4773]: I0120 18:48:40.801710 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c9p9r\" (UniqueName: \"kubernetes.io/projected/b742ea09-e1ce-4311-a9bf-7736d3ab235c-kube-api-access-c9p9r\") pod \"b742ea09-e1ce-4311-a9bf-7736d3ab235c\" (UID: \"b742ea09-e1ce-4311-a9bf-7736d3ab235c\") " Jan 20 18:48:40 crc kubenswrapper[4773]: I0120 18:48:40.802761 4773 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b2544d2a-4467-4356-9aee-21a75f6efedc-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 18:48:40 crc kubenswrapper[4773]: I0120 18:48:40.802803 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lmthz\" (UniqueName: \"kubernetes.io/projected/b2544d2a-4467-4356-9aee-21a75f6efedc-kube-api-access-lmthz\") on node \"crc\" DevicePath \"\"" Jan 20 18:48:40 crc kubenswrapper[4773]: I0120 18:48:40.803184 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/be215ecb-8014-4db1-8eac-59f0d3dee870-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "be215ecb-8014-4db1-8eac-59f0d3dee870" (UID: "be215ecb-8014-4db1-8eac-59f0d3dee870"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:48:40 crc kubenswrapper[4773]: I0120 18:48:40.804034 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b742ea09-e1ce-4311-a9bf-7736d3ab235c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b742ea09-e1ce-4311-a9bf-7736d3ab235c" (UID: "b742ea09-e1ce-4311-a9bf-7736d3ab235c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:48:40 crc kubenswrapper[4773]: I0120 18:48:40.804079 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d813dade-efd1-404d-ae3f-ecea71ffb5ee-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d813dade-efd1-404d-ae3f-ecea71ffb5ee" (UID: "d813dade-efd1-404d-ae3f-ecea71ffb5ee"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:48:40 crc kubenswrapper[4773]: I0120 18:48:40.804584 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d813dade-efd1-404d-ae3f-ecea71ffb5ee-kube-api-access-2lwph" (OuterVolumeSpecName: "kube-api-access-2lwph") pod "d813dade-efd1-404d-ae3f-ecea71ffb5ee" (UID: "d813dade-efd1-404d-ae3f-ecea71ffb5ee"). InnerVolumeSpecName "kube-api-access-2lwph". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:48:40 crc kubenswrapper[4773]: I0120 18:48:40.808944 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b742ea09-e1ce-4311-a9bf-7736d3ab235c-kube-api-access-c9p9r" (OuterVolumeSpecName: "kube-api-access-c9p9r") pod "b742ea09-e1ce-4311-a9bf-7736d3ab235c" (UID: "b742ea09-e1ce-4311-a9bf-7736d3ab235c"). InnerVolumeSpecName "kube-api-access-c9p9r". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:48:40 crc kubenswrapper[4773]: I0120 18:48:40.813281 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/be215ecb-8014-4db1-8eac-59f0d3dee870-kube-api-access-hcrjq" (OuterVolumeSpecName: "kube-api-access-hcrjq") pod "be215ecb-8014-4db1-8eac-59f0d3dee870" (UID: "be215ecb-8014-4db1-8eac-59f0d3dee870"). InnerVolumeSpecName "kube-api-access-hcrjq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:48:40 crc kubenswrapper[4773]: I0120 18:48:40.903666 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/181581ac-d6d3-4700-bfb7-7179a262a27c-operator-scripts\") pod \"181581ac-d6d3-4700-bfb7-7179a262a27c\" (UID: \"181581ac-d6d3-4700-bfb7-7179a262a27c\") " Jan 20 18:48:40 crc kubenswrapper[4773]: I0120 18:48:40.903761 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jqfjv\" (UniqueName: \"kubernetes.io/projected/181581ac-d6d3-4700-bfb7-7179a262a27c-kube-api-access-jqfjv\") pod \"181581ac-d6d3-4700-bfb7-7179a262a27c\" (UID: \"181581ac-d6d3-4700-bfb7-7179a262a27c\") " Jan 20 18:48:40 crc kubenswrapper[4773]: I0120 18:48:40.903846 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b313ef44-3ec0-4e2e-bc88-0187cce26783-operator-scripts\") pod 
\"b313ef44-3ec0-4e2e-bc88-0187cce26783\" (UID: \"b313ef44-3ec0-4e2e-bc88-0187cce26783\") " Jan 20 18:48:40 crc kubenswrapper[4773]: I0120 18:48:40.904000 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tdx2d\" (UniqueName: \"kubernetes.io/projected/b313ef44-3ec0-4e2e-bc88-0187cce26783-kube-api-access-tdx2d\") pod \"b313ef44-3ec0-4e2e-bc88-0187cce26783\" (UID: \"b313ef44-3ec0-4e2e-bc88-0187cce26783\") " Jan 20 18:48:40 crc kubenswrapper[4773]: I0120 18:48:40.904201 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/181581ac-d6d3-4700-bfb7-7179a262a27c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "181581ac-d6d3-4700-bfb7-7179a262a27c" (UID: "181581ac-d6d3-4700-bfb7-7179a262a27c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:48:40 crc kubenswrapper[4773]: I0120 18:48:40.904494 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2lwph\" (UniqueName: \"kubernetes.io/projected/d813dade-efd1-404d-ae3f-ecea71ffb5ee-kube-api-access-2lwph\") on node \"crc\" DevicePath \"\"" Jan 20 18:48:40 crc kubenswrapper[4773]: I0120 18:48:40.904524 4773 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/be215ecb-8014-4db1-8eac-59f0d3dee870-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 18:48:40 crc kubenswrapper[4773]: I0120 18:48:40.904534 4773 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/181581ac-d6d3-4700-bfb7-7179a262a27c-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 18:48:40 crc kubenswrapper[4773]: I0120 18:48:40.904542 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hcrjq\" (UniqueName: \"kubernetes.io/projected/be215ecb-8014-4db1-8eac-59f0d3dee870-kube-api-access-hcrjq\") on node \"crc\" 
DevicePath \"\"" Jan 20 18:48:40 crc kubenswrapper[4773]: I0120 18:48:40.904551 4773 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b742ea09-e1ce-4311-a9bf-7736d3ab235c-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 18:48:40 crc kubenswrapper[4773]: I0120 18:48:40.904559 4773 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d813dade-efd1-404d-ae3f-ecea71ffb5ee-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 18:48:40 crc kubenswrapper[4773]: I0120 18:48:40.904567 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c9p9r\" (UniqueName: \"kubernetes.io/projected/b742ea09-e1ce-4311-a9bf-7736d3ab235c-kube-api-access-c9p9r\") on node \"crc\" DevicePath \"\"" Jan 20 18:48:40 crc kubenswrapper[4773]: I0120 18:48:40.904492 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b313ef44-3ec0-4e2e-bc88-0187cce26783-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "b313ef44-3ec0-4e2e-bc88-0187cce26783" (UID: "b313ef44-3ec0-4e2e-bc88-0187cce26783"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:48:40 crc kubenswrapper[4773]: I0120 18:48:40.906995 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b313ef44-3ec0-4e2e-bc88-0187cce26783-kube-api-access-tdx2d" (OuterVolumeSpecName: "kube-api-access-tdx2d") pod "b313ef44-3ec0-4e2e-bc88-0187cce26783" (UID: "b313ef44-3ec0-4e2e-bc88-0187cce26783"). InnerVolumeSpecName "kube-api-access-tdx2d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:48:40 crc kubenswrapper[4773]: I0120 18:48:40.907510 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/181581ac-d6d3-4700-bfb7-7179a262a27c-kube-api-access-jqfjv" (OuterVolumeSpecName: "kube-api-access-jqfjv") pod "181581ac-d6d3-4700-bfb7-7179a262a27c" (UID: "181581ac-d6d3-4700-bfb7-7179a262a27c"). InnerVolumeSpecName "kube-api-access-jqfjv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:48:41 crc kubenswrapper[4773]: I0120 18:48:41.006090 4773 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/b313ef44-3ec0-4e2e-bc88-0187cce26783-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 18:48:41 crc kubenswrapper[4773]: I0120 18:48:41.006130 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tdx2d\" (UniqueName: \"kubernetes.io/projected/b313ef44-3ec0-4e2e-bc88-0187cce26783-kube-api-access-tdx2d\") on node \"crc\" DevicePath \"\"" Jan 20 18:48:41 crc kubenswrapper[4773]: I0120 18:48:41.006142 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jqfjv\" (UniqueName: \"kubernetes.io/projected/181581ac-d6d3-4700-bfb7-7179a262a27c-kube-api-access-jqfjv\") on node \"crc\" DevicePath \"\"" Jan 20 18:48:41 crc kubenswrapper[4773]: I0120 18:48:41.796655 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54f9b7b8d9-cgbwn" event={"ID":"d326b299-f619-4c76-9a10-045d77fa9bae","Type":"ContainerStarted","Data":"1fbc00a0e22d838407b29245dcbad537d566c09051250909671c6675451ecf6e"} Jan 20 18:48:41 crc kubenswrapper[4773]: I0120 18:48:41.796996 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-54f9b7b8d9-cgbwn" Jan 20 18:48:41 crc kubenswrapper[4773]: I0120 18:48:41.799062 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/keystone-db-sync-kmlg7" event={"ID":"49d41a48-da79-4b93-bf84-ab8b94fed1c1","Type":"ContainerStarted","Data":"a3323f05de08ce0588343432edfe259822e3b8065981e020292a5a4f0e1cd649"} Jan 20 18:48:41 crc kubenswrapper[4773]: I0120 18:48:41.827344 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-54f9b7b8d9-cgbwn" podStartSLOduration=6.827321286 podStartE2EDuration="6.827321286s" podCreationTimestamp="2026-01-20 18:48:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:48:41.819972249 +0000 UTC m=+1114.741785273" watchObservedRunningTime="2026-01-20 18:48:41.827321286 +0000 UTC m=+1114.749134310" Jan 20 18:48:41 crc kubenswrapper[4773]: I0120 18:48:41.843088 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-kmlg7" podStartSLOduration=3.15695669 podStartE2EDuration="7.843066392s" podCreationTimestamp="2026-01-20 18:48:34 +0000 UTC" firstStartedPulling="2026-01-20 18:48:35.754079358 +0000 UTC m=+1108.675892382" lastFinishedPulling="2026-01-20 18:48:40.44018906 +0000 UTC m=+1113.362002084" observedRunningTime="2026-01-20 18:48:41.837748665 +0000 UTC m=+1114.759561689" watchObservedRunningTime="2026-01-20 18:48:41.843066392 +0000 UTC m=+1114.764879416" Jan 20 18:48:44 crc kubenswrapper[4773]: I0120 18:48:44.828799 4773 generic.go:334] "Generic (PLEG): container finished" podID="49d41a48-da79-4b93-bf84-ab8b94fed1c1" containerID="a3323f05de08ce0588343432edfe259822e3b8065981e020292a5a4f0e1cd649" exitCode=0 Jan 20 18:48:44 crc kubenswrapper[4773]: I0120 18:48:44.828844 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-kmlg7" event={"ID":"49d41a48-da79-4b93-bf84-ab8b94fed1c1","Type":"ContainerDied","Data":"a3323f05de08ce0588343432edfe259822e3b8065981e020292a5a4f0e1cd649"} Jan 20 18:48:46 crc kubenswrapper[4773]: I0120 
18:48:46.163282 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-kmlg7" Jan 20 18:48:46 crc kubenswrapper[4773]: I0120 18:48:46.291572 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49d41a48-da79-4b93-bf84-ab8b94fed1c1-config-data\") pod \"49d41a48-da79-4b93-bf84-ab8b94fed1c1\" (UID: \"49d41a48-da79-4b93-bf84-ab8b94fed1c1\") " Jan 20 18:48:46 crc kubenswrapper[4773]: I0120 18:48:46.292343 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49d41a48-da79-4b93-bf84-ab8b94fed1c1-combined-ca-bundle\") pod \"49d41a48-da79-4b93-bf84-ab8b94fed1c1\" (UID: \"49d41a48-da79-4b93-bf84-ab8b94fed1c1\") " Jan 20 18:48:46 crc kubenswrapper[4773]: I0120 18:48:46.292418 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4p5kk\" (UniqueName: \"kubernetes.io/projected/49d41a48-da79-4b93-bf84-ab8b94fed1c1-kube-api-access-4p5kk\") pod \"49d41a48-da79-4b93-bf84-ab8b94fed1c1\" (UID: \"49d41a48-da79-4b93-bf84-ab8b94fed1c1\") " Jan 20 18:48:46 crc kubenswrapper[4773]: I0120 18:48:46.296836 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49d41a48-da79-4b93-bf84-ab8b94fed1c1-kube-api-access-4p5kk" (OuterVolumeSpecName: "kube-api-access-4p5kk") pod "49d41a48-da79-4b93-bf84-ab8b94fed1c1" (UID: "49d41a48-da79-4b93-bf84-ab8b94fed1c1"). InnerVolumeSpecName "kube-api-access-4p5kk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:48:46 crc kubenswrapper[4773]: I0120 18:48:46.318689 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49d41a48-da79-4b93-bf84-ab8b94fed1c1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "49d41a48-da79-4b93-bf84-ab8b94fed1c1" (UID: "49d41a48-da79-4b93-bf84-ab8b94fed1c1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:48:46 crc kubenswrapper[4773]: I0120 18:48:46.335531 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49d41a48-da79-4b93-bf84-ab8b94fed1c1-config-data" (OuterVolumeSpecName: "config-data") pod "49d41a48-da79-4b93-bf84-ab8b94fed1c1" (UID: "49d41a48-da79-4b93-bf84-ab8b94fed1c1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:48:46 crc kubenswrapper[4773]: I0120 18:48:46.349072 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-54f9b7b8d9-cgbwn" Jan 20 18:48:46 crc kubenswrapper[4773]: I0120 18:48:46.395434 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49d41a48-da79-4b93-bf84-ab8b94fed1c1-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 18:48:46 crc kubenswrapper[4773]: I0120 18:48:46.396152 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4p5kk\" (UniqueName: \"kubernetes.io/projected/49d41a48-da79-4b93-bf84-ab8b94fed1c1-kube-api-access-4p5kk\") on node \"crc\" DevicePath \"\"" Jan 20 18:48:46 crc kubenswrapper[4773]: I0120 18:48:46.396169 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/49d41a48-da79-4b93-bf84-ab8b94fed1c1-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 18:48:46 crc kubenswrapper[4773]: I0120 18:48:46.411717 4773 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-k2gpg"] Jan 20 18:48:46 crc kubenswrapper[4773]: I0120 18:48:46.412026 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-86db49b7ff-k2gpg" podUID="aba9326a-e499-43a8-9f50-4dc29d62c960" containerName="dnsmasq-dns" containerID="cri-o://ea4e2f85245dbb3b24a2d2a6ed359fd69004c830d495d4be0b7bfbd5a1629207" gracePeriod=10 Jan 20 18:48:46 crc kubenswrapper[4773]: I0120 18:48:46.857763 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-kmlg7" event={"ID":"49d41a48-da79-4b93-bf84-ab8b94fed1c1","Type":"ContainerDied","Data":"25c360ea8bbdacc06b094e0aea0702473fdf5418da4b85ceb4b7f209dcf64994"} Jan 20 18:48:46 crc kubenswrapper[4773]: I0120 18:48:46.857802 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="25c360ea8bbdacc06b094e0aea0702473fdf5418da4b85ceb4b7f209dcf64994" Jan 20 18:48:46 crc kubenswrapper[4773]: I0120 18:48:46.857826 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-kmlg7" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.037945 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-kkb9f"] Jan 20 18:48:47 crc kubenswrapper[4773]: E0120 18:48:47.038319 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b313ef44-3ec0-4e2e-bc88-0187cce26783" containerName="mariadb-account-create-update" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.038339 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="b313ef44-3ec0-4e2e-bc88-0187cce26783" containerName="mariadb-account-create-update" Jan 20 18:48:47 crc kubenswrapper[4773]: E0120 18:48:47.038350 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d813dade-efd1-404d-ae3f-ecea71ffb5ee" containerName="mariadb-database-create" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.038357 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="d813dade-efd1-404d-ae3f-ecea71ffb5ee" containerName="mariadb-database-create" Jan 20 18:48:47 crc kubenswrapper[4773]: E0120 18:48:47.038368 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49d41a48-da79-4b93-bf84-ab8b94fed1c1" containerName="keystone-db-sync" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.038374 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="49d41a48-da79-4b93-bf84-ab8b94fed1c1" containerName="keystone-db-sync" Jan 20 18:48:47 crc kubenswrapper[4773]: E0120 18:48:47.038387 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b742ea09-e1ce-4311-a9bf-7736d3ab235c" containerName="mariadb-database-create" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.038393 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="b742ea09-e1ce-4311-a9bf-7736d3ab235c" containerName="mariadb-database-create" Jan 20 18:48:47 crc kubenswrapper[4773]: E0120 18:48:47.038404 4773 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="181581ac-d6d3-4700-bfb7-7179a262a27c" containerName="mariadb-account-create-update" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.038410 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="181581ac-d6d3-4700-bfb7-7179a262a27c" containerName="mariadb-account-create-update" Jan 20 18:48:47 crc kubenswrapper[4773]: E0120 18:48:47.038419 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2544d2a-4467-4356-9aee-21a75f6efedc" containerName="mariadb-account-create-update" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.038426 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2544d2a-4467-4356-9aee-21a75f6efedc" containerName="mariadb-account-create-update" Jan 20 18:48:47 crc kubenswrapper[4773]: E0120 18:48:47.038436 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="be215ecb-8014-4db1-8eac-59f0d3dee870" containerName="mariadb-database-create" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.038442 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="be215ecb-8014-4db1-8eac-59f0d3dee870" containerName="mariadb-database-create" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.038579 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="be215ecb-8014-4db1-8eac-59f0d3dee870" containerName="mariadb-database-create" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.038599 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2544d2a-4467-4356-9aee-21a75f6efedc" containerName="mariadb-account-create-update" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.038608 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="d813dade-efd1-404d-ae3f-ecea71ffb5ee" containerName="mariadb-database-create" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.038617 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="b742ea09-e1ce-4311-a9bf-7736d3ab235c" containerName="mariadb-database-create" Jan 20 
18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.038624 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="49d41a48-da79-4b93-bf84-ab8b94fed1c1" containerName="keystone-db-sync" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.038633 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="181581ac-d6d3-4700-bfb7-7179a262a27c" containerName="mariadb-account-create-update" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.038641 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="b313ef44-3ec0-4e2e-bc88-0187cce26783" containerName="mariadb-account-create-update" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.039134 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-kkb9f" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.041110 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.041610 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-24qqg" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.041701 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.041814 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.041996 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.052961 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-kkb9f"] Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.066577 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6546db6db7-c4cvs"] Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 
18:48:47.068186 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6546db6db7-c4cvs" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.088116 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6546db6db7-c4cvs"] Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.111790 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03043146-8a8f-465e-b8c2-ca01d39cc070-combined-ca-bundle\") pod \"keystone-bootstrap-kkb9f\" (UID: \"03043146-8a8f-465e-b8c2-ca01d39cc070\") " pod="openstack/keystone-bootstrap-kkb9f" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.111865 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b34ae367-2e63-4e91-8c3f-ed0a2a827607-ovsdbserver-nb\") pod \"dnsmasq-dns-6546db6db7-c4cvs\" (UID: \"b34ae367-2e63-4e91-8c3f-ed0a2a827607\") " pod="openstack/dnsmasq-dns-6546db6db7-c4cvs" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.111890 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b34ae367-2e63-4e91-8c3f-ed0a2a827607-ovsdbserver-sb\") pod \"dnsmasq-dns-6546db6db7-c4cvs\" (UID: \"b34ae367-2e63-4e91-8c3f-ed0a2a827607\") " pod="openstack/dnsmasq-dns-6546db6db7-c4cvs" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.111919 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4kv7c\" (UniqueName: \"kubernetes.io/projected/b34ae367-2e63-4e91-8c3f-ed0a2a827607-kube-api-access-4kv7c\") pod \"dnsmasq-dns-6546db6db7-c4cvs\" (UID: \"b34ae367-2e63-4e91-8c3f-ed0a2a827607\") " pod="openstack/dnsmasq-dns-6546db6db7-c4cvs" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 
18:48:47.115075 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/03043146-8a8f-465e-b8c2-ca01d39cc070-credential-keys\") pod \"keystone-bootstrap-kkb9f\" (UID: \"03043146-8a8f-465e-b8c2-ca01d39cc070\") " pod="openstack/keystone-bootstrap-kkb9f" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.115235 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03043146-8a8f-465e-b8c2-ca01d39cc070-scripts\") pod \"keystone-bootstrap-kkb9f\" (UID: \"03043146-8a8f-465e-b8c2-ca01d39cc070\") " pod="openstack/keystone-bootstrap-kkb9f" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.115309 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b34ae367-2e63-4e91-8c3f-ed0a2a827607-config\") pod \"dnsmasq-dns-6546db6db7-c4cvs\" (UID: \"b34ae367-2e63-4e91-8c3f-ed0a2a827607\") " pod="openstack/dnsmasq-dns-6546db6db7-c4cvs" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.115404 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03043146-8a8f-465e-b8c2-ca01d39cc070-config-data\") pod \"keystone-bootstrap-kkb9f\" (UID: \"03043146-8a8f-465e-b8c2-ca01d39cc070\") " pod="openstack/keystone-bootstrap-kkb9f" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.115425 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhtrr\" (UniqueName: \"kubernetes.io/projected/03043146-8a8f-465e-b8c2-ca01d39cc070-kube-api-access-rhtrr\") pod \"keystone-bootstrap-kkb9f\" (UID: \"03043146-8a8f-465e-b8c2-ca01d39cc070\") " pod="openstack/keystone-bootstrap-kkb9f" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.115450 4773 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/03043146-8a8f-465e-b8c2-ca01d39cc070-fernet-keys\") pod \"keystone-bootstrap-kkb9f\" (UID: \"03043146-8a8f-465e-b8c2-ca01d39cc070\") " pod="openstack/keystone-bootstrap-kkb9f" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.115499 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b34ae367-2e63-4e91-8c3f-ed0a2a827607-dns-svc\") pod \"dnsmasq-dns-6546db6db7-c4cvs\" (UID: \"b34ae367-2e63-4e91-8c3f-ed0a2a827607\") " pod="openstack/dnsmasq-dns-6546db6db7-c4cvs" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.217245 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rhtrr\" (UniqueName: \"kubernetes.io/projected/03043146-8a8f-465e-b8c2-ca01d39cc070-kube-api-access-rhtrr\") pod \"keystone-bootstrap-kkb9f\" (UID: \"03043146-8a8f-465e-b8c2-ca01d39cc070\") " pod="openstack/keystone-bootstrap-kkb9f" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.217296 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/03043146-8a8f-465e-b8c2-ca01d39cc070-fernet-keys\") pod \"keystone-bootstrap-kkb9f\" (UID: \"03043146-8a8f-465e-b8c2-ca01d39cc070\") " pod="openstack/keystone-bootstrap-kkb9f" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.217330 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b34ae367-2e63-4e91-8c3f-ed0a2a827607-dns-svc\") pod \"dnsmasq-dns-6546db6db7-c4cvs\" (UID: \"b34ae367-2e63-4e91-8c3f-ed0a2a827607\") " pod="openstack/dnsmasq-dns-6546db6db7-c4cvs" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.217358 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03043146-8a8f-465e-b8c2-ca01d39cc070-combined-ca-bundle\") pod \"keystone-bootstrap-kkb9f\" (UID: \"03043146-8a8f-465e-b8c2-ca01d39cc070\") " pod="openstack/keystone-bootstrap-kkb9f" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.217388 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b34ae367-2e63-4e91-8c3f-ed0a2a827607-ovsdbserver-nb\") pod \"dnsmasq-dns-6546db6db7-c4cvs\" (UID: \"b34ae367-2e63-4e91-8c3f-ed0a2a827607\") " pod="openstack/dnsmasq-dns-6546db6db7-c4cvs" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.217403 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b34ae367-2e63-4e91-8c3f-ed0a2a827607-ovsdbserver-sb\") pod \"dnsmasq-dns-6546db6db7-c4cvs\" (UID: \"b34ae367-2e63-4e91-8c3f-ed0a2a827607\") " pod="openstack/dnsmasq-dns-6546db6db7-c4cvs" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.217423 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4kv7c\" (UniqueName: \"kubernetes.io/projected/b34ae367-2e63-4e91-8c3f-ed0a2a827607-kube-api-access-4kv7c\") pod \"dnsmasq-dns-6546db6db7-c4cvs\" (UID: \"b34ae367-2e63-4e91-8c3f-ed0a2a827607\") " pod="openstack/dnsmasq-dns-6546db6db7-c4cvs" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.217439 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/03043146-8a8f-465e-b8c2-ca01d39cc070-credential-keys\") pod \"keystone-bootstrap-kkb9f\" (UID: \"03043146-8a8f-465e-b8c2-ca01d39cc070\") " pod="openstack/keystone-bootstrap-kkb9f" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.217466 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/03043146-8a8f-465e-b8c2-ca01d39cc070-scripts\") pod \"keystone-bootstrap-kkb9f\" (UID: \"03043146-8a8f-465e-b8c2-ca01d39cc070\") " pod="openstack/keystone-bootstrap-kkb9f" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.217499 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b34ae367-2e63-4e91-8c3f-ed0a2a827607-config\") pod \"dnsmasq-dns-6546db6db7-c4cvs\" (UID: \"b34ae367-2e63-4e91-8c3f-ed0a2a827607\") " pod="openstack/dnsmasq-dns-6546db6db7-c4cvs" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.217550 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03043146-8a8f-465e-b8c2-ca01d39cc070-config-data\") pod \"keystone-bootstrap-kkb9f\" (UID: \"03043146-8a8f-465e-b8c2-ca01d39cc070\") " pod="openstack/keystone-bootstrap-kkb9f" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.218663 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b34ae367-2e63-4e91-8c3f-ed0a2a827607-ovsdbserver-sb\") pod \"dnsmasq-dns-6546db6db7-c4cvs\" (UID: \"b34ae367-2e63-4e91-8c3f-ed0a2a827607\") " pod="openstack/dnsmasq-dns-6546db6db7-c4cvs" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.218669 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b34ae367-2e63-4e91-8c3f-ed0a2a827607-ovsdbserver-nb\") pod \"dnsmasq-dns-6546db6db7-c4cvs\" (UID: \"b34ae367-2e63-4e91-8c3f-ed0a2a827607\") " pod="openstack/dnsmasq-dns-6546db6db7-c4cvs" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.221643 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/03043146-8a8f-465e-b8c2-ca01d39cc070-credential-keys\") pod \"keystone-bootstrap-kkb9f\" (UID: 
\"03043146-8a8f-465e-b8c2-ca01d39cc070\") " pod="openstack/keystone-bootstrap-kkb9f" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.222549 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b34ae367-2e63-4e91-8c3f-ed0a2a827607-config\") pod \"dnsmasq-dns-6546db6db7-c4cvs\" (UID: \"b34ae367-2e63-4e91-8c3f-ed0a2a827607\") " pod="openstack/dnsmasq-dns-6546db6db7-c4cvs" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.223440 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b34ae367-2e63-4e91-8c3f-ed0a2a827607-dns-svc\") pod \"dnsmasq-dns-6546db6db7-c4cvs\" (UID: \"b34ae367-2e63-4e91-8c3f-ed0a2a827607\") " pod="openstack/dnsmasq-dns-6546db6db7-c4cvs" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.226558 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03043146-8a8f-465e-b8c2-ca01d39cc070-config-data\") pod \"keystone-bootstrap-kkb9f\" (UID: \"03043146-8a8f-465e-b8c2-ca01d39cc070\") " pod="openstack/keystone-bootstrap-kkb9f" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.227459 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03043146-8a8f-465e-b8c2-ca01d39cc070-scripts\") pod \"keystone-bootstrap-kkb9f\" (UID: \"03043146-8a8f-465e-b8c2-ca01d39cc070\") " pod="openstack/keystone-bootstrap-kkb9f" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.227471 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/03043146-8a8f-465e-b8c2-ca01d39cc070-fernet-keys\") pod \"keystone-bootstrap-kkb9f\" (UID: \"03043146-8a8f-465e-b8c2-ca01d39cc070\") " pod="openstack/keystone-bootstrap-kkb9f" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.237755 4773 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03043146-8a8f-465e-b8c2-ca01d39cc070-combined-ca-bundle\") pod \"keystone-bootstrap-kkb9f\" (UID: \"03043146-8a8f-465e-b8c2-ca01d39cc070\") " pod="openstack/keystone-bootstrap-kkb9f" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.268005 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-22fkv"] Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.268988 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-22fkv" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.275386 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.275596 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-fjv2n" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.282996 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-7bbff4dff5-7w2k2"] Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.284775 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7bbff4dff5-7w2k2" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.297394 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-22fkv"] Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.297837 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.298048 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.300354 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.300669 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-6p7p9" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.300853 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.309374 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhtrr\" (UniqueName: \"kubernetes.io/projected/03043146-8a8f-465e-b8c2-ca01d39cc070-kube-api-access-rhtrr\") pod \"keystone-bootstrap-kkb9f\" (UID: \"03043146-8a8f-465e-b8c2-ca01d39cc070\") " pod="openstack/keystone-bootstrap-kkb9f" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.312828 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4kv7c\" (UniqueName: \"kubernetes.io/projected/b34ae367-2e63-4e91-8c3f-ed0a2a827607-kube-api-access-4kv7c\") pod \"dnsmasq-dns-6546db6db7-c4cvs\" (UID: \"b34ae367-2e63-4e91-8c3f-ed0a2a827607\") " pod="openstack/dnsmasq-dns-6546db6db7-c4cvs" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.316416 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7bbff4dff5-7w2k2"] Jan 20 
18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.324771 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b-etc-machine-id\") pod \"cinder-db-sync-22fkv\" (UID: \"3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b\") " pod="openstack/cinder-db-sync-22fkv" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.324813 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b-config-data\") pod \"cinder-db-sync-22fkv\" (UID: \"3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b\") " pod="openstack/cinder-db-sync-22fkv" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.324832 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdgkq\" (UniqueName: \"kubernetes.io/projected/3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b-kube-api-access-tdgkq\") pod \"cinder-db-sync-22fkv\" (UID: \"3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b\") " pod="openstack/cinder-db-sync-22fkv" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.324852 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b-scripts\") pod \"cinder-db-sync-22fkv\" (UID: \"3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b\") " pod="openstack/cinder-db-sync-22fkv" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.324875 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b-db-sync-config-data\") pod \"cinder-db-sync-22fkv\" (UID: \"3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b\") " pod="openstack/cinder-db-sync-22fkv" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 
18:48:47.324918 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b-combined-ca-bundle\") pod \"cinder-db-sync-22fkv\" (UID: \"3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b\") " pod="openstack/cinder-db-sync-22fkv" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.364830 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-kkb9f" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.407988 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-z8p6p"] Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.409757 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6546db6db7-c4cvs" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.416548 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-z8p6p" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.424860 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-rvh6j" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.425435 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.426421 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2c3949f0-4faa-4935-8d0f-7ce69d8de08d-horizon-secret-key\") pod \"horizon-7bbff4dff5-7w2k2\" (UID: \"2c3949f0-4faa-4935-8d0f-7ce69d8de08d\") " pod="openstack/horizon-7bbff4dff5-7w2k2" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.426541 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/2c3949f0-4faa-4935-8d0f-7ce69d8de08d-logs\") pod \"horizon-7bbff4dff5-7w2k2\" (UID: \"2c3949f0-4faa-4935-8d0f-7ce69d8de08d\") " pod="openstack/horizon-7bbff4dff5-7w2k2" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.426636 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b-etc-machine-id\") pod \"cinder-db-sync-22fkv\" (UID: \"3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b\") " pod="openstack/cinder-db-sync-22fkv" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.426727 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b-config-data\") pod \"cinder-db-sync-22fkv\" (UID: \"3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b\") " pod="openstack/cinder-db-sync-22fkv" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.426798 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tdgkq\" (UniqueName: \"kubernetes.io/projected/3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b-kube-api-access-tdgkq\") pod \"cinder-db-sync-22fkv\" (UID: \"3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b\") " pod="openstack/cinder-db-sync-22fkv" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.426877 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b-scripts\") pod \"cinder-db-sync-22fkv\" (UID: \"3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b\") " pod="openstack/cinder-db-sync-22fkv" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.426966 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2c3949f0-4faa-4935-8d0f-7ce69d8de08d-config-data\") pod \"horizon-7bbff4dff5-7w2k2\" (UID: 
\"2c3949f0-4faa-4935-8d0f-7ce69d8de08d\") " pod="openstack/horizon-7bbff4dff5-7w2k2" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.427053 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b-db-sync-config-data\") pod \"cinder-db-sync-22fkv\" (UID: \"3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b\") " pod="openstack/cinder-db-sync-22fkv" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.427134 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2c3949f0-4faa-4935-8d0f-7ce69d8de08d-scripts\") pod \"horizon-7bbff4dff5-7w2k2\" (UID: \"2c3949f0-4faa-4935-8d0f-7ce69d8de08d\") " pod="openstack/horizon-7bbff4dff5-7w2k2" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.427227 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b-combined-ca-bundle\") pod \"cinder-db-sync-22fkv\" (UID: \"3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b\") " pod="openstack/cinder-db-sync-22fkv" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.427302 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzp99\" (UniqueName: \"kubernetes.io/projected/2c3949f0-4faa-4935-8d0f-7ce69d8de08d-kube-api-access-xzp99\") pod \"horizon-7bbff4dff5-7w2k2\" (UID: \"2c3949f0-4faa-4935-8d0f-7ce69d8de08d\") " pod="openstack/horizon-7bbff4dff5-7w2k2" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.427448 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b-etc-machine-id\") pod \"cinder-db-sync-22fkv\" (UID: \"3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b\") " 
pod="openstack/cinder-db-sync-22fkv" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.433329 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b-scripts\") pod \"cinder-db-sync-22fkv\" (UID: \"3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b\") " pod="openstack/cinder-db-sync-22fkv" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.439066 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b-config-data\") pod \"cinder-db-sync-22fkv\" (UID: \"3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b\") " pod="openstack/cinder-db-sync-22fkv" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.442724 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b-combined-ca-bundle\") pod \"cinder-db-sync-22fkv\" (UID: \"3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b\") " pod="openstack/cinder-db-sync-22fkv" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.442874 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-z8p6p"] Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.444356 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b-db-sync-config-data\") pod \"cinder-db-sync-22fkv\" (UID: \"3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b\") " pod="openstack/cinder-db-sync-22fkv" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.484324 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdgkq\" (UniqueName: \"kubernetes.io/projected/3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b-kube-api-access-tdgkq\") pod \"cinder-db-sync-22fkv\" (UID: \"3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b\") " 
pod="openstack/cinder-db-sync-22fkv" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.496308 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.506368 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-49f25"] Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.507430 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-49f25" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.507985 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.514385 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.514992 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.515252 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.515441 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.515639 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-r9k85" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.527170 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-rz89h"] Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.528689 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjprd\" (UniqueName: \"kubernetes.io/projected/d9eee838-721f-48cc-a5aa-37644a62d846-kube-api-access-kjprd\") pod \"barbican-db-sync-z8p6p\" 
(UID: \"d9eee838-721f-48cc-a5aa-37644a62d846\") " pod="openstack/barbican-db-sync-z8p6p" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.528842 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzp99\" (UniqueName: \"kubernetes.io/projected/2c3949f0-4faa-4935-8d0f-7ce69d8de08d-kube-api-access-xzp99\") pod \"horizon-7bbff4dff5-7w2k2\" (UID: \"2c3949f0-4faa-4935-8d0f-7ce69d8de08d\") " pod="openstack/horizon-7bbff4dff5-7w2k2" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.529038 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2c3949f0-4faa-4935-8d0f-7ce69d8de08d-horizon-secret-key\") pod \"horizon-7bbff4dff5-7w2k2\" (UID: \"2c3949f0-4faa-4935-8d0f-7ce69d8de08d\") " pod="openstack/horizon-7bbff4dff5-7w2k2" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.529125 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2c3949f0-4faa-4935-8d0f-7ce69d8de08d-logs\") pod \"horizon-7bbff4dff5-7w2k2\" (UID: \"2c3949f0-4faa-4935-8d0f-7ce69d8de08d\") " pod="openstack/horizon-7bbff4dff5-7w2k2" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.529217 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9eee838-721f-48cc-a5aa-37644a62d846-combined-ca-bundle\") pod \"barbican-db-sync-z8p6p\" (UID: \"d9eee838-721f-48cc-a5aa-37644a62d846\") " pod="openstack/barbican-db-sync-z8p6p" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.529311 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2c3949f0-4faa-4935-8d0f-7ce69d8de08d-config-data\") pod \"horizon-7bbff4dff5-7w2k2\" (UID: \"2c3949f0-4faa-4935-8d0f-7ce69d8de08d\") " 
pod="openstack/horizon-7bbff4dff5-7w2k2" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.529404 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d9eee838-721f-48cc-a5aa-37644a62d846-db-sync-config-data\") pod \"barbican-db-sync-z8p6p\" (UID: \"d9eee838-721f-48cc-a5aa-37644a62d846\") " pod="openstack/barbican-db-sync-z8p6p" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.529490 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2c3949f0-4faa-4935-8d0f-7ce69d8de08d-scripts\") pod \"horizon-7bbff4dff5-7w2k2\" (UID: \"2c3949f0-4faa-4935-8d0f-7ce69d8de08d\") " pod="openstack/horizon-7bbff4dff5-7w2k2" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.530215 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2c3949f0-4faa-4935-8d0f-7ce69d8de08d-scripts\") pod \"horizon-7bbff4dff5-7w2k2\" (UID: \"2c3949f0-4faa-4935-8d0f-7ce69d8de08d\") " pod="openstack/horizon-7bbff4dff5-7w2k2" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.530525 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.530699 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-rz89h" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.531554 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2c3949f0-4faa-4935-8d0f-7ce69d8de08d-logs\") pod \"horizon-7bbff4dff5-7w2k2\" (UID: \"2c3949f0-4faa-4935-8d0f-7ce69d8de08d\") " pod="openstack/horizon-7bbff4dff5-7w2k2" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.533277 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2c3949f0-4faa-4935-8d0f-7ce69d8de08d-config-data\") pod \"horizon-7bbff4dff5-7w2k2\" (UID: \"2c3949f0-4faa-4935-8d0f-7ce69d8de08d\") " pod="openstack/horizon-7bbff4dff5-7w2k2" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.538426 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-7hwmj" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.538666 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2c3949f0-4faa-4935-8d0f-7ce69d8de08d-horizon-secret-key\") pod \"horizon-7bbff4dff5-7w2k2\" (UID: \"2c3949f0-4faa-4935-8d0f-7ce69d8de08d\") " pod="openstack/horizon-7bbff4dff5-7w2k2" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.538587 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.538727 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.573905 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzp99\" (UniqueName: \"kubernetes.io/projected/2c3949f0-4faa-4935-8d0f-7ce69d8de08d-kube-api-access-xzp99\") pod \"horizon-7bbff4dff5-7w2k2\" (UID: 
\"2c3949f0-4faa-4935-8d0f-7ce69d8de08d\") " pod="openstack/horizon-7bbff4dff5-7w2k2" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.574002 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6546db6db7-c4cvs"] Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.609140 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-586bf65fdf-tqctk"] Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.611131 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-586bf65fdf-tqctk" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.625882 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-49f25"] Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.632227 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0158a06a-bb30-4d75-904f-90a4c6307fd6-scripts\") pod \"placement-db-sync-49f25\" (UID: \"0158a06a-bb30-4d75-904f-90a4c6307fd6\") " pod="openstack/placement-db-sync-49f25" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.632351 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b66lw\" (UniqueName: \"kubernetes.io/projected/ec857182-f4b2-46cd-8b7f-fdbc443d8a1a-kube-api-access-b66lw\") pod \"neutron-db-sync-rz89h\" (UID: \"ec857182-f4b2-46cd-8b7f-fdbc443d8a1a\") " pod="openstack/neutron-db-sync-rz89h" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.632378 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mrxd\" (UniqueName: \"kubernetes.io/projected/077faa57-a75d-4f1a-b01e-3fc69ddb5761-kube-api-access-9mrxd\") pod \"ceilometer-0\" (UID: \"077faa57-a75d-4f1a-b01e-3fc69ddb5761\") " pod="openstack/ceilometer-0" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.632393 4773 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbz9l\" (UniqueName: \"kubernetes.io/projected/0158a06a-bb30-4d75-904f-90a4c6307fd6-kube-api-access-xbz9l\") pod \"placement-db-sync-49f25\" (UID: \"0158a06a-bb30-4d75-904f-90a4c6307fd6\") " pod="openstack/placement-db-sync-49f25" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.632451 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0158a06a-bb30-4d75-904f-90a4c6307fd6-combined-ca-bundle\") pod \"placement-db-sync-49f25\" (UID: \"0158a06a-bb30-4d75-904f-90a4c6307fd6\") " pod="openstack/placement-db-sync-49f25" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.632469 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9eee838-721f-48cc-a5aa-37644a62d846-combined-ca-bundle\") pod \"barbican-db-sync-z8p6p\" (UID: \"d9eee838-721f-48cc-a5aa-37644a62d846\") " pod="openstack/barbican-db-sync-z8p6p" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.632527 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec857182-f4b2-46cd-8b7f-fdbc443d8a1a-combined-ca-bundle\") pod \"neutron-db-sync-rz89h\" (UID: \"ec857182-f4b2-46cd-8b7f-fdbc443d8a1a\") " pod="openstack/neutron-db-sync-rz89h" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.632546 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/077faa57-a75d-4f1a-b01e-3fc69ddb5761-log-httpd\") pod \"ceilometer-0\" (UID: \"077faa57-a75d-4f1a-b01e-3fc69ddb5761\") " pod="openstack/ceilometer-0" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.632605 4773 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/077faa57-a75d-4f1a-b01e-3fc69ddb5761-run-httpd\") pod \"ceilometer-0\" (UID: \"077faa57-a75d-4f1a-b01e-3fc69ddb5761\") " pod="openstack/ceilometer-0" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.632628 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d9eee838-721f-48cc-a5aa-37644a62d846-db-sync-config-data\") pod \"barbican-db-sync-z8p6p\" (UID: \"d9eee838-721f-48cc-a5aa-37644a62d846\") " pod="openstack/barbican-db-sync-z8p6p" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.632679 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ec857182-f4b2-46cd-8b7f-fdbc443d8a1a-config\") pod \"neutron-db-sync-rz89h\" (UID: \"ec857182-f4b2-46cd-8b7f-fdbc443d8a1a\") " pod="openstack/neutron-db-sync-rz89h" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.632714 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0158a06a-bb30-4d75-904f-90a4c6307fd6-config-data\") pod \"placement-db-sync-49f25\" (UID: \"0158a06a-bb30-4d75-904f-90a4c6307fd6\") " pod="openstack/placement-db-sync-49f25" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.632760 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjprd\" (UniqueName: \"kubernetes.io/projected/d9eee838-721f-48cc-a5aa-37644a62d846-kube-api-access-kjprd\") pod \"barbican-db-sync-z8p6p\" (UID: \"d9eee838-721f-48cc-a5aa-37644a62d846\") " pod="openstack/barbican-db-sync-z8p6p" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.632789 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/077faa57-a75d-4f1a-b01e-3fc69ddb5761-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"077faa57-a75d-4f1a-b01e-3fc69ddb5761\") " pod="openstack/ceilometer-0" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.632973 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0158a06a-bb30-4d75-904f-90a4c6307fd6-logs\") pod \"placement-db-sync-49f25\" (UID: \"0158a06a-bb30-4d75-904f-90a4c6307fd6\") " pod="openstack/placement-db-sync-49f25" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.633046 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/077faa57-a75d-4f1a-b01e-3fc69ddb5761-scripts\") pod \"ceilometer-0\" (UID: \"077faa57-a75d-4f1a-b01e-3fc69ddb5761\") " pod="openstack/ceilometer-0" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.633102 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/077faa57-a75d-4f1a-b01e-3fc69ddb5761-config-data\") pod \"ceilometer-0\" (UID: \"077faa57-a75d-4f1a-b01e-3fc69ddb5761\") " pod="openstack/ceilometer-0" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.633215 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/077faa57-a75d-4f1a-b01e-3fc69ddb5761-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"077faa57-a75d-4f1a-b01e-3fc69ddb5761\") " pod="openstack/ceilometer-0" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.638701 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d9eee838-721f-48cc-a5aa-37644a62d846-db-sync-config-data\") pod \"barbican-db-sync-z8p6p\" (UID: 
\"d9eee838-721f-48cc-a5aa-37644a62d846\") " pod="openstack/barbican-db-sync-z8p6p" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.647114 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-rz89h"] Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.661049 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9eee838-721f-48cc-a5aa-37644a62d846-combined-ca-bundle\") pod \"barbican-db-sync-z8p6p\" (UID: \"d9eee838-721f-48cc-a5aa-37644a62d846\") " pod="openstack/barbican-db-sync-z8p6p" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.673722 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjprd\" (UniqueName: \"kubernetes.io/projected/d9eee838-721f-48cc-a5aa-37644a62d846-kube-api-access-kjprd\") pod \"barbican-db-sync-z8p6p\" (UID: \"d9eee838-721f-48cc-a5aa-37644a62d846\") " pod="openstack/barbican-db-sync-z8p6p" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.683090 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-586bf65fdf-tqctk"] Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.715094 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7987f74bbc-qwzhh"] Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.716437 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7987f74bbc-qwzhh" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.729343 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7987f74bbc-qwzhh"] Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.737037 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0158a06a-bb30-4d75-904f-90a4c6307fd6-config-data\") pod \"placement-db-sync-49f25\" (UID: \"0158a06a-bb30-4d75-904f-90a4c6307fd6\") " pod="openstack/placement-db-sync-49f25" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.737074 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/077faa57-a75d-4f1a-b01e-3fc69ddb5761-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"077faa57-a75d-4f1a-b01e-3fc69ddb5761\") " pod="openstack/ceilometer-0" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.737104 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0158a06a-bb30-4d75-904f-90a4c6307fd6-logs\") pod \"placement-db-sync-49f25\" (UID: \"0158a06a-bb30-4d75-904f-90a4c6307fd6\") " pod="openstack/placement-db-sync-49f25" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.737132 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/69d10de9-a03e-4020-8219-25cb3d9520a5-logs\") pod \"horizon-586bf65fdf-tqctk\" (UID: \"69d10de9-a03e-4020-8219-25cb3d9520a5\") " pod="openstack/horizon-586bf65fdf-tqctk" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.737170 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/077faa57-a75d-4f1a-b01e-3fc69ddb5761-scripts\") pod \"ceilometer-0\" (UID: 
\"077faa57-a75d-4f1a-b01e-3fc69ddb5761\") " pod="openstack/ceilometer-0" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.737204 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/077faa57-a75d-4f1a-b01e-3fc69ddb5761-config-data\") pod \"ceilometer-0\" (UID: \"077faa57-a75d-4f1a-b01e-3fc69ddb5761\") " pod="openstack/ceilometer-0" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.737234 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/077faa57-a75d-4f1a-b01e-3fc69ddb5761-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"077faa57-a75d-4f1a-b01e-3fc69ddb5761\") " pod="openstack/ceilometer-0" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.737264 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0158a06a-bb30-4d75-904f-90a4c6307fd6-scripts\") pod \"placement-db-sync-49f25\" (UID: \"0158a06a-bb30-4d75-904f-90a4c6307fd6\") " pod="openstack/placement-db-sync-49f25" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.737296 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b66lw\" (UniqueName: \"kubernetes.io/projected/ec857182-f4b2-46cd-8b7f-fdbc443d8a1a-kube-api-access-b66lw\") pod \"neutron-db-sync-rz89h\" (UID: \"ec857182-f4b2-46cd-8b7f-fdbc443d8a1a\") " pod="openstack/neutron-db-sync-rz89h" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.737327 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mrxd\" (UniqueName: \"kubernetes.io/projected/077faa57-a75d-4f1a-b01e-3fc69ddb5761-kube-api-access-9mrxd\") pod \"ceilometer-0\" (UID: \"077faa57-a75d-4f1a-b01e-3fc69ddb5761\") " pod="openstack/ceilometer-0" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.737349 4773 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-xbz9l\" (UniqueName: \"kubernetes.io/projected/0158a06a-bb30-4d75-904f-90a4c6307fd6-kube-api-access-xbz9l\") pod \"placement-db-sync-49f25\" (UID: \"0158a06a-bb30-4d75-904f-90a4c6307fd6\") " pod="openstack/placement-db-sync-49f25" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.737381 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/69d10de9-a03e-4020-8219-25cb3d9520a5-scripts\") pod \"horizon-586bf65fdf-tqctk\" (UID: \"69d10de9-a03e-4020-8219-25cb3d9520a5\") " pod="openstack/horizon-586bf65fdf-tqctk" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.737404 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0158a06a-bb30-4d75-904f-90a4c6307fd6-combined-ca-bundle\") pod \"placement-db-sync-49f25\" (UID: \"0158a06a-bb30-4d75-904f-90a4c6307fd6\") " pod="openstack/placement-db-sync-49f25" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.737429 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec857182-f4b2-46cd-8b7f-fdbc443d8a1a-combined-ca-bundle\") pod \"neutron-db-sync-rz89h\" (UID: \"ec857182-f4b2-46cd-8b7f-fdbc443d8a1a\") " pod="openstack/neutron-db-sync-rz89h" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.737452 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/077faa57-a75d-4f1a-b01e-3fc69ddb5761-log-httpd\") pod \"ceilometer-0\" (UID: \"077faa57-a75d-4f1a-b01e-3fc69ddb5761\") " pod="openstack/ceilometer-0" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.737472 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/69d10de9-a03e-4020-8219-25cb3d9520a5-config-data\") pod \"horizon-586bf65fdf-tqctk\" (UID: \"69d10de9-a03e-4020-8219-25cb3d9520a5\") " pod="openstack/horizon-586bf65fdf-tqctk" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.737507 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z7plc\" (UniqueName: \"kubernetes.io/projected/69d10de9-a03e-4020-8219-25cb3d9520a5-kube-api-access-z7plc\") pod \"horizon-586bf65fdf-tqctk\" (UID: \"69d10de9-a03e-4020-8219-25cb3d9520a5\") " pod="openstack/horizon-586bf65fdf-tqctk" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.737532 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/077faa57-a75d-4f1a-b01e-3fc69ddb5761-run-httpd\") pod \"ceilometer-0\" (UID: \"077faa57-a75d-4f1a-b01e-3fc69ddb5761\") " pod="openstack/ceilometer-0" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.737557 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/69d10de9-a03e-4020-8219-25cb3d9520a5-horizon-secret-key\") pod \"horizon-586bf65fdf-tqctk\" (UID: \"69d10de9-a03e-4020-8219-25cb3d9520a5\") " pod="openstack/horizon-586bf65fdf-tqctk" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.737591 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ec857182-f4b2-46cd-8b7f-fdbc443d8a1a-config\") pod \"neutron-db-sync-rz89h\" (UID: \"ec857182-f4b2-46cd-8b7f-fdbc443d8a1a\") " pod="openstack/neutron-db-sync-rz89h" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.738860 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/077faa57-a75d-4f1a-b01e-3fc69ddb5761-log-httpd\") pod \"ceilometer-0\" (UID: 
\"077faa57-a75d-4f1a-b01e-3fc69ddb5761\") " pod="openstack/ceilometer-0" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.741283 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/077faa57-a75d-4f1a-b01e-3fc69ddb5761-run-httpd\") pod \"ceilometer-0\" (UID: \"077faa57-a75d-4f1a-b01e-3fc69ddb5761\") " pod="openstack/ceilometer-0" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.742286 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0158a06a-bb30-4d75-904f-90a4c6307fd6-logs\") pod \"placement-db-sync-49f25\" (UID: \"0158a06a-bb30-4d75-904f-90a4c6307fd6\") " pod="openstack/placement-db-sync-49f25" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.742415 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/ec857182-f4b2-46cd-8b7f-fdbc443d8a1a-config\") pod \"neutron-db-sync-rz89h\" (UID: \"ec857182-f4b2-46cd-8b7f-fdbc443d8a1a\") " pod="openstack/neutron-db-sync-rz89h" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.747516 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/077faa57-a75d-4f1a-b01e-3fc69ddb5761-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"077faa57-a75d-4f1a-b01e-3fc69ddb5761\") " pod="openstack/ceilometer-0" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.757310 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mrxd\" (UniqueName: \"kubernetes.io/projected/077faa57-a75d-4f1a-b01e-3fc69ddb5761-kube-api-access-9mrxd\") pod \"ceilometer-0\" (UID: \"077faa57-a75d-4f1a-b01e-3fc69ddb5761\") " pod="openstack/ceilometer-0" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.763999 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0158a06a-bb30-4d75-904f-90a4c6307fd6-combined-ca-bundle\") pod \"placement-db-sync-49f25\" (UID: \"0158a06a-bb30-4d75-904f-90a4c6307fd6\") " pod="openstack/placement-db-sync-49f25" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.764745 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0158a06a-bb30-4d75-904f-90a4c6307fd6-config-data\") pod \"placement-db-sync-49f25\" (UID: \"0158a06a-bb30-4d75-904f-90a4c6307fd6\") " pod="openstack/placement-db-sync-49f25" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.764824 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/077faa57-a75d-4f1a-b01e-3fc69ddb5761-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"077faa57-a75d-4f1a-b01e-3fc69ddb5761\") " pod="openstack/ceilometer-0" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.765191 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-22fkv" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.765457 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/077faa57-a75d-4f1a-b01e-3fc69ddb5761-scripts\") pod \"ceilometer-0\" (UID: \"077faa57-a75d-4f1a-b01e-3fc69ddb5761\") " pod="openstack/ceilometer-0" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.765587 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec857182-f4b2-46cd-8b7f-fdbc443d8a1a-combined-ca-bundle\") pod \"neutron-db-sync-rz89h\" (UID: \"ec857182-f4b2-46cd-8b7f-fdbc443d8a1a\") " pod="openstack/neutron-db-sync-rz89h" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.765676 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0158a06a-bb30-4d75-904f-90a4c6307fd6-scripts\") pod \"placement-db-sync-49f25\" (UID: \"0158a06a-bb30-4d75-904f-90a4c6307fd6\") " pod="openstack/placement-db-sync-49f25" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.765720 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbz9l\" (UniqueName: \"kubernetes.io/projected/0158a06a-bb30-4d75-904f-90a4c6307fd6-kube-api-access-xbz9l\") pod \"placement-db-sync-49f25\" (UID: \"0158a06a-bb30-4d75-904f-90a4c6307fd6\") " pod="openstack/placement-db-sync-49f25" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.766097 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b66lw\" (UniqueName: \"kubernetes.io/projected/ec857182-f4b2-46cd-8b7f-fdbc443d8a1a-kube-api-access-b66lw\") pod \"neutron-db-sync-rz89h\" (UID: \"ec857182-f4b2-46cd-8b7f-fdbc443d8a1a\") " pod="openstack/neutron-db-sync-rz89h" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.766597 4773 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/077faa57-a75d-4f1a-b01e-3fc69ddb5761-config-data\") pod \"ceilometer-0\" (UID: \"077faa57-a75d-4f1a-b01e-3fc69ddb5761\") " pod="openstack/ceilometer-0" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.770408 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7bbff4dff5-7w2k2" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.811392 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-z8p6p" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.838573 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f667f5ef-cefc-40c0-a282-5d502cd45cd2-dns-svc\") pod \"dnsmasq-dns-7987f74bbc-qwzhh\" (UID: \"f667f5ef-cefc-40c0-a282-5d502cd45cd2\") " pod="openstack/dnsmasq-dns-7987f74bbc-qwzhh" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.838626 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/69d10de9-a03e-4020-8219-25cb3d9520a5-scripts\") pod \"horizon-586bf65fdf-tqctk\" (UID: \"69d10de9-a03e-4020-8219-25cb3d9520a5\") " pod="openstack/horizon-586bf65fdf-tqctk" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.838653 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f667f5ef-cefc-40c0-a282-5d502cd45cd2-config\") pod \"dnsmasq-dns-7987f74bbc-qwzhh\" (UID: \"f667f5ef-cefc-40c0-a282-5d502cd45cd2\") " pod="openstack/dnsmasq-dns-7987f74bbc-qwzhh" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.838671 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/69d10de9-a03e-4020-8219-25cb3d9520a5-config-data\") pod \"horizon-586bf65fdf-tqctk\" (UID: \"69d10de9-a03e-4020-8219-25cb3d9520a5\") " pod="openstack/horizon-586bf65fdf-tqctk" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.838701 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z7plc\" (UniqueName: \"kubernetes.io/projected/69d10de9-a03e-4020-8219-25cb3d9520a5-kube-api-access-z7plc\") pod \"horizon-586bf65fdf-tqctk\" (UID: \"69d10de9-a03e-4020-8219-25cb3d9520a5\") " pod="openstack/horizon-586bf65fdf-tqctk" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.838717 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48nn2\" (UniqueName: \"kubernetes.io/projected/f667f5ef-cefc-40c0-a282-5d502cd45cd2-kube-api-access-48nn2\") pod \"dnsmasq-dns-7987f74bbc-qwzhh\" (UID: \"f667f5ef-cefc-40c0-a282-5d502cd45cd2\") " pod="openstack/dnsmasq-dns-7987f74bbc-qwzhh" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.838739 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/69d10de9-a03e-4020-8219-25cb3d9520a5-horizon-secret-key\") pod \"horizon-586bf65fdf-tqctk\" (UID: \"69d10de9-a03e-4020-8219-25cb3d9520a5\") " pod="openstack/horizon-586bf65fdf-tqctk" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.838760 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f667f5ef-cefc-40c0-a282-5d502cd45cd2-ovsdbserver-sb\") pod \"dnsmasq-dns-7987f74bbc-qwzhh\" (UID: \"f667f5ef-cefc-40c0-a282-5d502cd45cd2\") " pod="openstack/dnsmasq-dns-7987f74bbc-qwzhh" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.838783 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/f667f5ef-cefc-40c0-a282-5d502cd45cd2-ovsdbserver-nb\") pod \"dnsmasq-dns-7987f74bbc-qwzhh\" (UID: \"f667f5ef-cefc-40c0-a282-5d502cd45cd2\") " pod="openstack/dnsmasq-dns-7987f74bbc-qwzhh" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.838825 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/69d10de9-a03e-4020-8219-25cb3d9520a5-logs\") pod \"horizon-586bf65fdf-tqctk\" (UID: \"69d10de9-a03e-4020-8219-25cb3d9520a5\") " pod="openstack/horizon-586bf65fdf-tqctk" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.839309 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/69d10de9-a03e-4020-8219-25cb3d9520a5-logs\") pod \"horizon-586bf65fdf-tqctk\" (UID: \"69d10de9-a03e-4020-8219-25cb3d9520a5\") " pod="openstack/horizon-586bf65fdf-tqctk" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.841538 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/69d10de9-a03e-4020-8219-25cb3d9520a5-config-data\") pod \"horizon-586bf65fdf-tqctk\" (UID: \"69d10de9-a03e-4020-8219-25cb3d9520a5\") " pod="openstack/horizon-586bf65fdf-tqctk" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.841760 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/69d10de9-a03e-4020-8219-25cb3d9520a5-scripts\") pod \"horizon-586bf65fdf-tqctk\" (UID: \"69d10de9-a03e-4020-8219-25cb3d9520a5\") " pod="openstack/horizon-586bf65fdf-tqctk" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.850554 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/69d10de9-a03e-4020-8219-25cb3d9520a5-horizon-secret-key\") pod \"horizon-586bf65fdf-tqctk\" (UID: 
\"69d10de9-a03e-4020-8219-25cb3d9520a5\") " pod="openstack/horizon-586bf65fdf-tqctk" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.865100 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-49f25" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.865402 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z7plc\" (UniqueName: \"kubernetes.io/projected/69d10de9-a03e-4020-8219-25cb3d9520a5-kube-api-access-z7plc\") pod \"horizon-586bf65fdf-tqctk\" (UID: \"69d10de9-a03e-4020-8219-25cb3d9520a5\") " pod="openstack/horizon-586bf65fdf-tqctk" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.868001 4773 generic.go:334] "Generic (PLEG): container finished" podID="aba9326a-e499-43a8-9f50-4dc29d62c960" containerID="ea4e2f85245dbb3b24a2d2a6ed359fd69004c830d495d4be0b7bfbd5a1629207" exitCode=0 Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.868037 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-k2gpg" event={"ID":"aba9326a-e499-43a8-9f50-4dc29d62c960","Type":"ContainerDied","Data":"ea4e2f85245dbb3b24a2d2a6ed359fd69004c830d495d4be0b7bfbd5a1629207"} Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.878822 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.899583 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-rz89h" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.940096 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f667f5ef-cefc-40c0-a282-5d502cd45cd2-dns-svc\") pod \"dnsmasq-dns-7987f74bbc-qwzhh\" (UID: \"f667f5ef-cefc-40c0-a282-5d502cd45cd2\") " pod="openstack/dnsmasq-dns-7987f74bbc-qwzhh" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.941044 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f667f5ef-cefc-40c0-a282-5d502cd45cd2-dns-svc\") pod \"dnsmasq-dns-7987f74bbc-qwzhh\" (UID: \"f667f5ef-cefc-40c0-a282-5d502cd45cd2\") " pod="openstack/dnsmasq-dns-7987f74bbc-qwzhh" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.941954 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f667f5ef-cefc-40c0-a282-5d502cd45cd2-config\") pod \"dnsmasq-dns-7987f74bbc-qwzhh\" (UID: \"f667f5ef-cefc-40c0-a282-5d502cd45cd2\") " pod="openstack/dnsmasq-dns-7987f74bbc-qwzhh" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.940157 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f667f5ef-cefc-40c0-a282-5d502cd45cd2-config\") pod \"dnsmasq-dns-7987f74bbc-qwzhh\" (UID: \"f667f5ef-cefc-40c0-a282-5d502cd45cd2\") " pod="openstack/dnsmasq-dns-7987f74bbc-qwzhh" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.944201 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48nn2\" (UniqueName: \"kubernetes.io/projected/f667f5ef-cefc-40c0-a282-5d502cd45cd2-kube-api-access-48nn2\") pod \"dnsmasq-dns-7987f74bbc-qwzhh\" (UID: \"f667f5ef-cefc-40c0-a282-5d502cd45cd2\") " pod="openstack/dnsmasq-dns-7987f74bbc-qwzhh" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 
18:48:47.944256 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f667f5ef-cefc-40c0-a282-5d502cd45cd2-ovsdbserver-sb\") pod \"dnsmasq-dns-7987f74bbc-qwzhh\" (UID: \"f667f5ef-cefc-40c0-a282-5d502cd45cd2\") " pod="openstack/dnsmasq-dns-7987f74bbc-qwzhh" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.944286 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f667f5ef-cefc-40c0-a282-5d502cd45cd2-ovsdbserver-nb\") pod \"dnsmasq-dns-7987f74bbc-qwzhh\" (UID: \"f667f5ef-cefc-40c0-a282-5d502cd45cd2\") " pod="openstack/dnsmasq-dns-7987f74bbc-qwzhh" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.945636 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f667f5ef-cefc-40c0-a282-5d502cd45cd2-ovsdbserver-nb\") pod \"dnsmasq-dns-7987f74bbc-qwzhh\" (UID: \"f667f5ef-cefc-40c0-a282-5d502cd45cd2\") " pod="openstack/dnsmasq-dns-7987f74bbc-qwzhh" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.946282 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f667f5ef-cefc-40c0-a282-5d502cd45cd2-ovsdbserver-sb\") pod \"dnsmasq-dns-7987f74bbc-qwzhh\" (UID: \"f667f5ef-cefc-40c0-a282-5d502cd45cd2\") " pod="openstack/dnsmasq-dns-7987f74bbc-qwzhh" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.964303 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48nn2\" (UniqueName: \"kubernetes.io/projected/f667f5ef-cefc-40c0-a282-5d502cd45cd2-kube-api-access-48nn2\") pod \"dnsmasq-dns-7987f74bbc-qwzhh\" (UID: \"f667f5ef-cefc-40c0-a282-5d502cd45cd2\") " pod="openstack/dnsmasq-dns-7987f74bbc-qwzhh" Jan 20 18:48:47 crc kubenswrapper[4773]: I0120 18:48:47.982397 4773 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/horizon-586bf65fdf-tqctk" Jan 20 18:48:48 crc kubenswrapper[4773]: I0120 18:48:48.050864 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7987f74bbc-qwzhh" Jan 20 18:48:48 crc kubenswrapper[4773]: I0120 18:48:48.111476 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-kkb9f"] Jan 20 18:48:48 crc kubenswrapper[4773]: I0120 18:48:48.116924 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6546db6db7-c4cvs"] Jan 20 18:48:48 crc kubenswrapper[4773]: W0120 18:48:48.195905 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb34ae367_2e63_4e91_8c3f_ed0a2a827607.slice/crio-3904f524ced231b24098761162857d1af7b13ea416a01a6c170ee13d87d90e49 WatchSource:0}: Error finding container 3904f524ced231b24098761162857d1af7b13ea416a01a6c170ee13d87d90e49: Status 404 returned error can't find the container with id 3904f524ced231b24098761162857d1af7b13ea416a01a6c170ee13d87d90e49 Jan 20 18:48:48 crc kubenswrapper[4773]: I0120 18:48:48.463774 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-k2gpg" Jan 20 18:48:48 crc kubenswrapper[4773]: I0120 18:48:48.562611 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wbd7l\" (UniqueName: \"kubernetes.io/projected/aba9326a-e499-43a8-9f50-4dc29d62c960-kube-api-access-wbd7l\") pod \"aba9326a-e499-43a8-9f50-4dc29d62c960\" (UID: \"aba9326a-e499-43a8-9f50-4dc29d62c960\") " Jan 20 18:48:48 crc kubenswrapper[4773]: I0120 18:48:48.562734 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aba9326a-e499-43a8-9f50-4dc29d62c960-dns-svc\") pod \"aba9326a-e499-43a8-9f50-4dc29d62c960\" (UID: \"aba9326a-e499-43a8-9f50-4dc29d62c960\") " Jan 20 18:48:48 crc kubenswrapper[4773]: I0120 18:48:48.562822 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aba9326a-e499-43a8-9f50-4dc29d62c960-config\") pod \"aba9326a-e499-43a8-9f50-4dc29d62c960\" (UID: \"aba9326a-e499-43a8-9f50-4dc29d62c960\") " Jan 20 18:48:48 crc kubenswrapper[4773]: I0120 18:48:48.562848 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aba9326a-e499-43a8-9f50-4dc29d62c960-ovsdbserver-nb\") pod \"aba9326a-e499-43a8-9f50-4dc29d62c960\" (UID: \"aba9326a-e499-43a8-9f50-4dc29d62c960\") " Jan 20 18:48:48 crc kubenswrapper[4773]: I0120 18:48:48.562870 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aba9326a-e499-43a8-9f50-4dc29d62c960-ovsdbserver-sb\") pod \"aba9326a-e499-43a8-9f50-4dc29d62c960\" (UID: \"aba9326a-e499-43a8-9f50-4dc29d62c960\") " Jan 20 18:48:48 crc kubenswrapper[4773]: I0120 18:48:48.578544 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/aba9326a-e499-43a8-9f50-4dc29d62c960-kube-api-access-wbd7l" (OuterVolumeSpecName: "kube-api-access-wbd7l") pod "aba9326a-e499-43a8-9f50-4dc29d62c960" (UID: "aba9326a-e499-43a8-9f50-4dc29d62c960"). InnerVolumeSpecName "kube-api-access-wbd7l". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:48:48 crc kubenswrapper[4773]: I0120 18:48:48.606494 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aba9326a-e499-43a8-9f50-4dc29d62c960-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "aba9326a-e499-43a8-9f50-4dc29d62c960" (UID: "aba9326a-e499-43a8-9f50-4dc29d62c960"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:48:48 crc kubenswrapper[4773]: I0120 18:48:48.607709 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aba9326a-e499-43a8-9f50-4dc29d62c960-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "aba9326a-e499-43a8-9f50-4dc29d62c960" (UID: "aba9326a-e499-43a8-9f50-4dc29d62c960"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:48:48 crc kubenswrapper[4773]: I0120 18:48:48.609150 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aba9326a-e499-43a8-9f50-4dc29d62c960-config" (OuterVolumeSpecName: "config") pod "aba9326a-e499-43a8-9f50-4dc29d62c960" (UID: "aba9326a-e499-43a8-9f50-4dc29d62c960"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:48:48 crc kubenswrapper[4773]: I0120 18:48:48.616032 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aba9326a-e499-43a8-9f50-4dc29d62c960-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "aba9326a-e499-43a8-9f50-4dc29d62c960" (UID: "aba9326a-e499-43a8-9f50-4dc29d62c960"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:48:48 crc kubenswrapper[4773]: I0120 18:48:48.665239 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wbd7l\" (UniqueName: \"kubernetes.io/projected/aba9326a-e499-43a8-9f50-4dc29d62c960-kube-api-access-wbd7l\") on node \"crc\" DevicePath \"\"" Jan 20 18:48:48 crc kubenswrapper[4773]: I0120 18:48:48.665590 4773 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aba9326a-e499-43a8-9f50-4dc29d62c960-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 20 18:48:48 crc kubenswrapper[4773]: I0120 18:48:48.665603 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aba9326a-e499-43a8-9f50-4dc29d62c960-config\") on node \"crc\" DevicePath \"\"" Jan 20 18:48:48 crc kubenswrapper[4773]: I0120 18:48:48.665614 4773 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aba9326a-e499-43a8-9f50-4dc29d62c960-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 20 18:48:48 crc kubenswrapper[4773]: I0120 18:48:48.665625 4773 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aba9326a-e499-43a8-9f50-4dc29d62c960-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 20 18:48:48 crc kubenswrapper[4773]: I0120 18:48:48.689150 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-7bbff4dff5-7w2k2"] Jan 20 18:48:48 crc kubenswrapper[4773]: I0120 18:48:48.697310 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-z8p6p"] Jan 20 18:48:48 crc kubenswrapper[4773]: I0120 18:48:48.786873 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-49f25"] Jan 20 18:48:48 crc kubenswrapper[4773]: I0120 18:48:48.798662 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/cinder-db-sync-22fkv"] Jan 20 18:48:48 crc kubenswrapper[4773]: I0120 18:48:48.807364 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-rz89h"] Jan 20 18:48:48 crc kubenswrapper[4773]: W0120 18:48:48.814076 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podec857182_f4b2_46cd_8b7f_fdbc443d8a1a.slice/crio-f998b2c2ff801a56c036c6f7586e8cd6607b4ed8aa6a501098037fc5de64b71c WatchSource:0}: Error finding container f998b2c2ff801a56c036c6f7586e8cd6607b4ed8aa6a501098037fc5de64b71c: Status 404 returned error can't find the container with id f998b2c2ff801a56c036c6f7586e8cd6607b4ed8aa6a501098037fc5de64b71c Jan 20 18:48:48 crc kubenswrapper[4773]: I0120 18:48:48.881538 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-kkb9f" event={"ID":"03043146-8a8f-465e-b8c2-ca01d39cc070","Type":"ContainerStarted","Data":"a3485916bab8c305597e5171f6d49c43821b0b823ee634ce2d2a67cfaa6f6ad9"} Jan 20 18:48:48 crc kubenswrapper[4773]: I0120 18:48:48.881584 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-kkb9f" event={"ID":"03043146-8a8f-465e-b8c2-ca01d39cc070","Type":"ContainerStarted","Data":"a72d0b517337335c8270dbc86cc90184c82c27aa6edf416c70a51469398c8f1b"} Jan 20 18:48:48 crc kubenswrapper[4773]: I0120 18:48:48.882431 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-49f25" event={"ID":"0158a06a-bb30-4d75-904f-90a4c6307fd6","Type":"ContainerStarted","Data":"b51bc15af24c493f96b3eab7ab99bb433e14f99c2a5d9c6b5812e8fdfec3a8c6"} Jan 20 18:48:48 crc kubenswrapper[4773]: I0120 18:48:48.885406 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7bbff4dff5-7w2k2" event={"ID":"2c3949f0-4faa-4935-8d0f-7ce69d8de08d","Type":"ContainerStarted","Data":"3096768a065586cfd8bc32ac4cab442fcebf027a669b7a4ad518113982f3c163"} Jan 20 18:48:48 
crc kubenswrapper[4773]: I0120 18:48:48.889126 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-86db49b7ff-k2gpg" event={"ID":"aba9326a-e499-43a8-9f50-4dc29d62c960","Type":"ContainerDied","Data":"4264d71b8e07e5eb9a8f6822d3dd22ead9577270d589def4baeb9a9c2e4760f2"} Jan 20 18:48:48 crc kubenswrapper[4773]: I0120 18:48:48.889165 4773 scope.go:117] "RemoveContainer" containerID="ea4e2f85245dbb3b24a2d2a6ed359fd69004c830d495d4be0b7bfbd5a1629207" Jan 20 18:48:48 crc kubenswrapper[4773]: I0120 18:48:48.889303 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-86db49b7ff-k2gpg" Jan 20 18:48:48 crc kubenswrapper[4773]: I0120 18:48:48.904135 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-rz89h" event={"ID":"ec857182-f4b2-46cd-8b7f-fdbc443d8a1a","Type":"ContainerStarted","Data":"f998b2c2ff801a56c036c6f7586e8cd6607b4ed8aa6a501098037fc5de64b71c"} Jan 20 18:48:48 crc kubenswrapper[4773]: I0120 18:48:48.906219 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-kkb9f" podStartSLOduration=1.9061948370000001 podStartE2EDuration="1.906194837s" podCreationTimestamp="2026-01-20 18:48:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:48:48.904464866 +0000 UTC m=+1121.826277900" watchObservedRunningTime="2026-01-20 18:48:48.906194837 +0000 UTC m=+1121.828007861" Jan 20 18:48:48 crc kubenswrapper[4773]: I0120 18:48:48.909389 4773 generic.go:334] "Generic (PLEG): container finished" podID="b34ae367-2e63-4e91-8c3f-ed0a2a827607" containerID="621615eabffd10d70a89b0b2941e83b4834a489bd87f8668e12ca27326d38aa9" exitCode=0 Jan 20 18:48:48 crc kubenswrapper[4773]: I0120 18:48:48.909497 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6546db6db7-c4cvs" 
event={"ID":"b34ae367-2e63-4e91-8c3f-ed0a2a827607","Type":"ContainerDied","Data":"621615eabffd10d70a89b0b2941e83b4834a489bd87f8668e12ca27326d38aa9"} Jan 20 18:48:48 crc kubenswrapper[4773]: I0120 18:48:48.909537 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6546db6db7-c4cvs" event={"ID":"b34ae367-2e63-4e91-8c3f-ed0a2a827607","Type":"ContainerStarted","Data":"3904f524ced231b24098761162857d1af7b13ea416a01a6c170ee13d87d90e49"} Jan 20 18:48:48 crc kubenswrapper[4773]: I0120 18:48:48.914387 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-z8p6p" event={"ID":"d9eee838-721f-48cc-a5aa-37644a62d846","Type":"ContainerStarted","Data":"aef79cffc84cc91679ae3ca7b132c652022d5b062e4c613e53887b19256d63c1"} Jan 20 18:48:48 crc kubenswrapper[4773]: I0120 18:48:48.917432 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-22fkv" event={"ID":"3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b","Type":"ContainerStarted","Data":"2518423b901342e321aef89bcd042955aa2c984bc05c23caf50e31eea1f65a99"} Jan 20 18:48:48 crc kubenswrapper[4773]: I0120 18:48:48.928613 4773 scope.go:117] "RemoveContainer" containerID="ce5e0274d2b85b8e9dc275c7b174cafb35ea6aacac9812349bb550681a566a4f" Jan 20 18:48:48 crc kubenswrapper[4773]: I0120 18:48:48.964064 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-k2gpg"] Jan 20 18:48:48 crc kubenswrapper[4773]: I0120 18:48:48.984997 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-86db49b7ff-k2gpg"] Jan 20 18:48:49 crc kubenswrapper[4773]: I0120 18:48:49.004971 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7987f74bbc-qwzhh"] Jan 20 18:48:49 crc kubenswrapper[4773]: W0120 18:48:49.009283 4773 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf667f5ef_cefc_40c0_a282_5d502cd45cd2.slice/crio-454cbc1be6d71f7e2439e8567821a33608afd38b017a46c05057d7d0155426fd WatchSource:0}: Error finding container 454cbc1be6d71f7e2439e8567821a33608afd38b017a46c05057d7d0155426fd: Status 404 returned error can't find the container with id 454cbc1be6d71f7e2439e8567821a33608afd38b017a46c05057d7d0155426fd Jan 20 18:48:49 crc kubenswrapper[4773]: I0120 18:48:49.027712 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-586bf65fdf-tqctk"] Jan 20 18:48:49 crc kubenswrapper[4773]: I0120 18:48:49.057481 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 20 18:48:49 crc kubenswrapper[4773]: I0120 18:48:49.133007 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-586bf65fdf-tqctk"] Jan 20 18:48:49 crc kubenswrapper[4773]: I0120 18:48:49.136763 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 20 18:48:49 crc kubenswrapper[4773]: I0120 18:48:49.191228 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-864c6579d5-v5vdm"] Jan 20 18:48:49 crc kubenswrapper[4773]: E0120 18:48:49.191637 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aba9326a-e499-43a8-9f50-4dc29d62c960" containerName="init" Jan 20 18:48:49 crc kubenswrapper[4773]: I0120 18:48:49.191655 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="aba9326a-e499-43a8-9f50-4dc29d62c960" containerName="init" Jan 20 18:48:49 crc kubenswrapper[4773]: E0120 18:48:49.191670 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aba9326a-e499-43a8-9f50-4dc29d62c960" containerName="dnsmasq-dns" Jan 20 18:48:49 crc kubenswrapper[4773]: I0120 18:48:49.191677 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="aba9326a-e499-43a8-9f50-4dc29d62c960" containerName="dnsmasq-dns" Jan 20 18:48:49 crc kubenswrapper[4773]: I0120 
18:48:49.191828 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="aba9326a-e499-43a8-9f50-4dc29d62c960" containerName="dnsmasq-dns" Jan 20 18:48:49 crc kubenswrapper[4773]: I0120 18:48:49.210892 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-864c6579d5-v5vdm"] Jan 20 18:48:49 crc kubenswrapper[4773]: I0120 18:48:49.211169 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-864c6579d5-v5vdm" Jan 20 18:48:49 crc kubenswrapper[4773]: I0120 18:48:49.280785 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/609b419f-cc52-4fef-aa49-f64cdbba6755-scripts\") pod \"horizon-864c6579d5-v5vdm\" (UID: \"609b419f-cc52-4fef-aa49-f64cdbba6755\") " pod="openstack/horizon-864c6579d5-v5vdm" Jan 20 18:48:49 crc kubenswrapper[4773]: I0120 18:48:49.281056 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/609b419f-cc52-4fef-aa49-f64cdbba6755-config-data\") pod \"horizon-864c6579d5-v5vdm\" (UID: \"609b419f-cc52-4fef-aa49-f64cdbba6755\") " pod="openstack/horizon-864c6579d5-v5vdm" Jan 20 18:48:49 crc kubenswrapper[4773]: I0120 18:48:49.281209 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/609b419f-cc52-4fef-aa49-f64cdbba6755-logs\") pod \"horizon-864c6579d5-v5vdm\" (UID: \"609b419f-cc52-4fef-aa49-f64cdbba6755\") " pod="openstack/horizon-864c6579d5-v5vdm" Jan 20 18:48:49 crc kubenswrapper[4773]: I0120 18:48:49.281462 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpvqk\" (UniqueName: \"kubernetes.io/projected/609b419f-cc52-4fef-aa49-f64cdbba6755-kube-api-access-tpvqk\") pod \"horizon-864c6579d5-v5vdm\" (UID: 
\"609b419f-cc52-4fef-aa49-f64cdbba6755\") " pod="openstack/horizon-864c6579d5-v5vdm" Jan 20 18:48:49 crc kubenswrapper[4773]: I0120 18:48:49.281567 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/609b419f-cc52-4fef-aa49-f64cdbba6755-horizon-secret-key\") pod \"horizon-864c6579d5-v5vdm\" (UID: \"609b419f-cc52-4fef-aa49-f64cdbba6755\") " pod="openstack/horizon-864c6579d5-v5vdm" Jan 20 18:48:49 crc kubenswrapper[4773]: I0120 18:48:49.382901 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpvqk\" (UniqueName: \"kubernetes.io/projected/609b419f-cc52-4fef-aa49-f64cdbba6755-kube-api-access-tpvqk\") pod \"horizon-864c6579d5-v5vdm\" (UID: \"609b419f-cc52-4fef-aa49-f64cdbba6755\") " pod="openstack/horizon-864c6579d5-v5vdm" Jan 20 18:48:49 crc kubenswrapper[4773]: I0120 18:48:49.382968 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/609b419f-cc52-4fef-aa49-f64cdbba6755-horizon-secret-key\") pod \"horizon-864c6579d5-v5vdm\" (UID: \"609b419f-cc52-4fef-aa49-f64cdbba6755\") " pod="openstack/horizon-864c6579d5-v5vdm" Jan 20 18:48:49 crc kubenswrapper[4773]: I0120 18:48:49.383001 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/609b419f-cc52-4fef-aa49-f64cdbba6755-scripts\") pod \"horizon-864c6579d5-v5vdm\" (UID: \"609b419f-cc52-4fef-aa49-f64cdbba6755\") " pod="openstack/horizon-864c6579d5-v5vdm" Jan 20 18:48:49 crc kubenswrapper[4773]: I0120 18:48:49.383031 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/609b419f-cc52-4fef-aa49-f64cdbba6755-config-data\") pod \"horizon-864c6579d5-v5vdm\" (UID: \"609b419f-cc52-4fef-aa49-f64cdbba6755\") " 
pod="openstack/horizon-864c6579d5-v5vdm" Jan 20 18:48:49 crc kubenswrapper[4773]: I0120 18:48:49.383053 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/609b419f-cc52-4fef-aa49-f64cdbba6755-logs\") pod \"horizon-864c6579d5-v5vdm\" (UID: \"609b419f-cc52-4fef-aa49-f64cdbba6755\") " pod="openstack/horizon-864c6579d5-v5vdm" Jan 20 18:48:49 crc kubenswrapper[4773]: I0120 18:48:49.383516 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/609b419f-cc52-4fef-aa49-f64cdbba6755-logs\") pod \"horizon-864c6579d5-v5vdm\" (UID: \"609b419f-cc52-4fef-aa49-f64cdbba6755\") " pod="openstack/horizon-864c6579d5-v5vdm" Jan 20 18:48:49 crc kubenswrapper[4773]: I0120 18:48:49.384631 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/609b419f-cc52-4fef-aa49-f64cdbba6755-scripts\") pod \"horizon-864c6579d5-v5vdm\" (UID: \"609b419f-cc52-4fef-aa49-f64cdbba6755\") " pod="openstack/horizon-864c6579d5-v5vdm" Jan 20 18:48:49 crc kubenswrapper[4773]: I0120 18:48:49.385666 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/609b419f-cc52-4fef-aa49-f64cdbba6755-config-data\") pod \"horizon-864c6579d5-v5vdm\" (UID: \"609b419f-cc52-4fef-aa49-f64cdbba6755\") " pod="openstack/horizon-864c6579d5-v5vdm" Jan 20 18:48:49 crc kubenswrapper[4773]: I0120 18:48:49.388164 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/609b419f-cc52-4fef-aa49-f64cdbba6755-horizon-secret-key\") pod \"horizon-864c6579d5-v5vdm\" (UID: \"609b419f-cc52-4fef-aa49-f64cdbba6755\") " pod="openstack/horizon-864c6579d5-v5vdm" Jan 20 18:48:49 crc kubenswrapper[4773]: I0120 18:48:49.421493 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-tpvqk\" (UniqueName: \"kubernetes.io/projected/609b419f-cc52-4fef-aa49-f64cdbba6755-kube-api-access-tpvqk\") pod \"horizon-864c6579d5-v5vdm\" (UID: \"609b419f-cc52-4fef-aa49-f64cdbba6755\") " pod="openstack/horizon-864c6579d5-v5vdm" Jan 20 18:48:49 crc kubenswrapper[4773]: I0120 18:48:49.504039 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aba9326a-e499-43a8-9f50-4dc29d62c960" path="/var/lib/kubelet/pods/aba9326a-e499-43a8-9f50-4dc29d62c960/volumes" Jan 20 18:48:49 crc kubenswrapper[4773]: I0120 18:48:49.544221 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6546db6db7-c4cvs" Jan 20 18:48:49 crc kubenswrapper[4773]: I0120 18:48:49.587958 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b34ae367-2e63-4e91-8c3f-ed0a2a827607-ovsdbserver-sb\") pod \"b34ae367-2e63-4e91-8c3f-ed0a2a827607\" (UID: \"b34ae367-2e63-4e91-8c3f-ed0a2a827607\") " Jan 20 18:48:49 crc kubenswrapper[4773]: I0120 18:48:49.588061 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4kv7c\" (UniqueName: \"kubernetes.io/projected/b34ae367-2e63-4e91-8c3f-ed0a2a827607-kube-api-access-4kv7c\") pod \"b34ae367-2e63-4e91-8c3f-ed0a2a827607\" (UID: \"b34ae367-2e63-4e91-8c3f-ed0a2a827607\") " Jan 20 18:48:49 crc kubenswrapper[4773]: I0120 18:48:49.588124 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b34ae367-2e63-4e91-8c3f-ed0a2a827607-ovsdbserver-nb\") pod \"b34ae367-2e63-4e91-8c3f-ed0a2a827607\" (UID: \"b34ae367-2e63-4e91-8c3f-ed0a2a827607\") " Jan 20 18:48:49 crc kubenswrapper[4773]: I0120 18:48:49.588322 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/b34ae367-2e63-4e91-8c3f-ed0a2a827607-config\") pod \"b34ae367-2e63-4e91-8c3f-ed0a2a827607\" (UID: \"b34ae367-2e63-4e91-8c3f-ed0a2a827607\") " Jan 20 18:48:49 crc kubenswrapper[4773]: I0120 18:48:49.588370 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b34ae367-2e63-4e91-8c3f-ed0a2a827607-dns-svc\") pod \"b34ae367-2e63-4e91-8c3f-ed0a2a827607\" (UID: \"b34ae367-2e63-4e91-8c3f-ed0a2a827607\") " Jan 20 18:48:49 crc kubenswrapper[4773]: I0120 18:48:49.597716 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b34ae367-2e63-4e91-8c3f-ed0a2a827607-kube-api-access-4kv7c" (OuterVolumeSpecName: "kube-api-access-4kv7c") pod "b34ae367-2e63-4e91-8c3f-ed0a2a827607" (UID: "b34ae367-2e63-4e91-8c3f-ed0a2a827607"). InnerVolumeSpecName "kube-api-access-4kv7c". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:48:49 crc kubenswrapper[4773]: I0120 18:48:49.611330 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b34ae367-2e63-4e91-8c3f-ed0a2a827607-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b34ae367-2e63-4e91-8c3f-ed0a2a827607" (UID: "b34ae367-2e63-4e91-8c3f-ed0a2a827607"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:48:49 crc kubenswrapper[4773]: I0120 18:48:49.615486 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b34ae367-2e63-4e91-8c3f-ed0a2a827607-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b34ae367-2e63-4e91-8c3f-ed0a2a827607" (UID: "b34ae367-2e63-4e91-8c3f-ed0a2a827607"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:48:49 crc kubenswrapper[4773]: I0120 18:48:49.618225 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b34ae367-2e63-4e91-8c3f-ed0a2a827607-config" (OuterVolumeSpecName: "config") pod "b34ae367-2e63-4e91-8c3f-ed0a2a827607" (UID: "b34ae367-2e63-4e91-8c3f-ed0a2a827607"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:48:49 crc kubenswrapper[4773]: I0120 18:48:49.625537 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b34ae367-2e63-4e91-8c3f-ed0a2a827607-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b34ae367-2e63-4e91-8c3f-ed0a2a827607" (UID: "b34ae367-2e63-4e91-8c3f-ed0a2a827607"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:48:49 crc kubenswrapper[4773]: I0120 18:48:49.640991 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-864c6579d5-v5vdm" Jan 20 18:48:49 crc kubenswrapper[4773]: I0120 18:48:49.696597 4773 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b34ae367-2e63-4e91-8c3f-ed0a2a827607-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 20 18:48:49 crc kubenswrapper[4773]: I0120 18:48:49.696633 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4kv7c\" (UniqueName: \"kubernetes.io/projected/b34ae367-2e63-4e91-8c3f-ed0a2a827607-kube-api-access-4kv7c\") on node \"crc\" DevicePath \"\"" Jan 20 18:48:49 crc kubenswrapper[4773]: I0120 18:48:49.696648 4773 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b34ae367-2e63-4e91-8c3f-ed0a2a827607-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 20 18:48:49 crc kubenswrapper[4773]: I0120 18:48:49.696661 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b34ae367-2e63-4e91-8c3f-ed0a2a827607-config\") on node \"crc\" DevicePath \"\"" Jan 20 18:48:49 crc kubenswrapper[4773]: I0120 18:48:49.696673 4773 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b34ae367-2e63-4e91-8c3f-ed0a2a827607-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 20 18:48:49 crc kubenswrapper[4773]: I0120 18:48:49.927912 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6546db6db7-c4cvs" event={"ID":"b34ae367-2e63-4e91-8c3f-ed0a2a827607","Type":"ContainerDied","Data":"3904f524ced231b24098761162857d1af7b13ea416a01a6c170ee13d87d90e49"} Jan 20 18:48:49 crc kubenswrapper[4773]: I0120 18:48:49.928242 4773 scope.go:117] "RemoveContainer" containerID="621615eabffd10d70a89b0b2941e83b4834a489bd87f8668e12ca27326d38aa9" Jan 20 18:48:49 crc kubenswrapper[4773]: I0120 18:48:49.928374 4773 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openstack/dnsmasq-dns-6546db6db7-c4cvs" Jan 20 18:48:49 crc kubenswrapper[4773]: I0120 18:48:49.937151 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-586bf65fdf-tqctk" event={"ID":"69d10de9-a03e-4020-8219-25cb3d9520a5","Type":"ContainerStarted","Data":"0b60f30c0a7ab6c9d2a1ba89c628d4ad9f1438793f4c8b9c55bf5b64d977ae2b"} Jan 20 18:48:49 crc kubenswrapper[4773]: I0120 18:48:49.939157 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"077faa57-a75d-4f1a-b01e-3fc69ddb5761","Type":"ContainerStarted","Data":"d0e8ddb6dbdcbfbf1e26c6a891d80cf5f965501af473ae07fda9dcc295cac646"} Jan 20 18:48:49 crc kubenswrapper[4773]: I0120 18:48:49.942200 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-rz89h" event={"ID":"ec857182-f4b2-46cd-8b7f-fdbc443d8a1a","Type":"ContainerStarted","Data":"bd53904198644fe406dce4ba7d96027169199c6966beb39d8a18ad50565e1374"} Jan 20 18:48:49 crc kubenswrapper[4773]: I0120 18:48:49.944477 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7987f74bbc-qwzhh" event={"ID":"f667f5ef-cefc-40c0-a282-5d502cd45cd2","Type":"ContainerStarted","Data":"cec265096a69314b656c7eb565783999362cd58d7d21446746b6a8df723167e7"} Jan 20 18:48:49 crc kubenswrapper[4773]: I0120 18:48:49.944511 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7987f74bbc-qwzhh" event={"ID":"f667f5ef-cefc-40c0-a282-5d502cd45cd2","Type":"ContainerStarted","Data":"454cbc1be6d71f7e2439e8567821a33608afd38b017a46c05057d7d0155426fd"} Jan 20 18:48:49 crc kubenswrapper[4773]: I0120 18:48:49.978683 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-rz89h" podStartSLOduration=2.9786630499999998 podStartE2EDuration="2.97866305s" podCreationTimestamp="2026-01-20 18:48:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:48:49.960827744 +0000 UTC m=+1122.882640768" watchObservedRunningTime="2026-01-20 18:48:49.97866305 +0000 UTC m=+1122.900476074" Jan 20 18:48:50 crc kubenswrapper[4773]: I0120 18:48:50.056876 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6546db6db7-c4cvs"] Jan 20 18:48:50 crc kubenswrapper[4773]: I0120 18:48:50.063141 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6546db6db7-c4cvs"] Jan 20 18:48:50 crc kubenswrapper[4773]: I0120 18:48:50.079304 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-864c6579d5-v5vdm"] Jan 20 18:48:50 crc kubenswrapper[4773]: I0120 18:48:50.957877 4773 generic.go:334] "Generic (PLEG): container finished" podID="f667f5ef-cefc-40c0-a282-5d502cd45cd2" containerID="cec265096a69314b656c7eb565783999362cd58d7d21446746b6a8df723167e7" exitCode=0 Jan 20 18:48:50 crc kubenswrapper[4773]: I0120 18:48:50.957955 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7987f74bbc-qwzhh" event={"ID":"f667f5ef-cefc-40c0-a282-5d502cd45cd2","Type":"ContainerDied","Data":"cec265096a69314b656c7eb565783999362cd58d7d21446746b6a8df723167e7"} Jan 20 18:48:50 crc kubenswrapper[4773]: I0120 18:48:50.960342 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-864c6579d5-v5vdm" event={"ID":"609b419f-cc52-4fef-aa49-f64cdbba6755","Type":"ContainerStarted","Data":"6a01b8d18b1bc750f52b0b014094ed52c18500c651ec5c625261c7be00925ccf"} Jan 20 18:48:51 crc kubenswrapper[4773]: I0120 18:48:51.458099 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b34ae367-2e63-4e91-8c3f-ed0a2a827607" path="/var/lib/kubelet/pods/b34ae367-2e63-4e91-8c3f-ed0a2a827607/volumes" Jan 20 18:48:56 crc kubenswrapper[4773]: I0120 18:48:56.139876 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7bbff4dff5-7w2k2"] Jan 20 
18:48:56 crc kubenswrapper[4773]: I0120 18:48:56.166040 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-9b66d8476-cqhrd"] Jan 20 18:48:56 crc kubenswrapper[4773]: E0120 18:48:56.166452 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b34ae367-2e63-4e91-8c3f-ed0a2a827607" containerName="init" Jan 20 18:48:56 crc kubenswrapper[4773]: I0120 18:48:56.166478 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="b34ae367-2e63-4e91-8c3f-ed0a2a827607" containerName="init" Jan 20 18:48:56 crc kubenswrapper[4773]: I0120 18:48:56.166691 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="b34ae367-2e63-4e91-8c3f-ed0a2a827607" containerName="init" Jan 20 18:48:56 crc kubenswrapper[4773]: I0120 18:48:56.167734 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-9b66d8476-cqhrd" Jan 20 18:48:56 crc kubenswrapper[4773]: I0120 18:48:56.174034 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Jan 20 18:48:56 crc kubenswrapper[4773]: I0120 18:48:56.195603 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-9b66d8476-cqhrd"] Jan 20 18:48:56 crc kubenswrapper[4773]: I0120 18:48:56.233569 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/49df8cea-026f-497b-baae-a6a09452aa3d-horizon-secret-key\") pod \"horizon-9b66d8476-cqhrd\" (UID: \"49df8cea-026f-497b-baae-a6a09452aa3d\") " pod="openstack/horizon-9b66d8476-cqhrd" Jan 20 18:48:56 crc kubenswrapper[4773]: I0120 18:48:56.233624 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/49df8cea-026f-497b-baae-a6a09452aa3d-horizon-tls-certs\") pod \"horizon-9b66d8476-cqhrd\" (UID: \"49df8cea-026f-497b-baae-a6a09452aa3d\") " 
pod="openstack/horizon-9b66d8476-cqhrd" Jan 20 18:48:56 crc kubenswrapper[4773]: I0120 18:48:56.233663 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49df8cea-026f-497b-baae-a6a09452aa3d-logs\") pod \"horizon-9b66d8476-cqhrd\" (UID: \"49df8cea-026f-497b-baae-a6a09452aa3d\") " pod="openstack/horizon-9b66d8476-cqhrd" Jan 20 18:48:56 crc kubenswrapper[4773]: I0120 18:48:56.233695 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/49df8cea-026f-497b-baae-a6a09452aa3d-scripts\") pod \"horizon-9b66d8476-cqhrd\" (UID: \"49df8cea-026f-497b-baae-a6a09452aa3d\") " pod="openstack/horizon-9b66d8476-cqhrd" Jan 20 18:48:56 crc kubenswrapper[4773]: I0120 18:48:56.233719 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49df8cea-026f-497b-baae-a6a09452aa3d-combined-ca-bundle\") pod \"horizon-9b66d8476-cqhrd\" (UID: \"49df8cea-026f-497b-baae-a6a09452aa3d\") " pod="openstack/horizon-9b66d8476-cqhrd" Jan 20 18:48:56 crc kubenswrapper[4773]: I0120 18:48:56.233741 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/49df8cea-026f-497b-baae-a6a09452aa3d-config-data\") pod \"horizon-9b66d8476-cqhrd\" (UID: \"49df8cea-026f-497b-baae-a6a09452aa3d\") " pod="openstack/horizon-9b66d8476-cqhrd" Jan 20 18:48:56 crc kubenswrapper[4773]: I0120 18:48:56.234052 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhdcl\" (UniqueName: \"kubernetes.io/projected/49df8cea-026f-497b-baae-a6a09452aa3d-kube-api-access-mhdcl\") pod \"horizon-9b66d8476-cqhrd\" (UID: \"49df8cea-026f-497b-baae-a6a09452aa3d\") " 
pod="openstack/horizon-9b66d8476-cqhrd" Jan 20 18:48:56 crc kubenswrapper[4773]: I0120 18:48:56.278565 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-864c6579d5-v5vdm"] Jan 20 18:48:56 crc kubenswrapper[4773]: I0120 18:48:56.312317 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-68fb89f56b-287lx"] Jan 20 18:48:56 crc kubenswrapper[4773]: I0120 18:48:56.313564 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-68fb89f56b-287lx" Jan 20 18:48:56 crc kubenswrapper[4773]: I0120 18:48:56.327087 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-68fb89f56b-287lx"] Jan 20 18:48:56 crc kubenswrapper[4773]: I0120 18:48:56.335352 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhdcl\" (UniqueName: \"kubernetes.io/projected/49df8cea-026f-497b-baae-a6a09452aa3d-kube-api-access-mhdcl\") pod \"horizon-9b66d8476-cqhrd\" (UID: \"49df8cea-026f-497b-baae-a6a09452aa3d\") " pod="openstack/horizon-9b66d8476-cqhrd" Jan 20 18:48:56 crc kubenswrapper[4773]: I0120 18:48:56.335468 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/49df8cea-026f-497b-baae-a6a09452aa3d-horizon-secret-key\") pod \"horizon-9b66d8476-cqhrd\" (UID: \"49df8cea-026f-497b-baae-a6a09452aa3d\") " pod="openstack/horizon-9b66d8476-cqhrd" Jan 20 18:48:56 crc kubenswrapper[4773]: I0120 18:48:56.335498 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/49df8cea-026f-497b-baae-a6a09452aa3d-horizon-tls-certs\") pod \"horizon-9b66d8476-cqhrd\" (UID: \"49df8cea-026f-497b-baae-a6a09452aa3d\") " pod="openstack/horizon-9b66d8476-cqhrd" Jan 20 18:48:56 crc kubenswrapper[4773]: I0120 18:48:56.335526 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/49df8cea-026f-497b-baae-a6a09452aa3d-logs\") pod \"horizon-9b66d8476-cqhrd\" (UID: \"49df8cea-026f-497b-baae-a6a09452aa3d\") " pod="openstack/horizon-9b66d8476-cqhrd" Jan 20 18:48:56 crc kubenswrapper[4773]: I0120 18:48:56.335555 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/49df8cea-026f-497b-baae-a6a09452aa3d-scripts\") pod \"horizon-9b66d8476-cqhrd\" (UID: \"49df8cea-026f-497b-baae-a6a09452aa3d\") " pod="openstack/horizon-9b66d8476-cqhrd" Jan 20 18:48:56 crc kubenswrapper[4773]: I0120 18:48:56.335578 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49df8cea-026f-497b-baae-a6a09452aa3d-combined-ca-bundle\") pod \"horizon-9b66d8476-cqhrd\" (UID: \"49df8cea-026f-497b-baae-a6a09452aa3d\") " pod="openstack/horizon-9b66d8476-cqhrd" Jan 20 18:48:56 crc kubenswrapper[4773]: I0120 18:48:56.335602 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/49df8cea-026f-497b-baae-a6a09452aa3d-config-data\") pod \"horizon-9b66d8476-cqhrd\" (UID: \"49df8cea-026f-497b-baae-a6a09452aa3d\") " pod="openstack/horizon-9b66d8476-cqhrd" Jan 20 18:48:56 crc kubenswrapper[4773]: I0120 18:48:56.336754 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/49df8cea-026f-497b-baae-a6a09452aa3d-scripts\") pod \"horizon-9b66d8476-cqhrd\" (UID: \"49df8cea-026f-497b-baae-a6a09452aa3d\") " pod="openstack/horizon-9b66d8476-cqhrd" Jan 20 18:48:56 crc kubenswrapper[4773]: I0120 18:48:56.337149 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49df8cea-026f-497b-baae-a6a09452aa3d-logs\") pod \"horizon-9b66d8476-cqhrd\" (UID: 
\"49df8cea-026f-497b-baae-a6a09452aa3d\") " pod="openstack/horizon-9b66d8476-cqhrd" Jan 20 18:48:56 crc kubenswrapper[4773]: I0120 18:48:56.337163 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/49df8cea-026f-497b-baae-a6a09452aa3d-config-data\") pod \"horizon-9b66d8476-cqhrd\" (UID: \"49df8cea-026f-497b-baae-a6a09452aa3d\") " pod="openstack/horizon-9b66d8476-cqhrd" Jan 20 18:48:56 crc kubenswrapper[4773]: I0120 18:48:56.344126 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/49df8cea-026f-497b-baae-a6a09452aa3d-horizon-secret-key\") pod \"horizon-9b66d8476-cqhrd\" (UID: \"49df8cea-026f-497b-baae-a6a09452aa3d\") " pod="openstack/horizon-9b66d8476-cqhrd" Jan 20 18:48:56 crc kubenswrapper[4773]: I0120 18:48:56.344407 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/49df8cea-026f-497b-baae-a6a09452aa3d-horizon-tls-certs\") pod \"horizon-9b66d8476-cqhrd\" (UID: \"49df8cea-026f-497b-baae-a6a09452aa3d\") " pod="openstack/horizon-9b66d8476-cqhrd" Jan 20 18:48:56 crc kubenswrapper[4773]: I0120 18:48:56.356448 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49df8cea-026f-497b-baae-a6a09452aa3d-combined-ca-bundle\") pod \"horizon-9b66d8476-cqhrd\" (UID: \"49df8cea-026f-497b-baae-a6a09452aa3d\") " pod="openstack/horizon-9b66d8476-cqhrd" Jan 20 18:48:56 crc kubenswrapper[4773]: I0120 18:48:56.358171 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhdcl\" (UniqueName: \"kubernetes.io/projected/49df8cea-026f-497b-baae-a6a09452aa3d-kube-api-access-mhdcl\") pod \"horizon-9b66d8476-cqhrd\" (UID: \"49df8cea-026f-497b-baae-a6a09452aa3d\") " pod="openstack/horizon-9b66d8476-cqhrd" Jan 20 18:48:56 crc 
kubenswrapper[4773]: I0120 18:48:56.437437 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cd9ba14c-8dca-4170-841c-6f5d5fa2b220-logs\") pod \"horizon-68fb89f56b-287lx\" (UID: \"cd9ba14c-8dca-4170-841c-6f5d5fa2b220\") " pod="openstack/horizon-68fb89f56b-287lx" Jan 20 18:48:56 crc kubenswrapper[4773]: I0120 18:48:56.437515 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd9ba14c-8dca-4170-841c-6f5d5fa2b220-horizon-tls-certs\") pod \"horizon-68fb89f56b-287lx\" (UID: \"cd9ba14c-8dca-4170-841c-6f5d5fa2b220\") " pod="openstack/horizon-68fb89f56b-287lx" Jan 20 18:48:56 crc kubenswrapper[4773]: I0120 18:48:56.437543 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mpm9k\" (UniqueName: \"kubernetes.io/projected/cd9ba14c-8dca-4170-841c-6f5d5fa2b220-kube-api-access-mpm9k\") pod \"horizon-68fb89f56b-287lx\" (UID: \"cd9ba14c-8dca-4170-841c-6f5d5fa2b220\") " pod="openstack/horizon-68fb89f56b-287lx" Jan 20 18:48:56 crc kubenswrapper[4773]: I0120 18:48:56.437833 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cd9ba14c-8dca-4170-841c-6f5d5fa2b220-config-data\") pod \"horizon-68fb89f56b-287lx\" (UID: \"cd9ba14c-8dca-4170-841c-6f5d5fa2b220\") " pod="openstack/horizon-68fb89f56b-287lx" Jan 20 18:48:56 crc kubenswrapper[4773]: I0120 18:48:56.437879 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cd9ba14c-8dca-4170-841c-6f5d5fa2b220-scripts\") pod \"horizon-68fb89f56b-287lx\" (UID: \"cd9ba14c-8dca-4170-841c-6f5d5fa2b220\") " pod="openstack/horizon-68fb89f56b-287lx" Jan 20 18:48:56 crc kubenswrapper[4773]: 
I0120 18:48:56.437897 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd9ba14c-8dca-4170-841c-6f5d5fa2b220-combined-ca-bundle\") pod \"horizon-68fb89f56b-287lx\" (UID: \"cd9ba14c-8dca-4170-841c-6f5d5fa2b220\") " pod="openstack/horizon-68fb89f56b-287lx" Jan 20 18:48:56 crc kubenswrapper[4773]: I0120 18:48:56.437919 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/cd9ba14c-8dca-4170-841c-6f5d5fa2b220-horizon-secret-key\") pod \"horizon-68fb89f56b-287lx\" (UID: \"cd9ba14c-8dca-4170-841c-6f5d5fa2b220\") " pod="openstack/horizon-68fb89f56b-287lx" Jan 20 18:48:56 crc kubenswrapper[4773]: I0120 18:48:56.539909 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cd9ba14c-8dca-4170-841c-6f5d5fa2b220-logs\") pod \"horizon-68fb89f56b-287lx\" (UID: \"cd9ba14c-8dca-4170-841c-6f5d5fa2b220\") " pod="openstack/horizon-68fb89f56b-287lx" Jan 20 18:48:56 crc kubenswrapper[4773]: I0120 18:48:56.540008 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd9ba14c-8dca-4170-841c-6f5d5fa2b220-horizon-tls-certs\") pod \"horizon-68fb89f56b-287lx\" (UID: \"cd9ba14c-8dca-4170-841c-6f5d5fa2b220\") " pod="openstack/horizon-68fb89f56b-287lx" Jan 20 18:48:56 crc kubenswrapper[4773]: I0120 18:48:56.540034 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mpm9k\" (UniqueName: \"kubernetes.io/projected/cd9ba14c-8dca-4170-841c-6f5d5fa2b220-kube-api-access-mpm9k\") pod \"horizon-68fb89f56b-287lx\" (UID: \"cd9ba14c-8dca-4170-841c-6f5d5fa2b220\") " pod="openstack/horizon-68fb89f56b-287lx" Jan 20 18:48:56 crc kubenswrapper[4773]: I0120 18:48:56.540106 4773 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cd9ba14c-8dca-4170-841c-6f5d5fa2b220-config-data\") pod \"horizon-68fb89f56b-287lx\" (UID: \"cd9ba14c-8dca-4170-841c-6f5d5fa2b220\") " pod="openstack/horizon-68fb89f56b-287lx" Jan 20 18:48:56 crc kubenswrapper[4773]: I0120 18:48:56.540154 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cd9ba14c-8dca-4170-841c-6f5d5fa2b220-scripts\") pod \"horizon-68fb89f56b-287lx\" (UID: \"cd9ba14c-8dca-4170-841c-6f5d5fa2b220\") " pod="openstack/horizon-68fb89f56b-287lx" Jan 20 18:48:56 crc kubenswrapper[4773]: I0120 18:48:56.540170 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd9ba14c-8dca-4170-841c-6f5d5fa2b220-combined-ca-bundle\") pod \"horizon-68fb89f56b-287lx\" (UID: \"cd9ba14c-8dca-4170-841c-6f5d5fa2b220\") " pod="openstack/horizon-68fb89f56b-287lx" Jan 20 18:48:56 crc kubenswrapper[4773]: I0120 18:48:56.540192 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/cd9ba14c-8dca-4170-841c-6f5d5fa2b220-horizon-secret-key\") pod \"horizon-68fb89f56b-287lx\" (UID: \"cd9ba14c-8dca-4170-841c-6f5d5fa2b220\") " pod="openstack/horizon-68fb89f56b-287lx" Jan 20 18:48:56 crc kubenswrapper[4773]: I0120 18:48:56.540470 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cd9ba14c-8dca-4170-841c-6f5d5fa2b220-logs\") pod \"horizon-68fb89f56b-287lx\" (UID: \"cd9ba14c-8dca-4170-841c-6f5d5fa2b220\") " pod="openstack/horizon-68fb89f56b-287lx" Jan 20 18:48:56 crc kubenswrapper[4773]: I0120 18:48:56.541699 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/cd9ba14c-8dca-4170-841c-6f5d5fa2b220-scripts\") pod \"horizon-68fb89f56b-287lx\" (UID: \"cd9ba14c-8dca-4170-841c-6f5d5fa2b220\") " pod="openstack/horizon-68fb89f56b-287lx" Jan 20 18:48:56 crc kubenswrapper[4773]: I0120 18:48:56.542467 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/cd9ba14c-8dca-4170-841c-6f5d5fa2b220-config-data\") pod \"horizon-68fb89f56b-287lx\" (UID: \"cd9ba14c-8dca-4170-841c-6f5d5fa2b220\") " pod="openstack/horizon-68fb89f56b-287lx" Jan 20 18:48:56 crc kubenswrapper[4773]: I0120 18:48:56.547822 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/cd9ba14c-8dca-4170-841c-6f5d5fa2b220-horizon-secret-key\") pod \"horizon-68fb89f56b-287lx\" (UID: \"cd9ba14c-8dca-4170-841c-6f5d5fa2b220\") " pod="openstack/horizon-68fb89f56b-287lx" Jan 20 18:48:56 crc kubenswrapper[4773]: I0120 18:48:56.548642 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/cd9ba14c-8dca-4170-841c-6f5d5fa2b220-horizon-tls-certs\") pod \"horizon-68fb89f56b-287lx\" (UID: \"cd9ba14c-8dca-4170-841c-6f5d5fa2b220\") " pod="openstack/horizon-68fb89f56b-287lx" Jan 20 18:48:56 crc kubenswrapper[4773]: I0120 18:48:56.554015 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cd9ba14c-8dca-4170-841c-6f5d5fa2b220-combined-ca-bundle\") pod \"horizon-68fb89f56b-287lx\" (UID: \"cd9ba14c-8dca-4170-841c-6f5d5fa2b220\") " pod="openstack/horizon-68fb89f56b-287lx" Jan 20 18:48:56 crc kubenswrapper[4773]: I0120 18:48:56.557282 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpm9k\" (UniqueName: \"kubernetes.io/projected/cd9ba14c-8dca-4170-841c-6f5d5fa2b220-kube-api-access-mpm9k\") pod \"horizon-68fb89f56b-287lx\" (UID: 
\"cd9ba14c-8dca-4170-841c-6f5d5fa2b220\") " pod="openstack/horizon-68fb89f56b-287lx" Jan 20 18:48:56 crc kubenswrapper[4773]: I0120 18:48:56.565084 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-9b66d8476-cqhrd" Jan 20 18:48:56 crc kubenswrapper[4773]: I0120 18:48:56.630993 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-68fb89f56b-287lx" Jan 20 18:48:57 crc kubenswrapper[4773]: I0120 18:48:57.007991 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7987f74bbc-qwzhh" event={"ID":"f667f5ef-cefc-40c0-a282-5d502cd45cd2","Type":"ContainerStarted","Data":"5cc155e36c13c5b618c2477be4ab590ab510095287b6b608b69635e7105f701d"} Jan 20 18:48:57 crc kubenswrapper[4773]: I0120 18:48:57.009008 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7987f74bbc-qwzhh" Jan 20 18:48:57 crc kubenswrapper[4773]: I0120 18:48:57.029761 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7987f74bbc-qwzhh" podStartSLOduration=10.029745607 podStartE2EDuration="10.029745607s" podCreationTimestamp="2026-01-20 18:48:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:48:57.029152863 +0000 UTC m=+1129.950965907" watchObservedRunningTime="2026-01-20 18:48:57.029745607 +0000 UTC m=+1129.951558631" Jan 20 18:48:58 crc kubenswrapper[4773]: I0120 18:48:58.170167 4773 patch_prober.go:28] interesting pod/machine-config-daemon-sq4x7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 18:48:58 crc kubenswrapper[4773]: I0120 18:48:58.170441 4773 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 18:49:00 crc kubenswrapper[4773]: I0120 18:49:00.038023 4773 generic.go:334] "Generic (PLEG): container finished" podID="03043146-8a8f-465e-b8c2-ca01d39cc070" containerID="a3485916bab8c305597e5171f6d49c43821b0b823ee634ce2d2a67cfaa6f6ad9" exitCode=0 Jan 20 18:49:00 crc kubenswrapper[4773]: I0120 18:49:00.038104 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-kkb9f" event={"ID":"03043146-8a8f-465e-b8c2-ca01d39cc070","Type":"ContainerDied","Data":"a3485916bab8c305597e5171f6d49c43821b0b823ee634ce2d2a67cfaa6f6ad9"} Jan 20 18:49:03 crc kubenswrapper[4773]: I0120 18:49:03.053143 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7987f74bbc-qwzhh" Jan 20 18:49:03 crc kubenswrapper[4773]: I0120 18:49:03.100841 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54f9b7b8d9-cgbwn"] Jan 20 18:49:03 crc kubenswrapper[4773]: I0120 18:49:03.101079 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-54f9b7b8d9-cgbwn" podUID="d326b299-f619-4c76-9a10-045d77fa9bae" containerName="dnsmasq-dns" containerID="cri-o://1fbc00a0e22d838407b29245dcbad537d566c09051250909671c6675451ecf6e" gracePeriod=10 Jan 20 18:49:03 crc kubenswrapper[4773]: E0120 18:49:03.814472 4773 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Jan 20 18:49:03 crc kubenswrapper[4773]: E0120 18:49:03.815077 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5cch65bh5f8h586h9h597h57h88h586hdfh5d7h645h688h89h85h5ffh685h675h5ffh554h5c8hd4h585hbh578h5b5h54ch679h5b5h64ch5dh5c7q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xzp99,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-7bbff4dff5-7w2k2_openstack(2c3949f0-4faa-4935-8d0f-7ce69d8de08d): ErrImagePull: 
rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 20 18:49:03 crc kubenswrapper[4773]: E0120 18:49:03.817520 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-7bbff4dff5-7w2k2" podUID="2c3949f0-4faa-4935-8d0f-7ce69d8de08d" Jan 20 18:49:04 crc kubenswrapper[4773]: I0120 18:49:04.068755 4773 generic.go:334] "Generic (PLEG): container finished" podID="d326b299-f619-4c76-9a10-045d77fa9bae" containerID="1fbc00a0e22d838407b29245dcbad537d566c09051250909671c6675451ecf6e" exitCode=0 Jan 20 18:49:04 crc kubenswrapper[4773]: I0120 18:49:04.068806 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54f9b7b8d9-cgbwn" event={"ID":"d326b299-f619-4c76-9a10-045d77fa9bae","Type":"ContainerDied","Data":"1fbc00a0e22d838407b29245dcbad537d566c09051250909671c6675451ecf6e"} Jan 20 18:49:06 crc kubenswrapper[4773]: I0120 18:49:06.347649 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-54f9b7b8d9-cgbwn" podUID="d326b299-f619-4c76-9a10-045d77fa9bae" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.131:5353: connect: connection refused" Jan 20 18:49:11 crc kubenswrapper[4773]: I0120 18:49:11.347670 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-54f9b7b8d9-cgbwn" podUID="d326b299-f619-4c76-9a10-045d77fa9bae" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.131:5353: connect: connection refused" Jan 20 18:49:11 crc kubenswrapper[4773]: E0120 18:49:11.865191 4773 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = 
copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Jan 20 18:49:11 crc kubenswrapper[4773]: E0120 18:49:11.865373 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n5fbh657h594h659h578h95h9bh69hcfh589h689h545h69hdfhc9h65ch56h569hbdh5b8h5c6h694h559hf6h76h577h5d7h77h694h98h8fh566q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tpvqk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},Terminatio
nMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-864c6579d5-v5vdm_openstack(609b419f-cc52-4fef-aa49-f64cdbba6755): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 20 18:49:11 crc kubenswrapper[4773]: E0120 18:49:11.868025 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-864c6579d5-v5vdm" podUID="609b419f-cc52-4fef-aa49-f64cdbba6755" Jan 20 18:49:14 crc kubenswrapper[4773]: I0120 18:49:14.718791 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-kkb9f" Jan 20 18:49:14 crc kubenswrapper[4773]: I0120 18:49:14.723833 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-7bbff4dff5-7w2k2" Jan 20 18:49:14 crc kubenswrapper[4773]: I0120 18:49:14.872910 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2c3949f0-4faa-4935-8d0f-7ce69d8de08d-config-data\") pod \"2c3949f0-4faa-4935-8d0f-7ce69d8de08d\" (UID: \"2c3949f0-4faa-4935-8d0f-7ce69d8de08d\") " Jan 20 18:49:14 crc kubenswrapper[4773]: I0120 18:49:14.873015 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2c3949f0-4faa-4935-8d0f-7ce69d8de08d-logs\") pod \"2c3949f0-4faa-4935-8d0f-7ce69d8de08d\" (UID: \"2c3949f0-4faa-4935-8d0f-7ce69d8de08d\") " Jan 20 18:49:14 crc kubenswrapper[4773]: I0120 18:49:14.873039 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/03043146-8a8f-465e-b8c2-ca01d39cc070-fernet-keys\") pod \"03043146-8a8f-465e-b8c2-ca01d39cc070\" (UID: \"03043146-8a8f-465e-b8c2-ca01d39cc070\") " Jan 20 18:49:14 crc kubenswrapper[4773]: I0120 18:49:14.873085 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rhtrr\" (UniqueName: \"kubernetes.io/projected/03043146-8a8f-465e-b8c2-ca01d39cc070-kube-api-access-rhtrr\") pod \"03043146-8a8f-465e-b8c2-ca01d39cc070\" (UID: \"03043146-8a8f-465e-b8c2-ca01d39cc070\") " Jan 20 18:49:14 crc kubenswrapper[4773]: I0120 18:49:14.873106 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2c3949f0-4faa-4935-8d0f-7ce69d8de08d-scripts\") pod \"2c3949f0-4faa-4935-8d0f-7ce69d8de08d\" (UID: \"2c3949f0-4faa-4935-8d0f-7ce69d8de08d\") " Jan 20 18:49:14 crc kubenswrapper[4773]: I0120 18:49:14.873161 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: 
\"kubernetes.io/secret/03043146-8a8f-465e-b8c2-ca01d39cc070-credential-keys\") pod \"03043146-8a8f-465e-b8c2-ca01d39cc070\" (UID: \"03043146-8a8f-465e-b8c2-ca01d39cc070\") " Jan 20 18:49:14 crc kubenswrapper[4773]: I0120 18:49:14.873196 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03043146-8a8f-465e-b8c2-ca01d39cc070-scripts\") pod \"03043146-8a8f-465e-b8c2-ca01d39cc070\" (UID: \"03043146-8a8f-465e-b8c2-ca01d39cc070\") " Jan 20 18:49:14 crc kubenswrapper[4773]: I0120 18:49:14.873214 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2c3949f0-4faa-4935-8d0f-7ce69d8de08d-horizon-secret-key\") pod \"2c3949f0-4faa-4935-8d0f-7ce69d8de08d\" (UID: \"2c3949f0-4faa-4935-8d0f-7ce69d8de08d\") " Jan 20 18:49:14 crc kubenswrapper[4773]: I0120 18:49:14.873245 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xzp99\" (UniqueName: \"kubernetes.io/projected/2c3949f0-4faa-4935-8d0f-7ce69d8de08d-kube-api-access-xzp99\") pod \"2c3949f0-4faa-4935-8d0f-7ce69d8de08d\" (UID: \"2c3949f0-4faa-4935-8d0f-7ce69d8de08d\") " Jan 20 18:49:14 crc kubenswrapper[4773]: I0120 18:49:14.873265 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03043146-8a8f-465e-b8c2-ca01d39cc070-config-data\") pod \"03043146-8a8f-465e-b8c2-ca01d39cc070\" (UID: \"03043146-8a8f-465e-b8c2-ca01d39cc070\") " Jan 20 18:49:14 crc kubenswrapper[4773]: I0120 18:49:14.873298 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03043146-8a8f-465e-b8c2-ca01d39cc070-combined-ca-bundle\") pod \"03043146-8a8f-465e-b8c2-ca01d39cc070\" (UID: \"03043146-8a8f-465e-b8c2-ca01d39cc070\") " Jan 20 18:49:14 crc kubenswrapper[4773]: I0120 
18:49:14.883606 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c3949f0-4faa-4935-8d0f-7ce69d8de08d-scripts" (OuterVolumeSpecName: "scripts") pod "2c3949f0-4faa-4935-8d0f-7ce69d8de08d" (UID: "2c3949f0-4faa-4935-8d0f-7ce69d8de08d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:49:14 crc kubenswrapper[4773]: I0120 18:49:14.888008 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03043146-8a8f-465e-b8c2-ca01d39cc070-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "03043146-8a8f-465e-b8c2-ca01d39cc070" (UID: "03043146-8a8f-465e-b8c2-ca01d39cc070"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:49:14 crc kubenswrapper[4773]: I0120 18:49:14.888117 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03043146-8a8f-465e-b8c2-ca01d39cc070-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "03043146-8a8f-465e-b8c2-ca01d39cc070" (UID: "03043146-8a8f-465e-b8c2-ca01d39cc070"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:49:14 crc kubenswrapper[4773]: I0120 18:49:14.888365 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c3949f0-4faa-4935-8d0f-7ce69d8de08d-logs" (OuterVolumeSpecName: "logs") pod "2c3949f0-4faa-4935-8d0f-7ce69d8de08d" (UID: "2c3949f0-4faa-4935-8d0f-7ce69d8de08d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:49:14 crc kubenswrapper[4773]: I0120 18:49:14.888568 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c3949f0-4faa-4935-8d0f-7ce69d8de08d-config-data" (OuterVolumeSpecName: "config-data") pod "2c3949f0-4faa-4935-8d0f-7ce69d8de08d" (UID: "2c3949f0-4faa-4935-8d0f-7ce69d8de08d"). 
InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:49:14 crc kubenswrapper[4773]: I0120 18:49:14.890697 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03043146-8a8f-465e-b8c2-ca01d39cc070-scripts" (OuterVolumeSpecName: "scripts") pod "03043146-8a8f-465e-b8c2-ca01d39cc070" (UID: "03043146-8a8f-465e-b8c2-ca01d39cc070"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:49:14 crc kubenswrapper[4773]: I0120 18:49:14.890730 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03043146-8a8f-465e-b8c2-ca01d39cc070-kube-api-access-rhtrr" (OuterVolumeSpecName: "kube-api-access-rhtrr") pod "03043146-8a8f-465e-b8c2-ca01d39cc070" (UID: "03043146-8a8f-465e-b8c2-ca01d39cc070"). InnerVolumeSpecName "kube-api-access-rhtrr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:49:14 crc kubenswrapper[4773]: I0120 18:49:14.891410 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c3949f0-4faa-4935-8d0f-7ce69d8de08d-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "2c3949f0-4faa-4935-8d0f-7ce69d8de08d" (UID: "2c3949f0-4faa-4935-8d0f-7ce69d8de08d"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:49:14 crc kubenswrapper[4773]: I0120 18:49:14.891762 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c3949f0-4faa-4935-8d0f-7ce69d8de08d-kube-api-access-xzp99" (OuterVolumeSpecName: "kube-api-access-xzp99") pod "2c3949f0-4faa-4935-8d0f-7ce69d8de08d" (UID: "2c3949f0-4faa-4935-8d0f-7ce69d8de08d"). InnerVolumeSpecName "kube-api-access-xzp99". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:49:14 crc kubenswrapper[4773]: I0120 18:49:14.910252 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03043146-8a8f-465e-b8c2-ca01d39cc070-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "03043146-8a8f-465e-b8c2-ca01d39cc070" (UID: "03043146-8a8f-465e-b8c2-ca01d39cc070"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:49:14 crc kubenswrapper[4773]: I0120 18:49:14.929706 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03043146-8a8f-465e-b8c2-ca01d39cc070-config-data" (OuterVolumeSpecName: "config-data") pod "03043146-8a8f-465e-b8c2-ca01d39cc070" (UID: "03043146-8a8f-465e-b8c2-ca01d39cc070"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:49:14 crc kubenswrapper[4773]: I0120 18:49:14.975188 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2c3949f0-4faa-4935-8d0f-7ce69d8de08d-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:14 crc kubenswrapper[4773]: I0120 18:49:14.975233 4773 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2c3949f0-4faa-4935-8d0f-7ce69d8de08d-logs\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:14 crc kubenswrapper[4773]: I0120 18:49:14.975246 4773 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/03043146-8a8f-465e-b8c2-ca01d39cc070-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:14 crc kubenswrapper[4773]: I0120 18:49:14.975259 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rhtrr\" (UniqueName: \"kubernetes.io/projected/03043146-8a8f-465e-b8c2-ca01d39cc070-kube-api-access-rhtrr\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:14 
crc kubenswrapper[4773]: I0120 18:49:14.975274 4773 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2c3949f0-4faa-4935-8d0f-7ce69d8de08d-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:14 crc kubenswrapper[4773]: I0120 18:49:14.975284 4773 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/03043146-8a8f-465e-b8c2-ca01d39cc070-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:14 crc kubenswrapper[4773]: I0120 18:49:14.975294 4773 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03043146-8a8f-465e-b8c2-ca01d39cc070-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:14 crc kubenswrapper[4773]: I0120 18:49:14.975302 4773 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2c3949f0-4faa-4935-8d0f-7ce69d8de08d-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:14 crc kubenswrapper[4773]: I0120 18:49:14.975309 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xzp99\" (UniqueName: \"kubernetes.io/projected/2c3949f0-4faa-4935-8d0f-7ce69d8de08d-kube-api-access-xzp99\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:14 crc kubenswrapper[4773]: I0120 18:49:14.975317 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03043146-8a8f-465e-b8c2-ca01d39cc070-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:14 crc kubenswrapper[4773]: I0120 18:49:14.975324 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03043146-8a8f-465e-b8c2-ca01d39cc070-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:15 crc kubenswrapper[4773]: E0120 18:49:15.077170 4773 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = 
copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified" Jan 20 18:49:15 crc kubenswrapper[4773]: E0120 18:49:15.077320 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:ceilometer-central-agent,Image:quay.io/podified-antelope-centos9/openstack-ceilometer-central:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfh67h666h5c6hch579hcdhch84h65ch5d9h77h668h67fh66h7fh55hc9h56fh546h78h595h59fh569hfbh5ch65fh5cdh5d5h649h66bh555q,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:scripts,ReadOnly:true,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:ceilometer-central-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9mrxd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/python3 
/var/lib/openstack/bin/centralhealth.py],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:300,TimeoutSeconds:5,PeriodSeconds:5,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ceilometer-0_openstack(077faa57-a75d-4f1a-b01e-3fc69ddb5761): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 20 18:49:15 crc kubenswrapper[4773]: I0120 18:49:15.152234 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-kkb9f" event={"ID":"03043146-8a8f-465e-b8c2-ca01d39cc070","Type":"ContainerDied","Data":"a72d0b517337335c8270dbc86cc90184c82c27aa6edf416c70a51469398c8f1b"} Jan 20 18:49:15 crc kubenswrapper[4773]: I0120 18:49:15.152502 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a72d0b517337335c8270dbc86cc90184c82c27aa6edf416c70a51469398c8f1b" Jan 20 18:49:15 crc kubenswrapper[4773]: I0120 18:49:15.152288 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-kkb9f" Jan 20 18:49:15 crc kubenswrapper[4773]: I0120 18:49:15.153590 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-7bbff4dff5-7w2k2" event={"ID":"2c3949f0-4faa-4935-8d0f-7ce69d8de08d","Type":"ContainerDied","Data":"3096768a065586cfd8bc32ac4cab442fcebf027a669b7a4ad518113982f3c163"} Jan 20 18:49:15 crc kubenswrapper[4773]: I0120 18:49:15.153682 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-7bbff4dff5-7w2k2" Jan 20 18:49:15 crc kubenswrapper[4773]: I0120 18:49:15.232790 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-7bbff4dff5-7w2k2"] Jan 20 18:49:15 crc kubenswrapper[4773]: I0120 18:49:15.241448 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-7bbff4dff5-7w2k2"] Jan 20 18:49:15 crc kubenswrapper[4773]: I0120 18:49:15.458033 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c3949f0-4faa-4935-8d0f-7ce69d8de08d" path="/var/lib/kubelet/pods/2c3949f0-4faa-4935-8d0f-7ce69d8de08d/volumes" Jan 20 18:49:15 crc kubenswrapper[4773]: I0120 18:49:15.790952 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-kkb9f"] Jan 20 18:49:15 crc kubenswrapper[4773]: I0120 18:49:15.799885 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-kkb9f"] Jan 20 18:49:15 crc kubenswrapper[4773]: I0120 18:49:15.906842 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-mhdc2"] Jan 20 18:49:15 crc kubenswrapper[4773]: E0120 18:49:15.907173 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03043146-8a8f-465e-b8c2-ca01d39cc070" containerName="keystone-bootstrap" Jan 20 18:49:15 crc kubenswrapper[4773]: I0120 18:49:15.907185 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="03043146-8a8f-465e-b8c2-ca01d39cc070" 
containerName="keystone-bootstrap" Jan 20 18:49:15 crc kubenswrapper[4773]: I0120 18:49:15.907387 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="03043146-8a8f-465e-b8c2-ca01d39cc070" containerName="keystone-bootstrap" Jan 20 18:49:15 crc kubenswrapper[4773]: I0120 18:49:15.907870 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-mhdc2"] Jan 20 18:49:15 crc kubenswrapper[4773]: I0120 18:49:15.907956 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-mhdc2" Jan 20 18:49:15 crc kubenswrapper[4773]: I0120 18:49:15.929457 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-24qqg" Jan 20 18:49:15 crc kubenswrapper[4773]: I0120 18:49:15.930500 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Jan 20 18:49:15 crc kubenswrapper[4773]: I0120 18:49:15.930597 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Jan 20 18:49:15 crc kubenswrapper[4773]: I0120 18:49:15.930974 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Jan 20 18:49:15 crc kubenswrapper[4773]: I0120 18:49:15.931232 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Jan 20 18:49:15 crc kubenswrapper[4773]: I0120 18:49:15.993122 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17ca6753-a956-4078-8927-2f2a6c41cb80-combined-ca-bundle\") pod \"keystone-bootstrap-mhdc2\" (UID: \"17ca6753-a956-4078-8927-2f2a6c41cb80\") " pod="openstack/keystone-bootstrap-mhdc2" Jan 20 18:49:15 crc kubenswrapper[4773]: I0120 18:49:15.993210 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5kl6\" (UniqueName: 
\"kubernetes.io/projected/17ca6753-a956-4078-8927-2f2a6c41cb80-kube-api-access-j5kl6\") pod \"keystone-bootstrap-mhdc2\" (UID: \"17ca6753-a956-4078-8927-2f2a6c41cb80\") " pod="openstack/keystone-bootstrap-mhdc2" Jan 20 18:49:15 crc kubenswrapper[4773]: I0120 18:49:15.993251 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/17ca6753-a956-4078-8927-2f2a6c41cb80-credential-keys\") pod \"keystone-bootstrap-mhdc2\" (UID: \"17ca6753-a956-4078-8927-2f2a6c41cb80\") " pod="openstack/keystone-bootstrap-mhdc2" Jan 20 18:49:15 crc kubenswrapper[4773]: I0120 18:49:15.993273 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/17ca6753-a956-4078-8927-2f2a6c41cb80-fernet-keys\") pod \"keystone-bootstrap-mhdc2\" (UID: \"17ca6753-a956-4078-8927-2f2a6c41cb80\") " pod="openstack/keystone-bootstrap-mhdc2" Jan 20 18:49:15 crc kubenswrapper[4773]: I0120 18:49:15.993329 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17ca6753-a956-4078-8927-2f2a6c41cb80-config-data\") pod \"keystone-bootstrap-mhdc2\" (UID: \"17ca6753-a956-4078-8927-2f2a6c41cb80\") " pod="openstack/keystone-bootstrap-mhdc2" Jan 20 18:49:15 crc kubenswrapper[4773]: I0120 18:49:15.993349 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/17ca6753-a956-4078-8927-2f2a6c41cb80-scripts\") pod \"keystone-bootstrap-mhdc2\" (UID: \"17ca6753-a956-4078-8927-2f2a6c41cb80\") " pod="openstack/keystone-bootstrap-mhdc2" Jan 20 18:49:16 crc kubenswrapper[4773]: I0120 18:49:16.094519 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5kl6\" (UniqueName: 
\"kubernetes.io/projected/17ca6753-a956-4078-8927-2f2a6c41cb80-kube-api-access-j5kl6\") pod \"keystone-bootstrap-mhdc2\" (UID: \"17ca6753-a956-4078-8927-2f2a6c41cb80\") " pod="openstack/keystone-bootstrap-mhdc2" Jan 20 18:49:16 crc kubenswrapper[4773]: I0120 18:49:16.094587 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/17ca6753-a956-4078-8927-2f2a6c41cb80-credential-keys\") pod \"keystone-bootstrap-mhdc2\" (UID: \"17ca6753-a956-4078-8927-2f2a6c41cb80\") " pod="openstack/keystone-bootstrap-mhdc2" Jan 20 18:49:16 crc kubenswrapper[4773]: I0120 18:49:16.094619 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/17ca6753-a956-4078-8927-2f2a6c41cb80-fernet-keys\") pod \"keystone-bootstrap-mhdc2\" (UID: \"17ca6753-a956-4078-8927-2f2a6c41cb80\") " pod="openstack/keystone-bootstrap-mhdc2" Jan 20 18:49:16 crc kubenswrapper[4773]: I0120 18:49:16.094662 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17ca6753-a956-4078-8927-2f2a6c41cb80-config-data\") pod \"keystone-bootstrap-mhdc2\" (UID: \"17ca6753-a956-4078-8927-2f2a6c41cb80\") " pod="openstack/keystone-bootstrap-mhdc2" Jan 20 18:49:16 crc kubenswrapper[4773]: I0120 18:49:16.094687 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/17ca6753-a956-4078-8927-2f2a6c41cb80-scripts\") pod \"keystone-bootstrap-mhdc2\" (UID: \"17ca6753-a956-4078-8927-2f2a6c41cb80\") " pod="openstack/keystone-bootstrap-mhdc2" Jan 20 18:49:16 crc kubenswrapper[4773]: I0120 18:49:16.094752 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17ca6753-a956-4078-8927-2f2a6c41cb80-combined-ca-bundle\") pod \"keystone-bootstrap-mhdc2\" 
(UID: \"17ca6753-a956-4078-8927-2f2a6c41cb80\") " pod="openstack/keystone-bootstrap-mhdc2" Jan 20 18:49:16 crc kubenswrapper[4773]: I0120 18:49:16.099407 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/17ca6753-a956-4078-8927-2f2a6c41cb80-scripts\") pod \"keystone-bootstrap-mhdc2\" (UID: \"17ca6753-a956-4078-8927-2f2a6c41cb80\") " pod="openstack/keystone-bootstrap-mhdc2" Jan 20 18:49:16 crc kubenswrapper[4773]: I0120 18:49:16.099600 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/17ca6753-a956-4078-8927-2f2a6c41cb80-credential-keys\") pod \"keystone-bootstrap-mhdc2\" (UID: \"17ca6753-a956-4078-8927-2f2a6c41cb80\") " pod="openstack/keystone-bootstrap-mhdc2" Jan 20 18:49:16 crc kubenswrapper[4773]: I0120 18:49:16.099827 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/17ca6753-a956-4078-8927-2f2a6c41cb80-fernet-keys\") pod \"keystone-bootstrap-mhdc2\" (UID: \"17ca6753-a956-4078-8927-2f2a6c41cb80\") " pod="openstack/keystone-bootstrap-mhdc2" Jan 20 18:49:16 crc kubenswrapper[4773]: I0120 18:49:16.099846 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17ca6753-a956-4078-8927-2f2a6c41cb80-combined-ca-bundle\") pod \"keystone-bootstrap-mhdc2\" (UID: \"17ca6753-a956-4078-8927-2f2a6c41cb80\") " pod="openstack/keystone-bootstrap-mhdc2" Jan 20 18:49:16 crc kubenswrapper[4773]: I0120 18:49:16.110579 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17ca6753-a956-4078-8927-2f2a6c41cb80-config-data\") pod \"keystone-bootstrap-mhdc2\" (UID: \"17ca6753-a956-4078-8927-2f2a6c41cb80\") " pod="openstack/keystone-bootstrap-mhdc2" Jan 20 18:49:16 crc kubenswrapper[4773]: I0120 18:49:16.113440 4773 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5kl6\" (UniqueName: \"kubernetes.io/projected/17ca6753-a956-4078-8927-2f2a6c41cb80-kube-api-access-j5kl6\") pod \"keystone-bootstrap-mhdc2\" (UID: \"17ca6753-a956-4078-8927-2f2a6c41cb80\") " pod="openstack/keystone-bootstrap-mhdc2" Jan 20 18:49:16 crc kubenswrapper[4773]: E0120 18:49:16.244242 4773 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Jan 20 18:49:16 crc kubenswrapper[4773]: E0120 18:49:16.244696 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && /usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,Mou
ntPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tdgkq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-22fkv_openstack(3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 20 18:49:16 crc kubenswrapper[4773]: E0120 18:49:16.246139 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-22fkv" podUID="3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b" Jan 20 18:49:16 crc kubenswrapper[4773]: I0120 18:49:16.262964 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-mhdc2" Jan 20 18:49:16 crc kubenswrapper[4773]: I0120 18:49:16.286602 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-864c6579d5-v5vdm" Jan 20 18:49:16 crc kubenswrapper[4773]: I0120 18:49:16.398131 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/609b419f-cc52-4fef-aa49-f64cdbba6755-logs\") pod \"609b419f-cc52-4fef-aa49-f64cdbba6755\" (UID: \"609b419f-cc52-4fef-aa49-f64cdbba6755\") " Jan 20 18:49:16 crc kubenswrapper[4773]: I0120 18:49:16.398249 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/609b419f-cc52-4fef-aa49-f64cdbba6755-horizon-secret-key\") pod \"609b419f-cc52-4fef-aa49-f64cdbba6755\" (UID: \"609b419f-cc52-4fef-aa49-f64cdbba6755\") " Jan 20 18:49:16 crc kubenswrapper[4773]: I0120 18:49:16.398384 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/609b419f-cc52-4fef-aa49-f64cdbba6755-config-data\") pod \"609b419f-cc52-4fef-aa49-f64cdbba6755\" (UID: \"609b419f-cc52-4fef-aa49-f64cdbba6755\") " Jan 20 18:49:16 crc kubenswrapper[4773]: I0120 18:49:16.398517 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/609b419f-cc52-4fef-aa49-f64cdbba6755-logs" (OuterVolumeSpecName: "logs") pod "609b419f-cc52-4fef-aa49-f64cdbba6755" (UID: "609b419f-cc52-4fef-aa49-f64cdbba6755"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:49:16 crc kubenswrapper[4773]: I0120 18:49:16.399093 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/609b419f-cc52-4fef-aa49-f64cdbba6755-config-data" (OuterVolumeSpecName: "config-data") pod "609b419f-cc52-4fef-aa49-f64cdbba6755" (UID: "609b419f-cc52-4fef-aa49-f64cdbba6755"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:49:16 crc kubenswrapper[4773]: I0120 18:49:16.399195 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tpvqk\" (UniqueName: \"kubernetes.io/projected/609b419f-cc52-4fef-aa49-f64cdbba6755-kube-api-access-tpvqk\") pod \"609b419f-cc52-4fef-aa49-f64cdbba6755\" (UID: \"609b419f-cc52-4fef-aa49-f64cdbba6755\") " Jan 20 18:49:16 crc kubenswrapper[4773]: I0120 18:49:16.399644 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/609b419f-cc52-4fef-aa49-f64cdbba6755-scripts\") pod \"609b419f-cc52-4fef-aa49-f64cdbba6755\" (UID: \"609b419f-cc52-4fef-aa49-f64cdbba6755\") " Jan 20 18:49:16 crc kubenswrapper[4773]: I0120 18:49:16.400074 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/609b419f-cc52-4fef-aa49-f64cdbba6755-scripts" (OuterVolumeSpecName: "scripts") pod "609b419f-cc52-4fef-aa49-f64cdbba6755" (UID: "609b419f-cc52-4fef-aa49-f64cdbba6755"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:49:16 crc kubenswrapper[4773]: I0120 18:49:16.400695 4773 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/609b419f-cc52-4fef-aa49-f64cdbba6755-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:16 crc kubenswrapper[4773]: I0120 18:49:16.400726 4773 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/609b419f-cc52-4fef-aa49-f64cdbba6755-logs\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:16 crc kubenswrapper[4773]: I0120 18:49:16.400738 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/609b419f-cc52-4fef-aa49-f64cdbba6755-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:16 crc kubenswrapper[4773]: I0120 18:49:16.402076 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/609b419f-cc52-4fef-aa49-f64cdbba6755-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "609b419f-cc52-4fef-aa49-f64cdbba6755" (UID: "609b419f-cc52-4fef-aa49-f64cdbba6755"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:49:16 crc kubenswrapper[4773]: I0120 18:49:16.407337 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/609b419f-cc52-4fef-aa49-f64cdbba6755-kube-api-access-tpvqk" (OuterVolumeSpecName: "kube-api-access-tpvqk") pod "609b419f-cc52-4fef-aa49-f64cdbba6755" (UID: "609b419f-cc52-4fef-aa49-f64cdbba6755"). InnerVolumeSpecName "kube-api-access-tpvqk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:49:16 crc kubenswrapper[4773]: I0120 18:49:16.501955 4773 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/609b419f-cc52-4fef-aa49-f64cdbba6755-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:16 crc kubenswrapper[4773]: I0120 18:49:16.502001 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tpvqk\" (UniqueName: \"kubernetes.io/projected/609b419f-cc52-4fef-aa49-f64cdbba6755-kube-api-access-tpvqk\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:16 crc kubenswrapper[4773]: E0120 18:49:16.677859 4773 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Jan 20 18:49:16 crc kubenswrapper[4773]: E0120 18:49:16.678018 4773 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db 
upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kjprd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-z8p6p_openstack(d9eee838-721f-48cc-a5aa-37644a62d846): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Jan 20 18:49:16 crc kubenswrapper[4773]: E0120 18:49:16.679352 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-z8p6p" 
podUID="d9eee838-721f-48cc-a5aa-37644a62d846" Jan 20 18:49:16 crc kubenswrapper[4773]: I0120 18:49:16.685321 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54f9b7b8d9-cgbwn" Jan 20 18:49:16 crc kubenswrapper[4773]: I0120 18:49:16.806802 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d326b299-f619-4c76-9a10-045d77fa9bae-dns-svc\") pod \"d326b299-f619-4c76-9a10-045d77fa9bae\" (UID: \"d326b299-f619-4c76-9a10-045d77fa9bae\") " Jan 20 18:49:16 crc kubenswrapper[4773]: I0120 18:49:16.807346 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g5gts\" (UniqueName: \"kubernetes.io/projected/d326b299-f619-4c76-9a10-045d77fa9bae-kube-api-access-g5gts\") pod \"d326b299-f619-4c76-9a10-045d77fa9bae\" (UID: \"d326b299-f619-4c76-9a10-045d77fa9bae\") " Jan 20 18:49:16 crc kubenswrapper[4773]: I0120 18:49:16.807478 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d326b299-f619-4c76-9a10-045d77fa9bae-config\") pod \"d326b299-f619-4c76-9a10-045d77fa9bae\" (UID: \"d326b299-f619-4c76-9a10-045d77fa9bae\") " Jan 20 18:49:16 crc kubenswrapper[4773]: I0120 18:49:16.807549 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d326b299-f619-4c76-9a10-045d77fa9bae-ovsdbserver-nb\") pod \"d326b299-f619-4c76-9a10-045d77fa9bae\" (UID: \"d326b299-f619-4c76-9a10-045d77fa9bae\") " Jan 20 18:49:16 crc kubenswrapper[4773]: I0120 18:49:16.807578 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d326b299-f619-4c76-9a10-045d77fa9bae-ovsdbserver-sb\") pod \"d326b299-f619-4c76-9a10-045d77fa9bae\" (UID: \"d326b299-f619-4c76-9a10-045d77fa9bae\") " Jan 20 
18:49:16 crc kubenswrapper[4773]: I0120 18:49:16.811262 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d326b299-f619-4c76-9a10-045d77fa9bae-kube-api-access-g5gts" (OuterVolumeSpecName: "kube-api-access-g5gts") pod "d326b299-f619-4c76-9a10-045d77fa9bae" (UID: "d326b299-f619-4c76-9a10-045d77fa9bae"). InnerVolumeSpecName "kube-api-access-g5gts". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:49:16 crc kubenswrapper[4773]: I0120 18:49:16.849980 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d326b299-f619-4c76-9a10-045d77fa9bae-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "d326b299-f619-4c76-9a10-045d77fa9bae" (UID: "d326b299-f619-4c76-9a10-045d77fa9bae"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:49:16 crc kubenswrapper[4773]: I0120 18:49:16.852886 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d326b299-f619-4c76-9a10-045d77fa9bae-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "d326b299-f619-4c76-9a10-045d77fa9bae" (UID: "d326b299-f619-4c76-9a10-045d77fa9bae"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:49:16 crc kubenswrapper[4773]: I0120 18:49:16.856380 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d326b299-f619-4c76-9a10-045d77fa9bae-config" (OuterVolumeSpecName: "config") pod "d326b299-f619-4c76-9a10-045d77fa9bae" (UID: "d326b299-f619-4c76-9a10-045d77fa9bae"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:49:16 crc kubenswrapper[4773]: I0120 18:49:16.858484 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d326b299-f619-4c76-9a10-045d77fa9bae-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "d326b299-f619-4c76-9a10-045d77fa9bae" (UID: "d326b299-f619-4c76-9a10-045d77fa9bae"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:49:16 crc kubenswrapper[4773]: I0120 18:49:16.911222 4773 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/d326b299-f619-4c76-9a10-045d77fa9bae-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:16 crc kubenswrapper[4773]: I0120 18:49:16.911258 4773 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/d326b299-f619-4c76-9a10-045d77fa9bae-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:16 crc kubenswrapper[4773]: I0120 18:49:16.911268 4773 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/d326b299-f619-4c76-9a10-045d77fa9bae-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:16 crc kubenswrapper[4773]: I0120 18:49:16.911278 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g5gts\" (UniqueName: \"kubernetes.io/projected/d326b299-f619-4c76-9a10-045d77fa9bae-kube-api-access-g5gts\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:16 crc kubenswrapper[4773]: I0120 18:49:16.911287 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d326b299-f619-4c76-9a10-045d77fa9bae-config\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:17 crc kubenswrapper[4773]: I0120 18:49:17.145591 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-68fb89f56b-287lx"] Jan 20 18:49:17 crc 
kubenswrapper[4773]: I0120 18:49:17.170504 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-9b66d8476-cqhrd"] Jan 20 18:49:17 crc kubenswrapper[4773]: W0120 18:49:17.180856 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod49df8cea_026f_497b_baae_a6a09452aa3d.slice/crio-45297b3ebf564b3f31091634eeea46bead8ac5c12e876ebfb7ba0eef2c596c1a WatchSource:0}: Error finding container 45297b3ebf564b3f31091634eeea46bead8ac5c12e876ebfb7ba0eef2c596c1a: Status 404 returned error can't find the container with id 45297b3ebf564b3f31091634eeea46bead8ac5c12e876ebfb7ba0eef2c596c1a Jan 20 18:49:17 crc kubenswrapper[4773]: I0120 18:49:17.188161 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-68fb89f56b-287lx" event={"ID":"cd9ba14c-8dca-4170-841c-6f5d5fa2b220","Type":"ContainerStarted","Data":"e7763efb571d5026932f6475db9eef517adc622a0b2069ed44b92f35a03ab807"} Jan 20 18:49:17 crc kubenswrapper[4773]: I0120 18:49:17.197718 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-54f9b7b8d9-cgbwn" event={"ID":"d326b299-f619-4c76-9a10-045d77fa9bae","Type":"ContainerDied","Data":"4c54b24834508cb1572ff85a3b08c2169655a4dc02c347b28d18c8e1711adbee"} Jan 20 18:49:17 crc kubenswrapper[4773]: I0120 18:49:17.197765 4773 scope.go:117] "RemoveContainer" containerID="1fbc00a0e22d838407b29245dcbad537d566c09051250909671c6675451ecf6e" Jan 20 18:49:17 crc kubenswrapper[4773]: I0120 18:49:17.197879 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-54f9b7b8d9-cgbwn" Jan 20 18:49:17 crc kubenswrapper[4773]: I0120 18:49:17.202655 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-864c6579d5-v5vdm" Jan 20 18:49:17 crc kubenswrapper[4773]: I0120 18:49:17.207784 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-864c6579d5-v5vdm" event={"ID":"609b419f-cc52-4fef-aa49-f64cdbba6755","Type":"ContainerDied","Data":"6a01b8d18b1bc750f52b0b014094ed52c18500c651ec5c625261c7be00925ccf"} Jan 20 18:49:17 crc kubenswrapper[4773]: E0120 18:49:17.224013 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-22fkv" podUID="3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b" Jan 20 18:49:17 crc kubenswrapper[4773]: E0120 18:49:17.225053 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-z8p6p" podUID="d9eee838-721f-48cc-a5aa-37644a62d846" Jan 20 18:49:17 crc kubenswrapper[4773]: I0120 18:49:17.317093 4773 scope.go:117] "RemoveContainer" containerID="526b5b595c5084d4d46dcd86dc9c0555e27f3a5f70cfb4f516507eaf64968118" Jan 20 18:49:17 crc kubenswrapper[4773]: I0120 18:49:17.393693 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-864c6579d5-v5vdm"] Jan 20 18:49:17 crc kubenswrapper[4773]: I0120 18:49:17.407494 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-864c6579d5-v5vdm"] Jan 20 18:49:17 crc kubenswrapper[4773]: I0120 18:49:17.419093 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-54f9b7b8d9-cgbwn"] Jan 20 18:49:17 crc kubenswrapper[4773]: I0120 18:49:17.428235 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-54f9b7b8d9-cgbwn"] Jan 20 
18:49:17 crc kubenswrapper[4773]: I0120 18:49:17.466460 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03043146-8a8f-465e-b8c2-ca01d39cc070" path="/var/lib/kubelet/pods/03043146-8a8f-465e-b8c2-ca01d39cc070/volumes" Jan 20 18:49:17 crc kubenswrapper[4773]: I0120 18:49:17.467393 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="609b419f-cc52-4fef-aa49-f64cdbba6755" path="/var/lib/kubelet/pods/609b419f-cc52-4fef-aa49-f64cdbba6755/volumes" Jan 20 18:49:17 crc kubenswrapper[4773]: I0120 18:49:17.467906 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d326b299-f619-4c76-9a10-045d77fa9bae" path="/var/lib/kubelet/pods/d326b299-f619-4c76-9a10-045d77fa9bae/volumes" Jan 20 18:49:17 crc kubenswrapper[4773]: I0120 18:49:17.533103 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-mhdc2"] Jan 20 18:49:18 crc kubenswrapper[4773]: I0120 18:49:18.210420 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-49f25" event={"ID":"0158a06a-bb30-4d75-904f-90a4c6307fd6","Type":"ContainerStarted","Data":"3321d4830a85f50c04166b887254333d17cf16fc0f4241d05645223c19fa5071"} Jan 20 18:49:18 crc kubenswrapper[4773]: I0120 18:49:18.214024 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"077faa57-a75d-4f1a-b01e-3fc69ddb5761","Type":"ContainerStarted","Data":"2e60f5b100091f51d40096ed81ead834e3a8b767169abe81d9bad772de6d4ab6"} Jan 20 18:49:18 crc kubenswrapper[4773]: I0120 18:49:18.215795 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-9b66d8476-cqhrd" event={"ID":"49df8cea-026f-497b-baae-a6a09452aa3d","Type":"ContainerStarted","Data":"aab110865e342c13c6753a15789694fc55cd9167805325bf24c74b2765a8d8e4"} Jan 20 18:49:18 crc kubenswrapper[4773]: I0120 18:49:18.215824 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-9b66d8476-cqhrd" 
event={"ID":"49df8cea-026f-497b-baae-a6a09452aa3d","Type":"ContainerStarted","Data":"7ccb7f05b32cc6a8cf92a861d7cf4258f107127e8f0f1d25df715a6c2f51b909"} Jan 20 18:49:18 crc kubenswrapper[4773]: I0120 18:49:18.215833 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-9b66d8476-cqhrd" event={"ID":"49df8cea-026f-497b-baae-a6a09452aa3d","Type":"ContainerStarted","Data":"45297b3ebf564b3f31091634eeea46bead8ac5c12e876ebfb7ba0eef2c596c1a"} Jan 20 18:49:18 crc kubenswrapper[4773]: I0120 18:49:18.218364 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-68fb89f56b-287lx" event={"ID":"cd9ba14c-8dca-4170-841c-6f5d5fa2b220","Type":"ContainerStarted","Data":"1945832064b5826b2347bb04240fd99ca07f116eef7af8940627839637ffed49"} Jan 20 18:49:18 crc kubenswrapper[4773]: I0120 18:49:18.218401 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-68fb89f56b-287lx" event={"ID":"cd9ba14c-8dca-4170-841c-6f5d5fa2b220","Type":"ContainerStarted","Data":"c5fbc0607d6ceecd0698941eec371e01c10343f0285f2f65cacf67e32405f538"} Jan 20 18:49:18 crc kubenswrapper[4773]: I0120 18:49:18.227592 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-49f25" podStartSLOduration=3.821026511 podStartE2EDuration="31.227574184s" podCreationTimestamp="2026-01-20 18:48:47 +0000 UTC" firstStartedPulling="2026-01-20 18:48:48.789458367 +0000 UTC m=+1121.711271391" lastFinishedPulling="2026-01-20 18:49:16.19600604 +0000 UTC m=+1149.117819064" observedRunningTime="2026-01-20 18:49:18.22366557 +0000 UTC m=+1151.145478594" watchObservedRunningTime="2026-01-20 18:49:18.227574184 +0000 UTC m=+1151.149387208" Jan 20 18:49:18 crc kubenswrapper[4773]: I0120 18:49:18.228557 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-mhdc2" 
event={"ID":"17ca6753-a956-4078-8927-2f2a6c41cb80","Type":"ContainerStarted","Data":"dff8a4c86d068e27a71833f5b56e122b543d819294820eb96fea705ee47ddabe"} Jan 20 18:49:18 crc kubenswrapper[4773]: I0120 18:49:18.228591 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-mhdc2" event={"ID":"17ca6753-a956-4078-8927-2f2a6c41cb80","Type":"ContainerStarted","Data":"75f518016f47bd1abdc74af37fefb093e1d470ea4451d239b9c0799a72e57c8d"} Jan 20 18:49:18 crc kubenswrapper[4773]: I0120 18:49:18.235343 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-586bf65fdf-tqctk" event={"ID":"69d10de9-a03e-4020-8219-25cb3d9520a5","Type":"ContainerStarted","Data":"2d5d679feaca700190bea6a965fb83e9b9e97a2c3cc8d0cf71e39aeec304eb27"} Jan 20 18:49:18 crc kubenswrapper[4773]: I0120 18:49:18.235387 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-586bf65fdf-tqctk" event={"ID":"69d10de9-a03e-4020-8219-25cb3d9520a5","Type":"ContainerStarted","Data":"25c305e514555ed0d81791fef19992b8b93de8b3633a72153010568146bc67e2"} Jan 20 18:49:18 crc kubenswrapper[4773]: I0120 18:49:18.235422 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-586bf65fdf-tqctk" podUID="69d10de9-a03e-4020-8219-25cb3d9520a5" containerName="horizon-log" containerID="cri-o://25c305e514555ed0d81791fef19992b8b93de8b3633a72153010568146bc67e2" gracePeriod=30 Jan 20 18:49:18 crc kubenswrapper[4773]: I0120 18:49:18.235439 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-586bf65fdf-tqctk" podUID="69d10de9-a03e-4020-8219-25cb3d9520a5" containerName="horizon" containerID="cri-o://2d5d679feaca700190bea6a965fb83e9b9e97a2c3cc8d0cf71e39aeec304eb27" gracePeriod=30 Jan 20 18:49:18 crc kubenswrapper[4773]: I0120 18:49:18.252618 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-68fb89f56b-287lx" 
podStartSLOduration=22.252597692 podStartE2EDuration="22.252597692s" podCreationTimestamp="2026-01-20 18:48:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:49:18.240737549 +0000 UTC m=+1151.162550593" watchObservedRunningTime="2026-01-20 18:49:18.252597692 +0000 UTC m=+1151.174410726" Jan 20 18:49:18 crc kubenswrapper[4773]: I0120 18:49:18.264572 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-9b66d8476-cqhrd" podStartSLOduration=22.264551998 podStartE2EDuration="22.264551998s" podCreationTimestamp="2026-01-20 18:48:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:49:18.262428627 +0000 UTC m=+1151.184241651" watchObservedRunningTime="2026-01-20 18:49:18.264551998 +0000 UTC m=+1151.186365022" Jan 20 18:49:18 crc kubenswrapper[4773]: I0120 18:49:18.285774 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-mhdc2" podStartSLOduration=3.285755095 podStartE2EDuration="3.285755095s" podCreationTimestamp="2026-01-20 18:49:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:49:18.281142584 +0000 UTC m=+1151.202955628" watchObservedRunningTime="2026-01-20 18:49:18.285755095 +0000 UTC m=+1151.207568129" Jan 20 18:49:18 crc kubenswrapper[4773]: I0120 18:49:18.336157 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-586bf65fdf-tqctk" podStartSLOduration=3.6725399899999998 podStartE2EDuration="31.33613374s" podCreationTimestamp="2026-01-20 18:48:47 +0000 UTC" firstStartedPulling="2026-01-20 18:48:49.002722065 +0000 UTC m=+1121.924535089" lastFinishedPulling="2026-01-20 18:49:16.666315815 +0000 UTC m=+1149.588128839" 
observedRunningTime="2026-01-20 18:49:18.298230424 +0000 UTC m=+1151.220043468" watchObservedRunningTime="2026-01-20 18:49:18.33613374 +0000 UTC m=+1151.257946774" Jan 20 18:49:19 crc kubenswrapper[4773]: I0120 18:49:19.257848 4773 generic.go:334] "Generic (PLEG): container finished" podID="ec857182-f4b2-46cd-8b7f-fdbc443d8a1a" containerID="bd53904198644fe406dce4ba7d96027169199c6966beb39d8a18ad50565e1374" exitCode=0 Jan 20 18:49:19 crc kubenswrapper[4773]: I0120 18:49:19.257968 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-rz89h" event={"ID":"ec857182-f4b2-46cd-8b7f-fdbc443d8a1a","Type":"ContainerDied","Data":"bd53904198644fe406dce4ba7d96027169199c6966beb39d8a18ad50565e1374"} Jan 20 18:49:20 crc kubenswrapper[4773]: I0120 18:49:20.623579 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-rz89h" Jan 20 18:49:20 crc kubenswrapper[4773]: I0120 18:49:20.788174 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ec857182-f4b2-46cd-8b7f-fdbc443d8a1a-config\") pod \"ec857182-f4b2-46cd-8b7f-fdbc443d8a1a\" (UID: \"ec857182-f4b2-46cd-8b7f-fdbc443d8a1a\") " Jan 20 18:49:20 crc kubenswrapper[4773]: I0120 18:49:20.788410 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec857182-f4b2-46cd-8b7f-fdbc443d8a1a-combined-ca-bundle\") pod \"ec857182-f4b2-46cd-8b7f-fdbc443d8a1a\" (UID: \"ec857182-f4b2-46cd-8b7f-fdbc443d8a1a\") " Jan 20 18:49:20 crc kubenswrapper[4773]: I0120 18:49:20.788493 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b66lw\" (UniqueName: \"kubernetes.io/projected/ec857182-f4b2-46cd-8b7f-fdbc443d8a1a-kube-api-access-b66lw\") pod \"ec857182-f4b2-46cd-8b7f-fdbc443d8a1a\" (UID: \"ec857182-f4b2-46cd-8b7f-fdbc443d8a1a\") " Jan 20 18:49:20 crc 
kubenswrapper[4773]: I0120 18:49:20.793450 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec857182-f4b2-46cd-8b7f-fdbc443d8a1a-kube-api-access-b66lw" (OuterVolumeSpecName: "kube-api-access-b66lw") pod "ec857182-f4b2-46cd-8b7f-fdbc443d8a1a" (UID: "ec857182-f4b2-46cd-8b7f-fdbc443d8a1a"). InnerVolumeSpecName "kube-api-access-b66lw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:49:20 crc kubenswrapper[4773]: I0120 18:49:20.821359 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec857182-f4b2-46cd-8b7f-fdbc443d8a1a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ec857182-f4b2-46cd-8b7f-fdbc443d8a1a" (UID: "ec857182-f4b2-46cd-8b7f-fdbc443d8a1a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:49:20 crc kubenswrapper[4773]: I0120 18:49:20.839350 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ec857182-f4b2-46cd-8b7f-fdbc443d8a1a-config" (OuterVolumeSpecName: "config") pod "ec857182-f4b2-46cd-8b7f-fdbc443d8a1a" (UID: "ec857182-f4b2-46cd-8b7f-fdbc443d8a1a"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:49:20 crc kubenswrapper[4773]: I0120 18:49:20.890650 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b66lw\" (UniqueName: \"kubernetes.io/projected/ec857182-f4b2-46cd-8b7f-fdbc443d8a1a-kube-api-access-b66lw\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:20 crc kubenswrapper[4773]: I0120 18:49:20.890687 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/ec857182-f4b2-46cd-8b7f-fdbc443d8a1a-config\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:20 crc kubenswrapper[4773]: I0120 18:49:20.890698 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ec857182-f4b2-46cd-8b7f-fdbc443d8a1a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:21 crc kubenswrapper[4773]: I0120 18:49:21.279425 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-rz89h" event={"ID":"ec857182-f4b2-46cd-8b7f-fdbc443d8a1a","Type":"ContainerDied","Data":"f998b2c2ff801a56c036c6f7586e8cd6607b4ed8aa6a501098037fc5de64b71c"} Jan 20 18:49:21 crc kubenswrapper[4773]: I0120 18:49:21.279671 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f998b2c2ff801a56c036c6f7586e8cd6607b4ed8aa6a501098037fc5de64b71c" Jan 20 18:49:21 crc kubenswrapper[4773]: I0120 18:49:21.279709 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-rz89h" Jan 20 18:49:21 crc kubenswrapper[4773]: I0120 18:49:21.348175 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-54f9b7b8d9-cgbwn" podUID="d326b299-f619-4c76-9a10-045d77fa9bae" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.131:5353: i/o timeout" Jan 20 18:49:21 crc kubenswrapper[4773]: I0120 18:49:21.538833 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7b946d459c-vd86k"] Jan 20 18:49:21 crc kubenswrapper[4773]: E0120 18:49:21.539467 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d326b299-f619-4c76-9a10-045d77fa9bae" containerName="init" Jan 20 18:49:21 crc kubenswrapper[4773]: I0120 18:49:21.539491 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="d326b299-f619-4c76-9a10-045d77fa9bae" containerName="init" Jan 20 18:49:21 crc kubenswrapper[4773]: E0120 18:49:21.539529 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ec857182-f4b2-46cd-8b7f-fdbc443d8a1a" containerName="neutron-db-sync" Jan 20 18:49:21 crc kubenswrapper[4773]: I0120 18:49:21.539539 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec857182-f4b2-46cd-8b7f-fdbc443d8a1a" containerName="neutron-db-sync" Jan 20 18:49:21 crc kubenswrapper[4773]: E0120 18:49:21.539562 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d326b299-f619-4c76-9a10-045d77fa9bae" containerName="dnsmasq-dns" Jan 20 18:49:21 crc kubenswrapper[4773]: I0120 18:49:21.539571 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="d326b299-f619-4c76-9a10-045d77fa9bae" containerName="dnsmasq-dns" Jan 20 18:49:21 crc kubenswrapper[4773]: I0120 18:49:21.539766 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="ec857182-f4b2-46cd-8b7f-fdbc443d8a1a" containerName="neutron-db-sync" Jan 20 18:49:21 crc kubenswrapper[4773]: I0120 18:49:21.539792 4773 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="d326b299-f619-4c76-9a10-045d77fa9bae" containerName="dnsmasq-dns" Jan 20 18:49:21 crc kubenswrapper[4773]: I0120 18:49:21.540868 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b946d459c-vd86k" Jan 20 18:49:21 crc kubenswrapper[4773]: I0120 18:49:21.554974 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b946d459c-vd86k"] Jan 20 18:49:21 crc kubenswrapper[4773]: I0120 18:49:21.676058 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-86d844bb6-6q8ms"] Jan 20 18:49:21 crc kubenswrapper[4773]: I0120 18:49:21.684520 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-86d844bb6-6q8ms" Jan 20 18:49:21 crc kubenswrapper[4773]: I0120 18:49:21.690050 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Jan 20 18:49:21 crc kubenswrapper[4773]: I0120 18:49:21.690392 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Jan 20 18:49:21 crc kubenswrapper[4773]: I0120 18:49:21.690549 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Jan 20 18:49:21 crc kubenswrapper[4773]: I0120 18:49:21.690681 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-7hwmj" Jan 20 18:49:21 crc kubenswrapper[4773]: I0120 18:49:21.690797 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-86d844bb6-6q8ms"] Jan 20 18:49:21 crc kubenswrapper[4773]: I0120 18:49:21.703336 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpcp4\" (UniqueName: \"kubernetes.io/projected/fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6-kube-api-access-dpcp4\") pod \"dnsmasq-dns-7b946d459c-vd86k\" (UID: \"fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6\") " 
pod="openstack/dnsmasq-dns-7b946d459c-vd86k" Jan 20 18:49:21 crc kubenswrapper[4773]: I0120 18:49:21.703384 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6-ovsdbserver-nb\") pod \"dnsmasq-dns-7b946d459c-vd86k\" (UID: \"fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6\") " pod="openstack/dnsmasq-dns-7b946d459c-vd86k" Jan 20 18:49:21 crc kubenswrapper[4773]: I0120 18:49:21.703403 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6-ovsdbserver-sb\") pod \"dnsmasq-dns-7b946d459c-vd86k\" (UID: \"fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6\") " pod="openstack/dnsmasq-dns-7b946d459c-vd86k" Jan 20 18:49:21 crc kubenswrapper[4773]: I0120 18:49:21.703432 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6-config\") pod \"dnsmasq-dns-7b946d459c-vd86k\" (UID: \"fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6\") " pod="openstack/dnsmasq-dns-7b946d459c-vd86k" Jan 20 18:49:21 crc kubenswrapper[4773]: I0120 18:49:21.703506 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6-dns-svc\") pod \"dnsmasq-dns-7b946d459c-vd86k\" (UID: \"fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6\") " pod="openstack/dnsmasq-dns-7b946d459c-vd86k" Jan 20 18:49:21 crc kubenswrapper[4773]: I0120 18:49:21.804958 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6-dns-svc\") pod \"dnsmasq-dns-7b946d459c-vd86k\" (UID: \"fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6\") " 
pod="openstack/dnsmasq-dns-7b946d459c-vd86k" Jan 20 18:49:21 crc kubenswrapper[4773]: I0120 18:49:21.805074 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dpcp4\" (UniqueName: \"kubernetes.io/projected/fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6-kube-api-access-dpcp4\") pod \"dnsmasq-dns-7b946d459c-vd86k\" (UID: \"fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6\") " pod="openstack/dnsmasq-dns-7b946d459c-vd86k" Jan 20 18:49:21 crc kubenswrapper[4773]: I0120 18:49:21.805100 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a81115d7-0fb0-4319-9705-0fae198ad70b-combined-ca-bundle\") pod \"neutron-86d844bb6-6q8ms\" (UID: \"a81115d7-0fb0-4319-9705-0fae198ad70b\") " pod="openstack/neutron-86d844bb6-6q8ms" Jan 20 18:49:21 crc kubenswrapper[4773]: I0120 18:49:21.805139 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6-ovsdbserver-nb\") pod \"dnsmasq-dns-7b946d459c-vd86k\" (UID: \"fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6\") " pod="openstack/dnsmasq-dns-7b946d459c-vd86k" Jan 20 18:49:21 crc kubenswrapper[4773]: I0120 18:49:21.805158 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6-ovsdbserver-sb\") pod \"dnsmasq-dns-7b946d459c-vd86k\" (UID: \"fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6\") " pod="openstack/dnsmasq-dns-7b946d459c-vd86k" Jan 20 18:49:21 crc kubenswrapper[4773]: I0120 18:49:21.805191 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6-config\") pod \"dnsmasq-dns-7b946d459c-vd86k\" (UID: \"fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6\") " 
pod="openstack/dnsmasq-dns-7b946d459c-vd86k" Jan 20 18:49:21 crc kubenswrapper[4773]: I0120 18:49:21.805217 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2hnc\" (UniqueName: \"kubernetes.io/projected/a81115d7-0fb0-4319-9705-0fae198ad70b-kube-api-access-j2hnc\") pod \"neutron-86d844bb6-6q8ms\" (UID: \"a81115d7-0fb0-4319-9705-0fae198ad70b\") " pod="openstack/neutron-86d844bb6-6q8ms" Jan 20 18:49:21 crc kubenswrapper[4773]: I0120 18:49:21.805240 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a81115d7-0fb0-4319-9705-0fae198ad70b-config\") pod \"neutron-86d844bb6-6q8ms\" (UID: \"a81115d7-0fb0-4319-9705-0fae198ad70b\") " pod="openstack/neutron-86d844bb6-6q8ms" Jan 20 18:49:21 crc kubenswrapper[4773]: I0120 18:49:21.805277 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a81115d7-0fb0-4319-9705-0fae198ad70b-ovndb-tls-certs\") pod \"neutron-86d844bb6-6q8ms\" (UID: \"a81115d7-0fb0-4319-9705-0fae198ad70b\") " pod="openstack/neutron-86d844bb6-6q8ms" Jan 20 18:49:21 crc kubenswrapper[4773]: I0120 18:49:21.805320 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a81115d7-0fb0-4319-9705-0fae198ad70b-httpd-config\") pod \"neutron-86d844bb6-6q8ms\" (UID: \"a81115d7-0fb0-4319-9705-0fae198ad70b\") " pod="openstack/neutron-86d844bb6-6q8ms" Jan 20 18:49:21 crc kubenswrapper[4773]: I0120 18:49:21.807339 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6-ovsdbserver-nb\") pod \"dnsmasq-dns-7b946d459c-vd86k\" (UID: \"fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6\") " 
pod="openstack/dnsmasq-dns-7b946d459c-vd86k" Jan 20 18:49:21 crc kubenswrapper[4773]: I0120 18:49:21.807378 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6-ovsdbserver-sb\") pod \"dnsmasq-dns-7b946d459c-vd86k\" (UID: \"fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6\") " pod="openstack/dnsmasq-dns-7b946d459c-vd86k" Jan 20 18:49:21 crc kubenswrapper[4773]: I0120 18:49:21.807788 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6-config\") pod \"dnsmasq-dns-7b946d459c-vd86k\" (UID: \"fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6\") " pod="openstack/dnsmasq-dns-7b946d459c-vd86k" Jan 20 18:49:21 crc kubenswrapper[4773]: I0120 18:49:21.808490 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6-dns-svc\") pod \"dnsmasq-dns-7b946d459c-vd86k\" (UID: \"fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6\") " pod="openstack/dnsmasq-dns-7b946d459c-vd86k" Jan 20 18:49:21 crc kubenswrapper[4773]: I0120 18:49:21.842232 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpcp4\" (UniqueName: \"kubernetes.io/projected/fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6-kube-api-access-dpcp4\") pod \"dnsmasq-dns-7b946d459c-vd86k\" (UID: \"fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6\") " pod="openstack/dnsmasq-dns-7b946d459c-vd86k" Jan 20 18:49:21 crc kubenswrapper[4773]: I0120 18:49:21.877422 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b946d459c-vd86k" Jan 20 18:49:21 crc kubenswrapper[4773]: I0120 18:49:21.907192 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j2hnc\" (UniqueName: \"kubernetes.io/projected/a81115d7-0fb0-4319-9705-0fae198ad70b-kube-api-access-j2hnc\") pod \"neutron-86d844bb6-6q8ms\" (UID: \"a81115d7-0fb0-4319-9705-0fae198ad70b\") " pod="openstack/neutron-86d844bb6-6q8ms" Jan 20 18:49:21 crc kubenswrapper[4773]: I0120 18:49:21.907234 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a81115d7-0fb0-4319-9705-0fae198ad70b-config\") pod \"neutron-86d844bb6-6q8ms\" (UID: \"a81115d7-0fb0-4319-9705-0fae198ad70b\") " pod="openstack/neutron-86d844bb6-6q8ms" Jan 20 18:49:21 crc kubenswrapper[4773]: I0120 18:49:21.907265 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a81115d7-0fb0-4319-9705-0fae198ad70b-ovndb-tls-certs\") pod \"neutron-86d844bb6-6q8ms\" (UID: \"a81115d7-0fb0-4319-9705-0fae198ad70b\") " pod="openstack/neutron-86d844bb6-6q8ms" Jan 20 18:49:21 crc kubenswrapper[4773]: I0120 18:49:21.907300 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a81115d7-0fb0-4319-9705-0fae198ad70b-httpd-config\") pod \"neutron-86d844bb6-6q8ms\" (UID: \"a81115d7-0fb0-4319-9705-0fae198ad70b\") " pod="openstack/neutron-86d844bb6-6q8ms" Jan 20 18:49:21 crc kubenswrapper[4773]: I0120 18:49:21.907378 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a81115d7-0fb0-4319-9705-0fae198ad70b-combined-ca-bundle\") pod \"neutron-86d844bb6-6q8ms\" (UID: \"a81115d7-0fb0-4319-9705-0fae198ad70b\") " pod="openstack/neutron-86d844bb6-6q8ms" Jan 20 18:49:21 crc kubenswrapper[4773]: 
I0120 18:49:21.911663 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a81115d7-0fb0-4319-9705-0fae198ad70b-ovndb-tls-certs\") pod \"neutron-86d844bb6-6q8ms\" (UID: \"a81115d7-0fb0-4319-9705-0fae198ad70b\") " pod="openstack/neutron-86d844bb6-6q8ms" Jan 20 18:49:21 crc kubenswrapper[4773]: I0120 18:49:21.911690 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a81115d7-0fb0-4319-9705-0fae198ad70b-httpd-config\") pod \"neutron-86d844bb6-6q8ms\" (UID: \"a81115d7-0fb0-4319-9705-0fae198ad70b\") " pod="openstack/neutron-86d844bb6-6q8ms" Jan 20 18:49:21 crc kubenswrapper[4773]: I0120 18:49:21.916453 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a81115d7-0fb0-4319-9705-0fae198ad70b-combined-ca-bundle\") pod \"neutron-86d844bb6-6q8ms\" (UID: \"a81115d7-0fb0-4319-9705-0fae198ad70b\") " pod="openstack/neutron-86d844bb6-6q8ms" Jan 20 18:49:21 crc kubenswrapper[4773]: I0120 18:49:21.924005 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/a81115d7-0fb0-4319-9705-0fae198ad70b-config\") pod \"neutron-86d844bb6-6q8ms\" (UID: \"a81115d7-0fb0-4319-9705-0fae198ad70b\") " pod="openstack/neutron-86d844bb6-6q8ms" Jan 20 18:49:21 crc kubenswrapper[4773]: I0120 18:49:21.933619 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2hnc\" (UniqueName: \"kubernetes.io/projected/a81115d7-0fb0-4319-9705-0fae198ad70b-kube-api-access-j2hnc\") pod \"neutron-86d844bb6-6q8ms\" (UID: \"a81115d7-0fb0-4319-9705-0fae198ad70b\") " pod="openstack/neutron-86d844bb6-6q8ms" Jan 20 18:49:22 crc kubenswrapper[4773]: I0120 18:49:22.009761 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-86d844bb6-6q8ms" Jan 20 18:49:22 crc kubenswrapper[4773]: I0120 18:49:22.512873 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7b946d459c-vd86k"] Jan 20 18:49:22 crc kubenswrapper[4773]: I0120 18:49:22.723264 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-86d844bb6-6q8ms"] Jan 20 18:49:23 crc kubenswrapper[4773]: I0120 18:49:23.550398 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-76cffc5d9-m6wn7"] Jan 20 18:49:23 crc kubenswrapper[4773]: I0120 18:49:23.552539 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-76cffc5d9-m6wn7" Jan 20 18:49:23 crc kubenswrapper[4773]: I0120 18:49:23.557162 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Jan 20 18:49:23 crc kubenswrapper[4773]: I0120 18:49:23.558955 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Jan 20 18:49:23 crc kubenswrapper[4773]: I0120 18:49:23.577331 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-76cffc5d9-m6wn7"] Jan 20 18:49:23 crc kubenswrapper[4773]: I0120 18:49:23.648171 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xj54t\" (UniqueName: \"kubernetes.io/projected/f98c94f3-5e79-4d1a-9e1f-bab68689f193-kube-api-access-xj54t\") pod \"neutron-76cffc5d9-m6wn7\" (UID: \"f98c94f3-5e79-4d1a-9e1f-bab68689f193\") " pod="openstack/neutron-76cffc5d9-m6wn7" Jan 20 18:49:23 crc kubenswrapper[4773]: I0120 18:49:23.648503 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f98c94f3-5e79-4d1a-9e1f-bab68689f193-config\") pod \"neutron-76cffc5d9-m6wn7\" (UID: \"f98c94f3-5e79-4d1a-9e1f-bab68689f193\") " 
pod="openstack/neutron-76cffc5d9-m6wn7" Jan 20 18:49:23 crc kubenswrapper[4773]: I0120 18:49:23.648692 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f98c94f3-5e79-4d1a-9e1f-bab68689f193-httpd-config\") pod \"neutron-76cffc5d9-m6wn7\" (UID: \"f98c94f3-5e79-4d1a-9e1f-bab68689f193\") " pod="openstack/neutron-76cffc5d9-m6wn7" Jan 20 18:49:23 crc kubenswrapper[4773]: I0120 18:49:23.649197 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f98c94f3-5e79-4d1a-9e1f-bab68689f193-public-tls-certs\") pod \"neutron-76cffc5d9-m6wn7\" (UID: \"f98c94f3-5e79-4d1a-9e1f-bab68689f193\") " pod="openstack/neutron-76cffc5d9-m6wn7" Jan 20 18:49:23 crc kubenswrapper[4773]: I0120 18:49:23.649342 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f98c94f3-5e79-4d1a-9e1f-bab68689f193-combined-ca-bundle\") pod \"neutron-76cffc5d9-m6wn7\" (UID: \"f98c94f3-5e79-4d1a-9e1f-bab68689f193\") " pod="openstack/neutron-76cffc5d9-m6wn7" Jan 20 18:49:23 crc kubenswrapper[4773]: I0120 18:49:23.649463 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f98c94f3-5e79-4d1a-9e1f-bab68689f193-ovndb-tls-certs\") pod \"neutron-76cffc5d9-m6wn7\" (UID: \"f98c94f3-5e79-4d1a-9e1f-bab68689f193\") " pod="openstack/neutron-76cffc5d9-m6wn7" Jan 20 18:49:23 crc kubenswrapper[4773]: I0120 18:49:23.649559 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f98c94f3-5e79-4d1a-9e1f-bab68689f193-internal-tls-certs\") pod \"neutron-76cffc5d9-m6wn7\" (UID: \"f98c94f3-5e79-4d1a-9e1f-bab68689f193\") " 
pod="openstack/neutron-76cffc5d9-m6wn7" Jan 20 18:49:23 crc kubenswrapper[4773]: I0120 18:49:23.751089 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f98c94f3-5e79-4d1a-9e1f-bab68689f193-public-tls-certs\") pod \"neutron-76cffc5d9-m6wn7\" (UID: \"f98c94f3-5e79-4d1a-9e1f-bab68689f193\") " pod="openstack/neutron-76cffc5d9-m6wn7" Jan 20 18:49:23 crc kubenswrapper[4773]: I0120 18:49:23.751173 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f98c94f3-5e79-4d1a-9e1f-bab68689f193-combined-ca-bundle\") pod \"neutron-76cffc5d9-m6wn7\" (UID: \"f98c94f3-5e79-4d1a-9e1f-bab68689f193\") " pod="openstack/neutron-76cffc5d9-m6wn7" Jan 20 18:49:23 crc kubenswrapper[4773]: I0120 18:49:23.751211 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f98c94f3-5e79-4d1a-9e1f-bab68689f193-ovndb-tls-certs\") pod \"neutron-76cffc5d9-m6wn7\" (UID: \"f98c94f3-5e79-4d1a-9e1f-bab68689f193\") " pod="openstack/neutron-76cffc5d9-m6wn7" Jan 20 18:49:23 crc kubenswrapper[4773]: I0120 18:49:23.751235 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f98c94f3-5e79-4d1a-9e1f-bab68689f193-internal-tls-certs\") pod \"neutron-76cffc5d9-m6wn7\" (UID: \"f98c94f3-5e79-4d1a-9e1f-bab68689f193\") " pod="openstack/neutron-76cffc5d9-m6wn7" Jan 20 18:49:23 crc kubenswrapper[4773]: I0120 18:49:23.751287 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xj54t\" (UniqueName: \"kubernetes.io/projected/f98c94f3-5e79-4d1a-9e1f-bab68689f193-kube-api-access-xj54t\") pod \"neutron-76cffc5d9-m6wn7\" (UID: \"f98c94f3-5e79-4d1a-9e1f-bab68689f193\") " pod="openstack/neutron-76cffc5d9-m6wn7" Jan 20 18:49:23 crc kubenswrapper[4773]: 
I0120 18:49:23.751379 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f98c94f3-5e79-4d1a-9e1f-bab68689f193-config\") pod \"neutron-76cffc5d9-m6wn7\" (UID: \"f98c94f3-5e79-4d1a-9e1f-bab68689f193\") " pod="openstack/neutron-76cffc5d9-m6wn7" Jan 20 18:49:23 crc kubenswrapper[4773]: I0120 18:49:23.751455 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f98c94f3-5e79-4d1a-9e1f-bab68689f193-httpd-config\") pod \"neutron-76cffc5d9-m6wn7\" (UID: \"f98c94f3-5e79-4d1a-9e1f-bab68689f193\") " pod="openstack/neutron-76cffc5d9-m6wn7" Jan 20 18:49:23 crc kubenswrapper[4773]: I0120 18:49:23.787194 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f98c94f3-5e79-4d1a-9e1f-bab68689f193-public-tls-certs\") pod \"neutron-76cffc5d9-m6wn7\" (UID: \"f98c94f3-5e79-4d1a-9e1f-bab68689f193\") " pod="openstack/neutron-76cffc5d9-m6wn7" Jan 20 18:49:23 crc kubenswrapper[4773]: I0120 18:49:23.788256 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f98c94f3-5e79-4d1a-9e1f-bab68689f193-ovndb-tls-certs\") pod \"neutron-76cffc5d9-m6wn7\" (UID: \"f98c94f3-5e79-4d1a-9e1f-bab68689f193\") " pod="openstack/neutron-76cffc5d9-m6wn7" Jan 20 18:49:23 crc kubenswrapper[4773]: I0120 18:49:23.799427 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xj54t\" (UniqueName: \"kubernetes.io/projected/f98c94f3-5e79-4d1a-9e1f-bab68689f193-kube-api-access-xj54t\") pod \"neutron-76cffc5d9-m6wn7\" (UID: \"f98c94f3-5e79-4d1a-9e1f-bab68689f193\") " pod="openstack/neutron-76cffc5d9-m6wn7" Jan 20 18:49:23 crc kubenswrapper[4773]: I0120 18:49:23.799766 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/f98c94f3-5e79-4d1a-9e1f-bab68689f193-config\") pod \"neutron-76cffc5d9-m6wn7\" (UID: \"f98c94f3-5e79-4d1a-9e1f-bab68689f193\") " pod="openstack/neutron-76cffc5d9-m6wn7" Jan 20 18:49:23 crc kubenswrapper[4773]: I0120 18:49:23.821913 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/f98c94f3-5e79-4d1a-9e1f-bab68689f193-httpd-config\") pod \"neutron-76cffc5d9-m6wn7\" (UID: \"f98c94f3-5e79-4d1a-9e1f-bab68689f193\") " pod="openstack/neutron-76cffc5d9-m6wn7" Jan 20 18:49:23 crc kubenswrapper[4773]: I0120 18:49:23.822227 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f98c94f3-5e79-4d1a-9e1f-bab68689f193-internal-tls-certs\") pod \"neutron-76cffc5d9-m6wn7\" (UID: \"f98c94f3-5e79-4d1a-9e1f-bab68689f193\") " pod="openstack/neutron-76cffc5d9-m6wn7" Jan 20 18:49:23 crc kubenswrapper[4773]: I0120 18:49:23.822825 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f98c94f3-5e79-4d1a-9e1f-bab68689f193-combined-ca-bundle\") pod \"neutron-76cffc5d9-m6wn7\" (UID: \"f98c94f3-5e79-4d1a-9e1f-bab68689f193\") " pod="openstack/neutron-76cffc5d9-m6wn7" Jan 20 18:49:23 crc kubenswrapper[4773]: I0120 18:49:23.881571 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-76cffc5d9-m6wn7" Jan 20 18:49:25 crc kubenswrapper[4773]: I0120 18:49:25.321546 4773 generic.go:334] "Generic (PLEG): container finished" podID="17ca6753-a956-4078-8927-2f2a6c41cb80" containerID="dff8a4c86d068e27a71833f5b56e122b543d819294820eb96fea705ee47ddabe" exitCode=0 Jan 20 18:49:25 crc kubenswrapper[4773]: I0120 18:49:25.321608 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-mhdc2" event={"ID":"17ca6753-a956-4078-8927-2f2a6c41cb80","Type":"ContainerDied","Data":"dff8a4c86d068e27a71833f5b56e122b543d819294820eb96fea705ee47ddabe"} Jan 20 18:49:25 crc kubenswrapper[4773]: I0120 18:49:25.323101 4773 generic.go:334] "Generic (PLEG): container finished" podID="0158a06a-bb30-4d75-904f-90a4c6307fd6" containerID="3321d4830a85f50c04166b887254333d17cf16fc0f4241d05645223c19fa5071" exitCode=0 Jan 20 18:49:25 crc kubenswrapper[4773]: I0120 18:49:25.323126 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-49f25" event={"ID":"0158a06a-bb30-4d75-904f-90a4c6307fd6","Type":"ContainerDied","Data":"3321d4830a85f50c04166b887254333d17cf16fc0f4241d05645223c19fa5071"} Jan 20 18:49:26 crc kubenswrapper[4773]: I0120 18:49:26.567993 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-9b66d8476-cqhrd" Jan 20 18:49:26 crc kubenswrapper[4773]: I0120 18:49:26.568357 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-9b66d8476-cqhrd" Jan 20 18:49:26 crc kubenswrapper[4773]: I0120 18:49:26.631649 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-68fb89f56b-287lx" Jan 20 18:49:26 crc kubenswrapper[4773]: I0120 18:49:26.631694 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-68fb89f56b-287lx" Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.261271 4773 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-mhdc2" Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.261677 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-49f25" Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.351602 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-mhdc2" Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.351721 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-mhdc2" event={"ID":"17ca6753-a956-4078-8927-2f2a6c41cb80","Type":"ContainerDied","Data":"75f518016f47bd1abdc74af37fefb093e1d470ea4451d239b9c0799a72e57c8d"} Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.351767 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="75f518016f47bd1abdc74af37fefb093e1d470ea4451d239b9c0799a72e57c8d" Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.357262 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-49f25" event={"ID":"0158a06a-bb30-4d75-904f-90a4c6307fd6","Type":"ContainerDied","Data":"b51bc15af24c493f96b3eab7ab99bb433e14f99c2a5d9c6b5812e8fdfec3a8c6"} Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.357297 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b51bc15af24c493f96b3eab7ab99bb433e14f99c2a5d9c6b5812e8fdfec3a8c6" Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.357369 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-49f25" Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.365149 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-86d844bb6-6q8ms" event={"ID":"a81115d7-0fb0-4319-9705-0fae198ad70b","Type":"ContainerStarted","Data":"4e34f5d6513de50dbefb964db35642a2f245e6de8a45b2992d0938120feea1ea"} Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.369855 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b946d459c-vd86k" event={"ID":"fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6","Type":"ContainerStarted","Data":"de64a54b4415566f010f514c19c1ed77dea98963569b16a8bef020db9b593d9c"} Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.415488 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17ca6753-a956-4078-8927-2f2a6c41cb80-config-data\") pod \"17ca6753-a956-4078-8927-2f2a6c41cb80\" (UID: \"17ca6753-a956-4078-8927-2f2a6c41cb80\") " Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.415862 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0158a06a-bb30-4d75-904f-90a4c6307fd6-scripts\") pod \"0158a06a-bb30-4d75-904f-90a4c6307fd6\" (UID: \"0158a06a-bb30-4d75-904f-90a4c6307fd6\") " Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.415943 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/17ca6753-a956-4078-8927-2f2a6c41cb80-fernet-keys\") pod \"17ca6753-a956-4078-8927-2f2a6c41cb80\" (UID: \"17ca6753-a956-4078-8927-2f2a6c41cb80\") " Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.416218 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j5kl6\" (UniqueName: \"kubernetes.io/projected/17ca6753-a956-4078-8927-2f2a6c41cb80-kube-api-access-j5kl6\") pod 
\"17ca6753-a956-4078-8927-2f2a6c41cb80\" (UID: \"17ca6753-a956-4078-8927-2f2a6c41cb80\") " Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.416317 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xbz9l\" (UniqueName: \"kubernetes.io/projected/0158a06a-bb30-4d75-904f-90a4c6307fd6-kube-api-access-xbz9l\") pod \"0158a06a-bb30-4d75-904f-90a4c6307fd6\" (UID: \"0158a06a-bb30-4d75-904f-90a4c6307fd6\") " Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.416341 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0158a06a-bb30-4d75-904f-90a4c6307fd6-config-data\") pod \"0158a06a-bb30-4d75-904f-90a4c6307fd6\" (UID: \"0158a06a-bb30-4d75-904f-90a4c6307fd6\") " Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.416390 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/17ca6753-a956-4078-8927-2f2a6c41cb80-scripts\") pod \"17ca6753-a956-4078-8927-2f2a6c41cb80\" (UID: \"17ca6753-a956-4078-8927-2f2a6c41cb80\") " Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.416423 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/17ca6753-a956-4078-8927-2f2a6c41cb80-credential-keys\") pod \"17ca6753-a956-4078-8927-2f2a6c41cb80\" (UID: \"17ca6753-a956-4078-8927-2f2a6c41cb80\") " Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.416486 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0158a06a-bb30-4d75-904f-90a4c6307fd6-logs\") pod \"0158a06a-bb30-4d75-904f-90a4c6307fd6\" (UID: \"0158a06a-bb30-4d75-904f-90a4c6307fd6\") " Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.416514 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/17ca6753-a956-4078-8927-2f2a6c41cb80-combined-ca-bundle\") pod \"17ca6753-a956-4078-8927-2f2a6c41cb80\" (UID: \"17ca6753-a956-4078-8927-2f2a6c41cb80\") " Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.416563 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0158a06a-bb30-4d75-904f-90a4c6307fd6-combined-ca-bundle\") pod \"0158a06a-bb30-4d75-904f-90a4c6307fd6\" (UID: \"0158a06a-bb30-4d75-904f-90a4c6307fd6\") " Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.421051 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0158a06a-bb30-4d75-904f-90a4c6307fd6-kube-api-access-xbz9l" (OuterVolumeSpecName: "kube-api-access-xbz9l") pod "0158a06a-bb30-4d75-904f-90a4c6307fd6" (UID: "0158a06a-bb30-4d75-904f-90a4c6307fd6"). InnerVolumeSpecName "kube-api-access-xbz9l". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.423465 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0158a06a-bb30-4d75-904f-90a4c6307fd6-logs" (OuterVolumeSpecName: "logs") pod "0158a06a-bb30-4d75-904f-90a4c6307fd6" (UID: "0158a06a-bb30-4d75-904f-90a4c6307fd6"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.426611 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17ca6753-a956-4078-8927-2f2a6c41cb80-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "17ca6753-a956-4078-8927-2f2a6c41cb80" (UID: "17ca6753-a956-4078-8927-2f2a6c41cb80"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.428592 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0158a06a-bb30-4d75-904f-90a4c6307fd6-scripts" (OuterVolumeSpecName: "scripts") pod "0158a06a-bb30-4d75-904f-90a4c6307fd6" (UID: "0158a06a-bb30-4d75-904f-90a4c6307fd6"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.435675 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17ca6753-a956-4078-8927-2f2a6c41cb80-kube-api-access-j5kl6" (OuterVolumeSpecName: "kube-api-access-j5kl6") pod "17ca6753-a956-4078-8927-2f2a6c41cb80" (UID: "17ca6753-a956-4078-8927-2f2a6c41cb80"). InnerVolumeSpecName "kube-api-access-j5kl6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.435675 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17ca6753-a956-4078-8927-2f2a6c41cb80-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "17ca6753-a956-4078-8927-2f2a6c41cb80" (UID: "17ca6753-a956-4078-8927-2f2a6c41cb80"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.438089 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17ca6753-a956-4078-8927-2f2a6c41cb80-scripts" (OuterVolumeSpecName: "scripts") pod "17ca6753-a956-4078-8927-2f2a6c41cb80" (UID: "17ca6753-a956-4078-8927-2f2a6c41cb80"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.503114 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0158a06a-bb30-4d75-904f-90a4c6307fd6-config-data" (OuterVolumeSpecName: "config-data") pod "0158a06a-bb30-4d75-904f-90a4c6307fd6" (UID: "0158a06a-bb30-4d75-904f-90a4c6307fd6"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.517191 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17ca6753-a956-4078-8927-2f2a6c41cb80-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "17ca6753-a956-4078-8927-2f2a6c41cb80" (UID: "17ca6753-a956-4078-8927-2f2a6c41cb80"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.518524 4773 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0158a06a-bb30-4d75-904f-90a4c6307fd6-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.518543 4773 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/17ca6753-a956-4078-8927-2f2a6c41cb80-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.518557 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j5kl6\" (UniqueName: \"kubernetes.io/projected/17ca6753-a956-4078-8927-2f2a6c41cb80-kube-api-access-j5kl6\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.518570 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xbz9l\" (UniqueName: \"kubernetes.io/projected/0158a06a-bb30-4d75-904f-90a4c6307fd6-kube-api-access-xbz9l\") on node \"crc\" DevicePath 
\"\"" Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.518582 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0158a06a-bb30-4d75-904f-90a4c6307fd6-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.518593 4773 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/17ca6753-a956-4078-8927-2f2a6c41cb80-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.518604 4773 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/17ca6753-a956-4078-8927-2f2a6c41cb80-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.518618 4773 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0158a06a-bb30-4d75-904f-90a4c6307fd6-logs\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.518629 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17ca6753-a956-4078-8927-2f2a6c41cb80-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.521756 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17ca6753-a956-4078-8927-2f2a6c41cb80-config-data" (OuterVolumeSpecName: "config-data") pod "17ca6753-a956-4078-8927-2f2a6c41cb80" (UID: "17ca6753-a956-4078-8927-2f2a6c41cb80"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.542843 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0158a06a-bb30-4d75-904f-90a4c6307fd6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "0158a06a-bb30-4d75-904f-90a4c6307fd6" (UID: "0158a06a-bb30-4d75-904f-90a4c6307fd6"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.575837 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-5bdd8cdbd7-xhf92"] Jan 20 18:49:27 crc kubenswrapper[4773]: E0120 18:49:27.576308 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="17ca6753-a956-4078-8927-2f2a6c41cb80" containerName="keystone-bootstrap" Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.576326 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="17ca6753-a956-4078-8927-2f2a6c41cb80" containerName="keystone-bootstrap" Jan 20 18:49:27 crc kubenswrapper[4773]: E0120 18:49:27.576346 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0158a06a-bb30-4d75-904f-90a4c6307fd6" containerName="placement-db-sync" Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.576354 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="0158a06a-bb30-4d75-904f-90a4c6307fd6" containerName="placement-db-sync" Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.576538 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="0158a06a-bb30-4d75-904f-90a4c6307fd6" containerName="placement-db-sync" Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.576558 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="17ca6753-a956-4078-8927-2f2a6c41cb80" containerName="keystone-bootstrap" Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.577267 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-5bdd8cdbd7-xhf92" Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.584043 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-668885694d-2br7g"] Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.585757 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.588702 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.589385 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-668885694d-2br7g" Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.598652 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.598885 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.610588 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5bdd8cdbd7-xhf92"] Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.622212 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0158a06a-bb30-4d75-904f-90a4c6307fd6-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.622244 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/17ca6753-a956-4078-8927-2f2a6c41cb80-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.629025 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-668885694d-2br7g"] Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 
18:49:27.724807 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkvg5\" (UniqueName: \"kubernetes.io/projected/03658323-86f4-42ec-b18f-163a1e7dcaed-kube-api-access-zkvg5\") pod \"keystone-5bdd8cdbd7-xhf92\" (UID: \"03658323-86f4-42ec-b18f-163a1e7dcaed\") " pod="openstack/keystone-5bdd8cdbd7-xhf92" Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.725022 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/03658323-86f4-42ec-b18f-163a1e7dcaed-credential-keys\") pod \"keystone-5bdd8cdbd7-xhf92\" (UID: \"03658323-86f4-42ec-b18f-163a1e7dcaed\") " pod="openstack/keystone-5bdd8cdbd7-xhf92" Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.725047 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7bad355-1a37-4372-9751-25a39f6a3410-internal-tls-certs\") pod \"placement-668885694d-2br7g\" (UID: \"a7bad355-1a37-4372-9751-25a39f6a3410\") " pod="openstack/placement-668885694d-2br7g" Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.725171 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03658323-86f4-42ec-b18f-163a1e7dcaed-config-data\") pod \"keystone-5bdd8cdbd7-xhf92\" (UID: \"03658323-86f4-42ec-b18f-163a1e7dcaed\") " pod="openstack/keystone-5bdd8cdbd7-xhf92" Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.725198 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7bad355-1a37-4372-9751-25a39f6a3410-combined-ca-bundle\") pod \"placement-668885694d-2br7g\" (UID: \"a7bad355-1a37-4372-9751-25a39f6a3410\") " pod="openstack/placement-668885694d-2br7g" Jan 20 18:49:27 
crc kubenswrapper[4773]: I0120 18:49:27.725340 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/03658323-86f4-42ec-b18f-163a1e7dcaed-public-tls-certs\") pod \"keystone-5bdd8cdbd7-xhf92\" (UID: \"03658323-86f4-42ec-b18f-163a1e7dcaed\") " pod="openstack/keystone-5bdd8cdbd7-xhf92" Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.725408 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7bad355-1a37-4372-9751-25a39f6a3410-scripts\") pod \"placement-668885694d-2br7g\" (UID: \"a7bad355-1a37-4372-9751-25a39f6a3410\") " pod="openstack/placement-668885694d-2br7g" Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.725495 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/03658323-86f4-42ec-b18f-163a1e7dcaed-internal-tls-certs\") pod \"keystone-5bdd8cdbd7-xhf92\" (UID: \"03658323-86f4-42ec-b18f-163a1e7dcaed\") " pod="openstack/keystone-5bdd8cdbd7-xhf92" Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.725550 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7bad355-1a37-4372-9751-25a39f6a3410-logs\") pod \"placement-668885694d-2br7g\" (UID: \"a7bad355-1a37-4372-9751-25a39f6a3410\") " pod="openstack/placement-668885694d-2br7g" Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.725574 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/03658323-86f4-42ec-b18f-163a1e7dcaed-fernet-keys\") pod \"keystone-5bdd8cdbd7-xhf92\" (UID: \"03658323-86f4-42ec-b18f-163a1e7dcaed\") " pod="openstack/keystone-5bdd8cdbd7-xhf92" Jan 20 18:49:27 crc kubenswrapper[4773]: 
I0120 18:49:27.725629 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03658323-86f4-42ec-b18f-163a1e7dcaed-combined-ca-bundle\") pod \"keystone-5bdd8cdbd7-xhf92\" (UID: \"03658323-86f4-42ec-b18f-163a1e7dcaed\") " pod="openstack/keystone-5bdd8cdbd7-xhf92" Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.725664 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03658323-86f4-42ec-b18f-163a1e7dcaed-scripts\") pod \"keystone-5bdd8cdbd7-xhf92\" (UID: \"03658323-86f4-42ec-b18f-163a1e7dcaed\") " pod="openstack/keystone-5bdd8cdbd7-xhf92" Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.725725 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7bad355-1a37-4372-9751-25a39f6a3410-config-data\") pod \"placement-668885694d-2br7g\" (UID: \"a7bad355-1a37-4372-9751-25a39f6a3410\") " pod="openstack/placement-668885694d-2br7g" Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.725763 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7bad355-1a37-4372-9751-25a39f6a3410-public-tls-certs\") pod \"placement-668885694d-2br7g\" (UID: \"a7bad355-1a37-4372-9751-25a39f6a3410\") " pod="openstack/placement-668885694d-2br7g" Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.725835 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wr47t\" (UniqueName: \"kubernetes.io/projected/a7bad355-1a37-4372-9751-25a39f6a3410-kube-api-access-wr47t\") pod \"placement-668885694d-2br7g\" (UID: \"a7bad355-1a37-4372-9751-25a39f6a3410\") " pod="openstack/placement-668885694d-2br7g" Jan 20 18:49:27 crc 
kubenswrapper[4773]: I0120 18:49:27.735985 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-76cffc5d9-m6wn7"] Jan 20 18:49:27 crc kubenswrapper[4773]: W0120 18:49:27.741007 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf98c94f3_5e79_4d1a_9e1f_bab68689f193.slice/crio-22592765aa743e2e58e9824992364431013c189c2e5f3c84f0626f7194f20819 WatchSource:0}: Error finding container 22592765aa743e2e58e9824992364431013c189c2e5f3c84f0626f7194f20819: Status 404 returned error can't find the container with id 22592765aa743e2e58e9824992364431013c189c2e5f3c84f0626f7194f20819 Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.827316 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7bad355-1a37-4372-9751-25a39f6a3410-config-data\") pod \"placement-668885694d-2br7g\" (UID: \"a7bad355-1a37-4372-9751-25a39f6a3410\") " pod="openstack/placement-668885694d-2br7g" Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.827387 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7bad355-1a37-4372-9751-25a39f6a3410-public-tls-certs\") pod \"placement-668885694d-2br7g\" (UID: \"a7bad355-1a37-4372-9751-25a39f6a3410\") " pod="openstack/placement-668885694d-2br7g" Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.827413 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wr47t\" (UniqueName: \"kubernetes.io/projected/a7bad355-1a37-4372-9751-25a39f6a3410-kube-api-access-wr47t\") pod \"placement-668885694d-2br7g\" (UID: \"a7bad355-1a37-4372-9751-25a39f6a3410\") " pod="openstack/placement-668885694d-2br7g" Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.827452 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-zkvg5\" (UniqueName: \"kubernetes.io/projected/03658323-86f4-42ec-b18f-163a1e7dcaed-kube-api-access-zkvg5\") pod \"keystone-5bdd8cdbd7-xhf92\" (UID: \"03658323-86f4-42ec-b18f-163a1e7dcaed\") " pod="openstack/keystone-5bdd8cdbd7-xhf92" Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.827503 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/03658323-86f4-42ec-b18f-163a1e7dcaed-credential-keys\") pod \"keystone-5bdd8cdbd7-xhf92\" (UID: \"03658323-86f4-42ec-b18f-163a1e7dcaed\") " pod="openstack/keystone-5bdd8cdbd7-xhf92" Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.827526 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7bad355-1a37-4372-9751-25a39f6a3410-internal-tls-certs\") pod \"placement-668885694d-2br7g\" (UID: \"a7bad355-1a37-4372-9751-25a39f6a3410\") " pod="openstack/placement-668885694d-2br7g" Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.827552 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03658323-86f4-42ec-b18f-163a1e7dcaed-config-data\") pod \"keystone-5bdd8cdbd7-xhf92\" (UID: \"03658323-86f4-42ec-b18f-163a1e7dcaed\") " pod="openstack/keystone-5bdd8cdbd7-xhf92" Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.827572 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7bad355-1a37-4372-9751-25a39f6a3410-combined-ca-bundle\") pod \"placement-668885694d-2br7g\" (UID: \"a7bad355-1a37-4372-9751-25a39f6a3410\") " pod="openstack/placement-668885694d-2br7g" Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.827602 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/03658323-86f4-42ec-b18f-163a1e7dcaed-public-tls-certs\") pod \"keystone-5bdd8cdbd7-xhf92\" (UID: \"03658323-86f4-42ec-b18f-163a1e7dcaed\") " pod="openstack/keystone-5bdd8cdbd7-xhf92" Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.827628 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7bad355-1a37-4372-9751-25a39f6a3410-scripts\") pod \"placement-668885694d-2br7g\" (UID: \"a7bad355-1a37-4372-9751-25a39f6a3410\") " pod="openstack/placement-668885694d-2br7g" Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.827651 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/03658323-86f4-42ec-b18f-163a1e7dcaed-internal-tls-certs\") pod \"keystone-5bdd8cdbd7-xhf92\" (UID: \"03658323-86f4-42ec-b18f-163a1e7dcaed\") " pod="openstack/keystone-5bdd8cdbd7-xhf92" Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.827667 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7bad355-1a37-4372-9751-25a39f6a3410-logs\") pod \"placement-668885694d-2br7g\" (UID: \"a7bad355-1a37-4372-9751-25a39f6a3410\") " pod="openstack/placement-668885694d-2br7g" Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.827688 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/03658323-86f4-42ec-b18f-163a1e7dcaed-fernet-keys\") pod \"keystone-5bdd8cdbd7-xhf92\" (UID: \"03658323-86f4-42ec-b18f-163a1e7dcaed\") " pod="openstack/keystone-5bdd8cdbd7-xhf92" Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.827732 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03658323-86f4-42ec-b18f-163a1e7dcaed-combined-ca-bundle\") pod \"keystone-5bdd8cdbd7-xhf92\" 
(UID: \"03658323-86f4-42ec-b18f-163a1e7dcaed\") " pod="openstack/keystone-5bdd8cdbd7-xhf92" Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.827759 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03658323-86f4-42ec-b18f-163a1e7dcaed-scripts\") pod \"keystone-5bdd8cdbd7-xhf92\" (UID: \"03658323-86f4-42ec-b18f-163a1e7dcaed\") " pod="openstack/keystone-5bdd8cdbd7-xhf92" Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.832501 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03658323-86f4-42ec-b18f-163a1e7dcaed-scripts\") pod \"keystone-5bdd8cdbd7-xhf92\" (UID: \"03658323-86f4-42ec-b18f-163a1e7dcaed\") " pod="openstack/keystone-5bdd8cdbd7-xhf92" Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.832779 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a7bad355-1a37-4372-9751-25a39f6a3410-logs\") pod \"placement-668885694d-2br7g\" (UID: \"a7bad355-1a37-4372-9751-25a39f6a3410\") " pod="openstack/placement-668885694d-2br7g" Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.834890 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03658323-86f4-42ec-b18f-163a1e7dcaed-config-data\") pod \"keystone-5bdd8cdbd7-xhf92\" (UID: \"03658323-86f4-42ec-b18f-163a1e7dcaed\") " pod="openstack/keystone-5bdd8cdbd7-xhf92" Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.840395 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/03658323-86f4-42ec-b18f-163a1e7dcaed-internal-tls-certs\") pod \"keystone-5bdd8cdbd7-xhf92\" (UID: \"03658323-86f4-42ec-b18f-163a1e7dcaed\") " pod="openstack/keystone-5bdd8cdbd7-xhf92" Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.840862 4773 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a7bad355-1a37-4372-9751-25a39f6a3410-public-tls-certs\") pod \"placement-668885694d-2br7g\" (UID: \"a7bad355-1a37-4372-9751-25a39f6a3410\") " pod="openstack/placement-668885694d-2br7g" Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.842051 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/03658323-86f4-42ec-b18f-163a1e7dcaed-fernet-keys\") pod \"keystone-5bdd8cdbd7-xhf92\" (UID: \"03658323-86f4-42ec-b18f-163a1e7dcaed\") " pod="openstack/keystone-5bdd8cdbd7-xhf92" Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.844453 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/03658323-86f4-42ec-b18f-163a1e7dcaed-credential-keys\") pod \"keystone-5bdd8cdbd7-xhf92\" (UID: \"03658323-86f4-42ec-b18f-163a1e7dcaed\") " pod="openstack/keystone-5bdd8cdbd7-xhf92" Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.844537 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7bad355-1a37-4372-9751-25a39f6a3410-combined-ca-bundle\") pod \"placement-668885694d-2br7g\" (UID: \"a7bad355-1a37-4372-9751-25a39f6a3410\") " pod="openstack/placement-668885694d-2br7g" Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.845218 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03658323-86f4-42ec-b18f-163a1e7dcaed-combined-ca-bundle\") pod \"keystone-5bdd8cdbd7-xhf92\" (UID: \"03658323-86f4-42ec-b18f-163a1e7dcaed\") " pod="openstack/keystone-5bdd8cdbd7-xhf92" Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.845990 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/a7bad355-1a37-4372-9751-25a39f6a3410-internal-tls-certs\") pod \"placement-668885694d-2br7g\" (UID: \"a7bad355-1a37-4372-9751-25a39f6a3410\") " pod="openstack/placement-668885694d-2br7g" Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.847376 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/03658323-86f4-42ec-b18f-163a1e7dcaed-public-tls-certs\") pod \"keystone-5bdd8cdbd7-xhf92\" (UID: \"03658323-86f4-42ec-b18f-163a1e7dcaed\") " pod="openstack/keystone-5bdd8cdbd7-xhf92" Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.847417 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7bad355-1a37-4372-9751-25a39f6a3410-config-data\") pod \"placement-668885694d-2br7g\" (UID: \"a7bad355-1a37-4372-9751-25a39f6a3410\") " pod="openstack/placement-668885694d-2br7g" Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.850300 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7bad355-1a37-4372-9751-25a39f6a3410-scripts\") pod \"placement-668885694d-2br7g\" (UID: \"a7bad355-1a37-4372-9751-25a39f6a3410\") " pod="openstack/placement-668885694d-2br7g" Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.852560 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wr47t\" (UniqueName: \"kubernetes.io/projected/a7bad355-1a37-4372-9751-25a39f6a3410-kube-api-access-wr47t\") pod \"placement-668885694d-2br7g\" (UID: \"a7bad355-1a37-4372-9751-25a39f6a3410\") " pod="openstack/placement-668885694d-2br7g" Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.857454 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkvg5\" (UniqueName: \"kubernetes.io/projected/03658323-86f4-42ec-b18f-163a1e7dcaed-kube-api-access-zkvg5\") pod 
\"keystone-5bdd8cdbd7-xhf92\" (UID: \"03658323-86f4-42ec-b18f-163a1e7dcaed\") " pod="openstack/keystone-5bdd8cdbd7-xhf92" Jan 20 18:49:27 crc kubenswrapper[4773]: I0120 18:49:27.983236 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-586bf65fdf-tqctk" Jan 20 18:49:28 crc kubenswrapper[4773]: I0120 18:49:28.009683 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-5bdd8cdbd7-xhf92" Jan 20 18:49:28 crc kubenswrapper[4773]: I0120 18:49:28.019224 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-668885694d-2br7g" Jan 20 18:49:28 crc kubenswrapper[4773]: I0120 18:49:28.169557 4773 patch_prober.go:28] interesting pod/machine-config-daemon-sq4x7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 18:49:28 crc kubenswrapper[4773]: I0120 18:49:28.169865 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 18:49:28 crc kubenswrapper[4773]: I0120 18:49:28.388038 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-76cffc5d9-m6wn7" event={"ID":"f98c94f3-5e79-4d1a-9e1f-bab68689f193","Type":"ContainerStarted","Data":"603a0369132b8169aa236a7eddb23ab65c240c9205a1a46e90ac0dbc6a9b8a53"} Jan 20 18:49:28 crc kubenswrapper[4773]: I0120 18:49:28.388103 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-76cffc5d9-m6wn7" 
event={"ID":"f98c94f3-5e79-4d1a-9e1f-bab68689f193","Type":"ContainerStarted","Data":"17ebf8342cfe8d01a6fd735b5a31bd03a6a1a8d05aa665fb5a56f6d84446f1e3"} Jan 20 18:49:28 crc kubenswrapper[4773]: I0120 18:49:28.388113 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-76cffc5d9-m6wn7" event={"ID":"f98c94f3-5e79-4d1a-9e1f-bab68689f193","Type":"ContainerStarted","Data":"22592765aa743e2e58e9824992364431013c189c2e5f3c84f0626f7194f20819"} Jan 20 18:49:28 crc kubenswrapper[4773]: I0120 18:49:28.388801 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-76cffc5d9-m6wn7" Jan 20 18:49:28 crc kubenswrapper[4773]: I0120 18:49:28.392835 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"077faa57-a75d-4f1a-b01e-3fc69ddb5761","Type":"ContainerStarted","Data":"fd2c8acd5b6ce4e2c62657068d76e1f3c65f9b1f2c3ff9b693acaab84e837b21"} Jan 20 18:49:28 crc kubenswrapper[4773]: I0120 18:49:28.405572 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-86d844bb6-6q8ms" event={"ID":"a81115d7-0fb0-4319-9705-0fae198ad70b","Type":"ContainerStarted","Data":"e48d8470ac4794d65c703528edfbd96a6097216611310b92998ca2d3d1878dfe"} Jan 20 18:49:28 crc kubenswrapper[4773]: I0120 18:49:28.405619 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-86d844bb6-6q8ms" event={"ID":"a81115d7-0fb0-4319-9705-0fae198ad70b","Type":"ContainerStarted","Data":"60603e41f66c2ee2986bc21b5a42dcb1715cf77af700eb0e4234d5cc16918f16"} Jan 20 18:49:28 crc kubenswrapper[4773]: I0120 18:49:28.406517 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-86d844bb6-6q8ms" Jan 20 18:49:28 crc kubenswrapper[4773]: I0120 18:49:28.412375 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-76cffc5d9-m6wn7" podStartSLOduration=5.412361306 podStartE2EDuration="5.412361306s" 
podCreationTimestamp="2026-01-20 18:49:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:49:28.408391341 +0000 UTC m=+1161.330204365" watchObservedRunningTime="2026-01-20 18:49:28.412361306 +0000 UTC m=+1161.334174330" Jan 20 18:49:28 crc kubenswrapper[4773]: I0120 18:49:28.413517 4773 generic.go:334] "Generic (PLEG): container finished" podID="fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6" containerID="37609d6f106c18914801b3d94dcdadbe56b2aa2be4306121dfd04a92610b45bb" exitCode=0 Jan 20 18:49:28 crc kubenswrapper[4773]: I0120 18:49:28.413554 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b946d459c-vd86k" event={"ID":"fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6","Type":"ContainerDied","Data":"37609d6f106c18914801b3d94dcdadbe56b2aa2be4306121dfd04a92610b45bb"} Jan 20 18:49:28 crc kubenswrapper[4773]: I0120 18:49:28.437105 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-86d844bb6-6q8ms" podStartSLOduration=7.437088618 podStartE2EDuration="7.437088618s" podCreationTimestamp="2026-01-20 18:49:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:49:28.434611018 +0000 UTC m=+1161.356424042" watchObservedRunningTime="2026-01-20 18:49:28.437088618 +0000 UTC m=+1161.358901642" Jan 20 18:49:28 crc kubenswrapper[4773]: I0120 18:49:28.681571 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-5bdd8cdbd7-xhf92"] Jan 20 18:49:28 crc kubenswrapper[4773]: I0120 18:49:28.803694 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-668885694d-2br7g"] Jan 20 18:49:29 crc kubenswrapper[4773]: I0120 18:49:29.480503 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7b946d459c-vd86k" Jan 20 18:49:29 crc kubenswrapper[4773]: 
I0120 18:49:29.480868 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b946d459c-vd86k" event={"ID":"fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6","Type":"ContainerStarted","Data":"9d2dd544dda815f8509ae7c395de9245112061e0bafdb50324cd6086b476d3dc"} Jan 20 18:49:29 crc kubenswrapper[4773]: I0120 18:49:29.480888 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-668885694d-2br7g" event={"ID":"a7bad355-1a37-4372-9751-25a39f6a3410","Type":"ContainerStarted","Data":"1a89ef16fbf048ea38800570a484a04f9bfb0028c397443443b40d4433afdf2c"} Jan 20 18:49:29 crc kubenswrapper[4773]: I0120 18:49:29.480902 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-668885694d-2br7g" event={"ID":"a7bad355-1a37-4372-9751-25a39f6a3410","Type":"ContainerStarted","Data":"7c482d480badaefe5bfd0770daa0a78cb9725e4408eee3b70b740b72ca521b09"} Jan 20 18:49:29 crc kubenswrapper[4773]: I0120 18:49:29.496361 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7b946d459c-vd86k" podStartSLOduration=8.496344624 podStartE2EDuration="8.496344624s" podCreationTimestamp="2026-01-20 18:49:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:49:29.492910871 +0000 UTC m=+1162.414723925" watchObservedRunningTime="2026-01-20 18:49:29.496344624 +0000 UTC m=+1162.418157648" Jan 20 18:49:29 crc kubenswrapper[4773]: I0120 18:49:29.501989 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5bdd8cdbd7-xhf92" event={"ID":"03658323-86f4-42ec-b18f-163a1e7dcaed","Type":"ContainerStarted","Data":"e9ee5adb33e652dcc40a08d63ae35aa59a52df9520f3943c11d103f17bff8109"} Jan 20 18:49:29 crc kubenswrapper[4773]: I0120 18:49:29.502053 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-5bdd8cdbd7-xhf92" 
event={"ID":"03658323-86f4-42ec-b18f-163a1e7dcaed","Type":"ContainerStarted","Data":"f06a34cd2bd9bdaeabc311e9d39f9baae024a76f2e16d12427dec857969b1e46"} Jan 20 18:49:29 crc kubenswrapper[4773]: I0120 18:49:29.502096 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-5bdd8cdbd7-xhf92" Jan 20 18:49:29 crc kubenswrapper[4773]: I0120 18:49:29.528867 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-5bdd8cdbd7-xhf92" podStartSLOduration=2.528843491 podStartE2EDuration="2.528843491s" podCreationTimestamp="2026-01-20 18:49:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:49:29.517891659 +0000 UTC m=+1162.439704703" watchObservedRunningTime="2026-01-20 18:49:29.528843491 +0000 UTC m=+1162.450656525" Jan 20 18:49:30 crc kubenswrapper[4773]: I0120 18:49:30.512751 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-668885694d-2br7g" event={"ID":"a7bad355-1a37-4372-9751-25a39f6a3410","Type":"ContainerStarted","Data":"a400ab9de6c5a28afecfc78465f9061eb58f3340608384562ff2805663806251"} Jan 20 18:49:30 crc kubenswrapper[4773]: I0120 18:49:30.513426 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-668885694d-2br7g" Jan 20 18:49:30 crc kubenswrapper[4773]: I0120 18:49:30.513462 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-668885694d-2br7g" Jan 20 18:49:30 crc kubenswrapper[4773]: I0120 18:49:30.519100 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-22fkv" event={"ID":"3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b","Type":"ContainerStarted","Data":"1921c6e2c53d5b75992757a0b24e916241c80d607d6e20c9b6226a61ae867455"} Jan 20 18:49:30 crc kubenswrapper[4773]: I0120 18:49:30.538143 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/placement-668885694d-2br7g" podStartSLOduration=3.538123171 podStartE2EDuration="3.538123171s" podCreationTimestamp="2026-01-20 18:49:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:49:30.532560608 +0000 UTC m=+1163.454373642" watchObservedRunningTime="2026-01-20 18:49:30.538123171 +0000 UTC m=+1163.459936195" Jan 20 18:49:30 crc kubenswrapper[4773]: I0120 18:49:30.559349 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-22fkv" podStartSLOduration=3.175528246 podStartE2EDuration="43.559329878s" podCreationTimestamp="2026-01-20 18:48:47 +0000 UTC" firstStartedPulling="2026-01-20 18:48:48.808673286 +0000 UTC m=+1121.730486310" lastFinishedPulling="2026-01-20 18:49:29.192474918 +0000 UTC m=+1162.114287942" observedRunningTime="2026-01-20 18:49:30.555651509 +0000 UTC m=+1163.477464533" watchObservedRunningTime="2026-01-20 18:49:30.559329878 +0000 UTC m=+1163.481142902" Jan 20 18:49:31 crc kubenswrapper[4773]: I0120 18:49:31.530605 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-z8p6p" event={"ID":"d9eee838-721f-48cc-a5aa-37644a62d846","Type":"ContainerStarted","Data":"afa34648f2d59f0f8d5c41b244e65ca128bc231f7b61f9cc13109a9287149c7d"} Jan 20 18:49:31 crc kubenswrapper[4773]: I0120 18:49:31.549527 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-z8p6p" podStartSLOduration=1.98366716 podStartE2EDuration="44.549508552s" podCreationTimestamp="2026-01-20 18:48:47 +0000 UTC" firstStartedPulling="2026-01-20 18:48:48.702565819 +0000 UTC m=+1121.624378833" lastFinishedPulling="2026-01-20 18:49:31.268407211 +0000 UTC m=+1164.190220225" observedRunningTime="2026-01-20 18:49:31.547306659 +0000 UTC m=+1164.469119703" watchObservedRunningTime="2026-01-20 18:49:31.549508552 +0000 UTC m=+1164.471321576" Jan 20 18:49:36 crc 
kubenswrapper[4773]: E0120 18:49:36.295274 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ceilometer-central-agent\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/ceilometer-0" podUID="077faa57-a75d-4f1a-b01e-3fc69ddb5761" Jan 20 18:49:36 crc kubenswrapper[4773]: I0120 18:49:36.569547 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-9b66d8476-cqhrd" podUID="49df8cea-026f-497b-baae-a6a09452aa3d" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.143:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.143:8443: connect: connection refused" Jan 20 18:49:36 crc kubenswrapper[4773]: I0120 18:49:36.575202 4773 generic.go:334] "Generic (PLEG): container finished" podID="d9eee838-721f-48cc-a5aa-37644a62d846" containerID="afa34648f2d59f0f8d5c41b244e65ca128bc231f7b61f9cc13109a9287149c7d" exitCode=0 Jan 20 18:49:36 crc kubenswrapper[4773]: I0120 18:49:36.575478 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-z8p6p" event={"ID":"d9eee838-721f-48cc-a5aa-37644a62d846","Type":"ContainerDied","Data":"afa34648f2d59f0f8d5c41b244e65ca128bc231f7b61f9cc13109a9287149c7d"} Jan 20 18:49:36 crc kubenswrapper[4773]: I0120 18:49:36.578665 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"077faa57-a75d-4f1a-b01e-3fc69ddb5761","Type":"ContainerStarted","Data":"3c965e43ccdac97428ce352b09d0620843cca3f25e74ef924d54e1c23f72ff4f"} Jan 20 18:49:36 crc kubenswrapper[4773]: I0120 18:49:36.579121 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 20 18:49:36 crc kubenswrapper[4773]: I0120 18:49:36.579010 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="077faa57-a75d-4f1a-b01e-3fc69ddb5761" containerName="sg-core" 
containerID="cri-o://fd2c8acd5b6ce4e2c62657068d76e1f3c65f9b1f2c3ff9b693acaab84e837b21" gracePeriod=30 Jan 20 18:49:36 crc kubenswrapper[4773]: I0120 18:49:36.579045 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="077faa57-a75d-4f1a-b01e-3fc69ddb5761" containerName="proxy-httpd" containerID="cri-o://3c965e43ccdac97428ce352b09d0620843cca3f25e74ef924d54e1c23f72ff4f" gracePeriod=30 Jan 20 18:49:36 crc kubenswrapper[4773]: I0120 18:49:36.578917 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="077faa57-a75d-4f1a-b01e-3fc69ddb5761" containerName="ceilometer-notification-agent" containerID="cri-o://2e60f5b100091f51d40096ed81ead834e3a8b767169abe81d9bad772de6d4ab6" gracePeriod=30 Jan 20 18:49:36 crc kubenswrapper[4773]: I0120 18:49:36.633296 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-68fb89f56b-287lx" podUID="cd9ba14c-8dca-4170-841c-6f5d5fa2b220" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.144:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.144:8443: connect: connection refused" Jan 20 18:49:36 crc kubenswrapper[4773]: I0120 18:49:36.881125 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7b946d459c-vd86k" Jan 20 18:49:36 crc kubenswrapper[4773]: I0120 18:49:36.927847 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7987f74bbc-qwzhh"] Jan 20 18:49:36 crc kubenswrapper[4773]: I0120 18:49:36.928103 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7987f74bbc-qwzhh" podUID="f667f5ef-cefc-40c0-a282-5d502cd45cd2" containerName="dnsmasq-dns" containerID="cri-o://5cc155e36c13c5b618c2477be4ab590ab510095287b6b608b69635e7105f701d" gracePeriod=10 Jan 20 18:49:37 crc kubenswrapper[4773]: I0120 18:49:37.599324 4773 generic.go:334] "Generic 
(PLEG): container finished" podID="077faa57-a75d-4f1a-b01e-3fc69ddb5761" containerID="3c965e43ccdac97428ce352b09d0620843cca3f25e74ef924d54e1c23f72ff4f" exitCode=0 Jan 20 18:49:37 crc kubenswrapper[4773]: I0120 18:49:37.599631 4773 generic.go:334] "Generic (PLEG): container finished" podID="077faa57-a75d-4f1a-b01e-3fc69ddb5761" containerID="fd2c8acd5b6ce4e2c62657068d76e1f3c65f9b1f2c3ff9b693acaab84e837b21" exitCode=2 Jan 20 18:49:37 crc kubenswrapper[4773]: I0120 18:49:37.599685 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"077faa57-a75d-4f1a-b01e-3fc69ddb5761","Type":"ContainerDied","Data":"3c965e43ccdac97428ce352b09d0620843cca3f25e74ef924d54e1c23f72ff4f"} Jan 20 18:49:37 crc kubenswrapper[4773]: I0120 18:49:37.599713 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"077faa57-a75d-4f1a-b01e-3fc69ddb5761","Type":"ContainerDied","Data":"fd2c8acd5b6ce4e2c62657068d76e1f3c65f9b1f2c3ff9b693acaab84e837b21"} Jan 20 18:49:37 crc kubenswrapper[4773]: I0120 18:49:37.602651 4773 generic.go:334] "Generic (PLEG): container finished" podID="f667f5ef-cefc-40c0-a282-5d502cd45cd2" containerID="5cc155e36c13c5b618c2477be4ab590ab510095287b6b608b69635e7105f701d" exitCode=0 Jan 20 18:49:37 crc kubenswrapper[4773]: I0120 18:49:37.602732 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7987f74bbc-qwzhh" event={"ID":"f667f5ef-cefc-40c0-a282-5d502cd45cd2","Type":"ContainerDied","Data":"5cc155e36c13c5b618c2477be4ab590ab510095287b6b608b69635e7105f701d"} Jan 20 18:49:37 crc kubenswrapper[4773]: I0120 18:49:37.602793 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7987f74bbc-qwzhh" event={"ID":"f667f5ef-cefc-40c0-a282-5d502cd45cd2","Type":"ContainerDied","Data":"454cbc1be6d71f7e2439e8567821a33608afd38b017a46c05057d7d0155426fd"} Jan 20 18:49:37 crc kubenswrapper[4773]: I0120 18:49:37.602814 4773 pod_container_deletor.go:80] "Container 
not found in pod's containers" containerID="454cbc1be6d71f7e2439e8567821a33608afd38b017a46c05057d7d0155426fd"
Jan 20 18:49:37 crc kubenswrapper[4773]: I0120 18:49:37.605485 4773 generic.go:334] "Generic (PLEG): container finished" podID="3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b" containerID="1921c6e2c53d5b75992757a0b24e916241c80d607d6e20c9b6226a61ae867455" exitCode=0
Jan 20 18:49:37 crc kubenswrapper[4773]: I0120 18:49:37.605640 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-22fkv" event={"ID":"3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b","Type":"ContainerDied","Data":"1921c6e2c53d5b75992757a0b24e916241c80d607d6e20c9b6226a61ae867455"}
Jan 20 18:49:37 crc kubenswrapper[4773]: I0120 18:49:37.608506 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7987f74bbc-qwzhh"
Jan 20 18:49:37 crc kubenswrapper[4773]: I0120 18:49:37.731655 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-48nn2\" (UniqueName: \"kubernetes.io/projected/f667f5ef-cefc-40c0-a282-5d502cd45cd2-kube-api-access-48nn2\") pod \"f667f5ef-cefc-40c0-a282-5d502cd45cd2\" (UID: \"f667f5ef-cefc-40c0-a282-5d502cd45cd2\") "
Jan 20 18:49:37 crc kubenswrapper[4773]: I0120 18:49:37.731887 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f667f5ef-cefc-40c0-a282-5d502cd45cd2-ovsdbserver-sb\") pod \"f667f5ef-cefc-40c0-a282-5d502cd45cd2\" (UID: \"f667f5ef-cefc-40c0-a282-5d502cd45cd2\") "
Jan 20 18:49:37 crc kubenswrapper[4773]: I0120 18:49:37.731946 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f667f5ef-cefc-40c0-a282-5d502cd45cd2-ovsdbserver-nb\") pod \"f667f5ef-cefc-40c0-a282-5d502cd45cd2\" (UID: \"f667f5ef-cefc-40c0-a282-5d502cd45cd2\") "
Jan 20 18:49:37 crc kubenswrapper[4773]: I0120 18:49:37.731993 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f667f5ef-cefc-40c0-a282-5d502cd45cd2-dns-svc\") pod \"f667f5ef-cefc-40c0-a282-5d502cd45cd2\" (UID: \"f667f5ef-cefc-40c0-a282-5d502cd45cd2\") "
Jan 20 18:49:37 crc kubenswrapper[4773]: I0120 18:49:37.732012 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f667f5ef-cefc-40c0-a282-5d502cd45cd2-config\") pod \"f667f5ef-cefc-40c0-a282-5d502cd45cd2\" (UID: \"f667f5ef-cefc-40c0-a282-5d502cd45cd2\") "
Jan 20 18:49:37 crc kubenswrapper[4773]: I0120 18:49:37.740910 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f667f5ef-cefc-40c0-a282-5d502cd45cd2-kube-api-access-48nn2" (OuterVolumeSpecName: "kube-api-access-48nn2") pod "f667f5ef-cefc-40c0-a282-5d502cd45cd2" (UID: "f667f5ef-cefc-40c0-a282-5d502cd45cd2"). InnerVolumeSpecName "kube-api-access-48nn2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 20 18:49:37 crc kubenswrapper[4773]: I0120 18:49:37.784164 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f667f5ef-cefc-40c0-a282-5d502cd45cd2-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f667f5ef-cefc-40c0-a282-5d502cd45cd2" (UID: "f667f5ef-cefc-40c0-a282-5d502cd45cd2"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 20 18:49:37 crc kubenswrapper[4773]: I0120 18:49:37.808602 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f667f5ef-cefc-40c0-a282-5d502cd45cd2-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f667f5ef-cefc-40c0-a282-5d502cd45cd2" (UID: "f667f5ef-cefc-40c0-a282-5d502cd45cd2"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 20 18:49:37 crc kubenswrapper[4773]: I0120 18:49:37.837686 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f667f5ef-cefc-40c0-a282-5d502cd45cd2-config" (OuterVolumeSpecName: "config") pod "f667f5ef-cefc-40c0-a282-5d502cd45cd2" (UID: "f667f5ef-cefc-40c0-a282-5d502cd45cd2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 20 18:49:37 crc kubenswrapper[4773]: I0120 18:49:37.850587 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f667f5ef-cefc-40c0-a282-5d502cd45cd2-config\") pod \"f667f5ef-cefc-40c0-a282-5d502cd45cd2\" (UID: \"f667f5ef-cefc-40c0-a282-5d502cd45cd2\") "
Jan 20 18:49:37 crc kubenswrapper[4773]: W0120 18:49:37.851306 4773 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/f667f5ef-cefc-40c0-a282-5d502cd45cd2/volumes/kubernetes.io~configmap/config
Jan 20 18:49:37 crc kubenswrapper[4773]: I0120 18:49:37.851324 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f667f5ef-cefc-40c0-a282-5d502cd45cd2-config" (OuterVolumeSpecName: "config") pod "f667f5ef-cefc-40c0-a282-5d502cd45cd2" (UID: "f667f5ef-cefc-40c0-a282-5d502cd45cd2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 20 18:49:37 crc kubenswrapper[4773]: I0120 18:49:37.857321 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-48nn2\" (UniqueName: \"kubernetes.io/projected/f667f5ef-cefc-40c0-a282-5d502cd45cd2-kube-api-access-48nn2\") on node \"crc\" DevicePath \"\""
Jan 20 18:49:37 crc kubenswrapper[4773]: I0120 18:49:37.857356 4773 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f667f5ef-cefc-40c0-a282-5d502cd45cd2-ovsdbserver-nb\") on node \"crc\" DevicePath \"\""
Jan 20 18:49:37 crc kubenswrapper[4773]: I0120 18:49:37.857369 4773 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f667f5ef-cefc-40c0-a282-5d502cd45cd2-dns-svc\") on node \"crc\" DevicePath \"\""
Jan 20 18:49:37 crc kubenswrapper[4773]: I0120 18:49:37.857380 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f667f5ef-cefc-40c0-a282-5d502cd45cd2-config\") on node \"crc\" DevicePath \"\""
Jan 20 18:49:37 crc kubenswrapper[4773]: I0120 18:49:37.860870 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f667f5ef-cefc-40c0-a282-5d502cd45cd2-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f667f5ef-cefc-40c0-a282-5d502cd45cd2" (UID: "f667f5ef-cefc-40c0-a282-5d502cd45cd2"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 20 18:49:37 crc kubenswrapper[4773]: I0120 18:49:37.928696 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-z8p6p"
Jan 20 18:49:37 crc kubenswrapper[4773]: I0120 18:49:37.958999 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9eee838-721f-48cc-a5aa-37644a62d846-combined-ca-bundle\") pod \"d9eee838-721f-48cc-a5aa-37644a62d846\" (UID: \"d9eee838-721f-48cc-a5aa-37644a62d846\") "
Jan 20 18:49:37 crc kubenswrapper[4773]: I0120 18:49:37.959189 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d9eee838-721f-48cc-a5aa-37644a62d846-db-sync-config-data\") pod \"d9eee838-721f-48cc-a5aa-37644a62d846\" (UID: \"d9eee838-721f-48cc-a5aa-37644a62d846\") "
Jan 20 18:49:37 crc kubenswrapper[4773]: I0120 18:49:37.959309 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kjprd\" (UniqueName: \"kubernetes.io/projected/d9eee838-721f-48cc-a5aa-37644a62d846-kube-api-access-kjprd\") pod \"d9eee838-721f-48cc-a5aa-37644a62d846\" (UID: \"d9eee838-721f-48cc-a5aa-37644a62d846\") "
Jan 20 18:49:37 crc kubenswrapper[4773]: I0120 18:49:37.959743 4773 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f667f5ef-cefc-40c0-a282-5d502cd45cd2-ovsdbserver-sb\") on node \"crc\" DevicePath \"\""
Jan 20 18:49:37 crc kubenswrapper[4773]: I0120 18:49:37.963393 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9eee838-721f-48cc-a5aa-37644a62d846-kube-api-access-kjprd" (OuterVolumeSpecName: "kube-api-access-kjprd") pod "d9eee838-721f-48cc-a5aa-37644a62d846" (UID: "d9eee838-721f-48cc-a5aa-37644a62d846"). InnerVolumeSpecName "kube-api-access-kjprd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 20 18:49:37 crc kubenswrapper[4773]: I0120 18:49:37.963715 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9eee838-721f-48cc-a5aa-37644a62d846-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "d9eee838-721f-48cc-a5aa-37644a62d846" (UID: "d9eee838-721f-48cc-a5aa-37644a62d846"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 20 18:49:37 crc kubenswrapper[4773]: I0120 18:49:37.984142 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9eee838-721f-48cc-a5aa-37644a62d846-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d9eee838-721f-48cc-a5aa-37644a62d846" (UID: "d9eee838-721f-48cc-a5aa-37644a62d846"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 20 18:49:38 crc kubenswrapper[4773]: I0120 18:49:38.062031 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kjprd\" (UniqueName: \"kubernetes.io/projected/d9eee838-721f-48cc-a5aa-37644a62d846-kube-api-access-kjprd\") on node \"crc\" DevicePath \"\""
Jan 20 18:49:38 crc kubenswrapper[4773]: I0120 18:49:38.062272 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d9eee838-721f-48cc-a5aa-37644a62d846-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 20 18:49:38 crc kubenswrapper[4773]: I0120 18:49:38.062332 4773 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d9eee838-721f-48cc-a5aa-37644a62d846-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Jan 20 18:49:38 crc kubenswrapper[4773]: I0120 18:49:38.622995 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-z8p6p" event={"ID":"d9eee838-721f-48cc-a5aa-37644a62d846","Type":"ContainerDied","Data":"aef79cffc84cc91679ae3ca7b132c652022d5b062e4c613e53887b19256d63c1"}
Jan 20 18:49:38 crc kubenswrapper[4773]: I0120 18:49:38.623087 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aef79cffc84cc91679ae3ca7b132c652022d5b062e4c613e53887b19256d63c1"
Jan 20 18:49:38 crc kubenswrapper[4773]: I0120 18:49:38.623308 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7987f74bbc-qwzhh"
Jan 20 18:49:38 crc kubenswrapper[4773]: I0120 18:49:38.623356 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-z8p6p"
Jan 20 18:49:38 crc kubenswrapper[4773]: I0120 18:49:38.678696 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7987f74bbc-qwzhh"]
Jan 20 18:49:38 crc kubenswrapper[4773]: I0120 18:49:38.698755 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7987f74bbc-qwzhh"]
Jan 20 18:49:38 crc kubenswrapper[4773]: I0120 18:49:38.776391 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-69f4d99ff7-gmlhl"]
Jan 20 18:49:38 crc kubenswrapper[4773]: E0120 18:49:38.776749 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f667f5ef-cefc-40c0-a282-5d502cd45cd2" containerName="init"
Jan 20 18:49:38 crc kubenswrapper[4773]: I0120 18:49:38.776766 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="f667f5ef-cefc-40c0-a282-5d502cd45cd2" containerName="init"
Jan 20 18:49:38 crc kubenswrapper[4773]: E0120 18:49:38.776780 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9eee838-721f-48cc-a5aa-37644a62d846" containerName="barbican-db-sync"
Jan 20 18:49:38 crc kubenswrapper[4773]: I0120 18:49:38.776787 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9eee838-721f-48cc-a5aa-37644a62d846" containerName="barbican-db-sync"
Jan 20 18:49:38 crc kubenswrapper[4773]: E0120 18:49:38.776805 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f667f5ef-cefc-40c0-a282-5d502cd45cd2" containerName="dnsmasq-dns"
Jan 20 18:49:38 crc kubenswrapper[4773]: I0120 18:49:38.776812 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="f667f5ef-cefc-40c0-a282-5d502cd45cd2" containerName="dnsmasq-dns"
Jan 20 18:49:38 crc kubenswrapper[4773]: I0120 18:49:38.777003 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="d9eee838-721f-48cc-a5aa-37644a62d846" containerName="barbican-db-sync"
Jan 20 18:49:38 crc kubenswrapper[4773]: I0120 18:49:38.777024 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="f667f5ef-cefc-40c0-a282-5d502cd45cd2" containerName="dnsmasq-dns"
Jan 20 18:49:38 crc kubenswrapper[4773]: I0120 18:49:38.777865 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-69f4d99ff7-gmlhl"
Jan 20 18:49:38 crc kubenswrapper[4773]: I0120 18:49:38.780536 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-rvh6j"
Jan 20 18:49:38 crc kubenswrapper[4773]: I0120 18:49:38.780700 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data"
Jan 20 18:49:38 crc kubenswrapper[4773]: I0120 18:49:38.780810 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data"
Jan 20 18:49:38 crc kubenswrapper[4773]: I0120 18:49:38.797616 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-69f4d99ff7-gmlhl"]
Jan 20 18:49:38 crc kubenswrapper[4773]: I0120 18:49:38.815754 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-7854d7cd94-r9cm7"]
Jan 20 18:49:38 crc kubenswrapper[4773]: I0120 18:49:38.817258 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-7854d7cd94-r9cm7"
Jan 20 18:49:38 crc kubenswrapper[4773]: I0120 18:49:38.821441 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data"
Jan 20 18:49:38 crc kubenswrapper[4773]: I0120 18:49:38.879725 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kl89h\" (UniqueName: \"kubernetes.io/projected/52aa4dd5-c8c1-4de9-b7e1-9c17fb6ae782-kube-api-access-kl89h\") pod \"barbican-worker-69f4d99ff7-gmlhl\" (UID: \"52aa4dd5-c8c1-4de9-b7e1-9c17fb6ae782\") " pod="openstack/barbican-worker-69f4d99ff7-gmlhl"
Jan 20 18:49:38 crc kubenswrapper[4773]: I0120 18:49:38.879787 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46jxs\" (UniqueName: \"kubernetes.io/projected/8839acb4-5db9-4b47-a075-8798d8a01c6b-kube-api-access-46jxs\") pod \"barbican-keystone-listener-7854d7cd94-r9cm7\" (UID: \"8839acb4-5db9-4b47-a075-8798d8a01c6b\") " pod="openstack/barbican-keystone-listener-7854d7cd94-r9cm7"
Jan 20 18:49:38 crc kubenswrapper[4773]: I0120 18:49:38.879820 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52aa4dd5-c8c1-4de9-b7e1-9c17fb6ae782-logs\") pod \"barbican-worker-69f4d99ff7-gmlhl\" (UID: \"52aa4dd5-c8c1-4de9-b7e1-9c17fb6ae782\") " pod="openstack/barbican-worker-69f4d99ff7-gmlhl"
Jan 20 18:49:38 crc kubenswrapper[4773]: I0120 18:49:38.879882 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8839acb4-5db9-4b47-a075-8798d8a01c6b-config-data\") pod \"barbican-keystone-listener-7854d7cd94-r9cm7\" (UID: \"8839acb4-5db9-4b47-a075-8798d8a01c6b\") " pod="openstack/barbican-keystone-listener-7854d7cd94-r9cm7"
Jan 20 18:49:38 crc kubenswrapper[4773]: I0120 18:49:38.879911 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8839acb4-5db9-4b47-a075-8798d8a01c6b-logs\") pod \"barbican-keystone-listener-7854d7cd94-r9cm7\" (UID: \"8839acb4-5db9-4b47-a075-8798d8a01c6b\") " pod="openstack/barbican-keystone-listener-7854d7cd94-r9cm7"
Jan 20 18:49:38 crc kubenswrapper[4773]: I0120 18:49:38.879997 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52aa4dd5-c8c1-4de9-b7e1-9c17fb6ae782-combined-ca-bundle\") pod \"barbican-worker-69f4d99ff7-gmlhl\" (UID: \"52aa4dd5-c8c1-4de9-b7e1-9c17fb6ae782\") " pod="openstack/barbican-worker-69f4d99ff7-gmlhl"
Jan 20 18:49:38 crc kubenswrapper[4773]: I0120 18:49:38.880044 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8839acb4-5db9-4b47-a075-8798d8a01c6b-combined-ca-bundle\") pod \"barbican-keystone-listener-7854d7cd94-r9cm7\" (UID: \"8839acb4-5db9-4b47-a075-8798d8a01c6b\") " pod="openstack/barbican-keystone-listener-7854d7cd94-r9cm7"
Jan 20 18:49:38 crc kubenswrapper[4773]: I0120 18:49:38.880096 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8839acb4-5db9-4b47-a075-8798d8a01c6b-config-data-custom\") pod \"barbican-keystone-listener-7854d7cd94-r9cm7\" (UID: \"8839acb4-5db9-4b47-a075-8798d8a01c6b\") " pod="openstack/barbican-keystone-listener-7854d7cd94-r9cm7"
Jan 20 18:49:38 crc kubenswrapper[4773]: I0120 18:49:38.880119 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52aa4dd5-c8c1-4de9-b7e1-9c17fb6ae782-config-data\") pod \"barbican-worker-69f4d99ff7-gmlhl\" (UID: \"52aa4dd5-c8c1-4de9-b7e1-9c17fb6ae782\") " pod="openstack/barbican-worker-69f4d99ff7-gmlhl"
Jan 20 18:49:38 crc kubenswrapper[4773]: I0120 18:49:38.880165 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/52aa4dd5-c8c1-4de9-b7e1-9c17fb6ae782-config-data-custom\") pod \"barbican-worker-69f4d99ff7-gmlhl\" (UID: \"52aa4dd5-c8c1-4de9-b7e1-9c17fb6ae782\") " pod="openstack/barbican-worker-69f4d99ff7-gmlhl"
Jan 20 18:49:38 crc kubenswrapper[4773]: I0120 18:49:38.917635 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7854d7cd94-r9cm7"]
Jan 20 18:49:38 crc kubenswrapper[4773]: I0120 18:49:38.953992 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6bb684768f-wlpl8"]
Jan 20 18:49:38 crc kubenswrapper[4773]: I0120 18:49:38.955785 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bb684768f-wlpl8"
Jan 20 18:49:38 crc kubenswrapper[4773]: I0120 18:49:38.972011 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bb684768f-wlpl8"]
Jan 20 18:49:38 crc kubenswrapper[4773]: I0120 18:49:38.981950 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kl89h\" (UniqueName: \"kubernetes.io/projected/52aa4dd5-c8c1-4de9-b7e1-9c17fb6ae782-kube-api-access-kl89h\") pod \"barbican-worker-69f4d99ff7-gmlhl\" (UID: \"52aa4dd5-c8c1-4de9-b7e1-9c17fb6ae782\") " pod="openstack/barbican-worker-69f4d99ff7-gmlhl"
Jan 20 18:49:38 crc kubenswrapper[4773]: I0120 18:49:38.982024 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46jxs\" (UniqueName: \"kubernetes.io/projected/8839acb4-5db9-4b47-a075-8798d8a01c6b-kube-api-access-46jxs\") pod \"barbican-keystone-listener-7854d7cd94-r9cm7\" (UID: \"8839acb4-5db9-4b47-a075-8798d8a01c6b\") " pod="openstack/barbican-keystone-listener-7854d7cd94-r9cm7"
Jan 20 18:49:38 crc kubenswrapper[4773]: I0120 18:49:38.982056 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52aa4dd5-c8c1-4de9-b7e1-9c17fb6ae782-logs\") pod \"barbican-worker-69f4d99ff7-gmlhl\" (UID: \"52aa4dd5-c8c1-4de9-b7e1-9c17fb6ae782\") " pod="openstack/barbican-worker-69f4d99ff7-gmlhl"
Jan 20 18:49:38 crc kubenswrapper[4773]: I0120 18:49:38.982136 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/994b4766-5a87-44ae-b271-f16a3be4fda0-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb684768f-wlpl8\" (UID: \"994b4766-5a87-44ae-b271-f16a3be4fda0\") " pod="openstack/dnsmasq-dns-6bb684768f-wlpl8"
Jan 20 18:49:38 crc kubenswrapper[4773]: I0120 18:49:38.982164 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8839acb4-5db9-4b47-a075-8798d8a01c6b-config-data\") pod \"barbican-keystone-listener-7854d7cd94-r9cm7\" (UID: \"8839acb4-5db9-4b47-a075-8798d8a01c6b\") " pod="openstack/barbican-keystone-listener-7854d7cd94-r9cm7"
Jan 20 18:49:38 crc kubenswrapper[4773]: I0120 18:49:38.982206 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8839acb4-5db9-4b47-a075-8798d8a01c6b-logs\") pod \"barbican-keystone-listener-7854d7cd94-r9cm7\" (UID: \"8839acb4-5db9-4b47-a075-8798d8a01c6b\") " pod="openstack/barbican-keystone-listener-7854d7cd94-r9cm7"
Jan 20 18:49:38 crc kubenswrapper[4773]: I0120 18:49:38.982226 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52aa4dd5-c8c1-4de9-b7e1-9c17fb6ae782-combined-ca-bundle\") pod \"barbican-worker-69f4d99ff7-gmlhl\" (UID: \"52aa4dd5-c8c1-4de9-b7e1-9c17fb6ae782\") " pod="openstack/barbican-worker-69f4d99ff7-gmlhl"
Jan 20 18:49:38 crc kubenswrapper[4773]: I0120 18:49:38.982274 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vlwq\" (UniqueName: \"kubernetes.io/projected/994b4766-5a87-44ae-b271-f16a3be4fda0-kube-api-access-4vlwq\") pod \"dnsmasq-dns-6bb684768f-wlpl8\" (UID: \"994b4766-5a87-44ae-b271-f16a3be4fda0\") " pod="openstack/dnsmasq-dns-6bb684768f-wlpl8"
Jan 20 18:49:38 crc kubenswrapper[4773]: I0120 18:49:38.982317 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8839acb4-5db9-4b47-a075-8798d8a01c6b-combined-ca-bundle\") pod \"barbican-keystone-listener-7854d7cd94-r9cm7\" (UID: \"8839acb4-5db9-4b47-a075-8798d8a01c6b\") " pod="openstack/barbican-keystone-listener-7854d7cd94-r9cm7"
Jan 20 18:49:38 crc kubenswrapper[4773]: I0120 18:49:38.982365 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8839acb4-5db9-4b47-a075-8798d8a01c6b-config-data-custom\") pod \"barbican-keystone-listener-7854d7cd94-r9cm7\" (UID: \"8839acb4-5db9-4b47-a075-8798d8a01c6b\") " pod="openstack/barbican-keystone-listener-7854d7cd94-r9cm7"
Jan 20 18:49:38 crc kubenswrapper[4773]: I0120 18:49:38.982388 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52aa4dd5-c8c1-4de9-b7e1-9c17fb6ae782-config-data\") pod \"barbican-worker-69f4d99ff7-gmlhl\" (UID: \"52aa4dd5-c8c1-4de9-b7e1-9c17fb6ae782\") " pod="openstack/barbican-worker-69f4d99ff7-gmlhl"
Jan 20 18:49:38 crc kubenswrapper[4773]: I0120 18:49:38.982431 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/52aa4dd5-c8c1-4de9-b7e1-9c17fb6ae782-config-data-custom\") pod \"barbican-worker-69f4d99ff7-gmlhl\" (UID: \"52aa4dd5-c8c1-4de9-b7e1-9c17fb6ae782\") " pod="openstack/barbican-worker-69f4d99ff7-gmlhl"
Jan 20 18:49:38 crc kubenswrapper[4773]: I0120 18:49:38.982516 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/994b4766-5a87-44ae-b271-f16a3be4fda0-dns-svc\") pod \"dnsmasq-dns-6bb684768f-wlpl8\" (UID: \"994b4766-5a87-44ae-b271-f16a3be4fda0\") " pod="openstack/dnsmasq-dns-6bb684768f-wlpl8"
Jan 20 18:49:38 crc kubenswrapper[4773]: I0120 18:49:38.982546 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/994b4766-5a87-44ae-b271-f16a3be4fda0-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb684768f-wlpl8\" (UID: \"994b4766-5a87-44ae-b271-f16a3be4fda0\") " pod="openstack/dnsmasq-dns-6bb684768f-wlpl8"
Jan 20 18:49:38 crc kubenswrapper[4773]: I0120 18:49:38.982608 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/994b4766-5a87-44ae-b271-f16a3be4fda0-config\") pod \"dnsmasq-dns-6bb684768f-wlpl8\" (UID: \"994b4766-5a87-44ae-b271-f16a3be4fda0\") " pod="openstack/dnsmasq-dns-6bb684768f-wlpl8"
Jan 20 18:49:38 crc kubenswrapper[4773]: I0120 18:49:38.984221 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/52aa4dd5-c8c1-4de9-b7e1-9c17fb6ae782-logs\") pod \"barbican-worker-69f4d99ff7-gmlhl\" (UID: \"52aa4dd5-c8c1-4de9-b7e1-9c17fb6ae782\") " pod="openstack/barbican-worker-69f4d99ff7-gmlhl"
Jan 20 18:49:38 crc kubenswrapper[4773]: I0120 18:49:38.985271 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8839acb4-5db9-4b47-a075-8798d8a01c6b-logs\") pod \"barbican-keystone-listener-7854d7cd94-r9cm7\" (UID: \"8839acb4-5db9-4b47-a075-8798d8a01c6b\") " pod="openstack/barbican-keystone-listener-7854d7cd94-r9cm7"
Jan 20 18:49:38 crc kubenswrapper[4773]: I0120 18:49:38.989628 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-5d8b6458bb-fc8lw"]
Jan 20 18:49:38 crc kubenswrapper[4773]: I0120 18:49:38.999457 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/52aa4dd5-c8c1-4de9-b7e1-9c17fb6ae782-config-data\") pod \"barbican-worker-69f4d99ff7-gmlhl\" (UID: \"52aa4dd5-c8c1-4de9-b7e1-9c17fb6ae782\") " pod="openstack/barbican-worker-69f4d99ff7-gmlhl"
Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.000053 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5d8b6458bb-fc8lw"]
Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.000403 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5d8b6458bb-fc8lw"
Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.000795 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46jxs\" (UniqueName: \"kubernetes.io/projected/8839acb4-5db9-4b47-a075-8798d8a01c6b-kube-api-access-46jxs\") pod \"barbican-keystone-listener-7854d7cd94-r9cm7\" (UID: \"8839acb4-5db9-4b47-a075-8798d8a01c6b\") " pod="openstack/barbican-keystone-listener-7854d7cd94-r9cm7"
Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.002047 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/52aa4dd5-c8c1-4de9-b7e1-9c17fb6ae782-combined-ca-bundle\") pod \"barbican-worker-69f4d99ff7-gmlhl\" (UID: \"52aa4dd5-c8c1-4de9-b7e1-9c17fb6ae782\") " pod="openstack/barbican-worker-69f4d99ff7-gmlhl"
Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.002296 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8839acb4-5db9-4b47-a075-8798d8a01c6b-combined-ca-bundle\") pod \"barbican-keystone-listener-7854d7cd94-r9cm7\" (UID: \"8839acb4-5db9-4b47-a075-8798d8a01c6b\") " pod="openstack/barbican-keystone-listener-7854d7cd94-r9cm7"
Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.002569 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data"
Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.005238 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kl89h\" (UniqueName: \"kubernetes.io/projected/52aa4dd5-c8c1-4de9-b7e1-9c17fb6ae782-kube-api-access-kl89h\") pod \"barbican-worker-69f4d99ff7-gmlhl\" (UID: \"52aa4dd5-c8c1-4de9-b7e1-9c17fb6ae782\") " pod="openstack/barbican-worker-69f4d99ff7-gmlhl"
Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.005831 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8839acb4-5db9-4b47-a075-8798d8a01c6b-config-data\") pod \"barbican-keystone-listener-7854d7cd94-r9cm7\" (UID: \"8839acb4-5db9-4b47-a075-8798d8a01c6b\") " pod="openstack/barbican-keystone-listener-7854d7cd94-r9cm7"
Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.005958 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/52aa4dd5-c8c1-4de9-b7e1-9c17fb6ae782-config-data-custom\") pod \"barbican-worker-69f4d99ff7-gmlhl\" (UID: \"52aa4dd5-c8c1-4de9-b7e1-9c17fb6ae782\") " pod="openstack/barbican-worker-69f4d99ff7-gmlhl"
Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.003575 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8839acb4-5db9-4b47-a075-8798d8a01c6b-config-data-custom\") pod \"barbican-keystone-listener-7854d7cd94-r9cm7\" (UID: \"8839acb4-5db9-4b47-a075-8798d8a01c6b\") " pod="openstack/barbican-keystone-listener-7854d7cd94-r9cm7"
Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.086987 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/994b4766-5a87-44ae-b271-f16a3be4fda0-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb684768f-wlpl8\" (UID: \"994b4766-5a87-44ae-b271-f16a3be4fda0\") " pod="openstack/dnsmasq-dns-6bb684768f-wlpl8"
Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.087162 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vlwq\" (UniqueName: \"kubernetes.io/projected/994b4766-5a87-44ae-b271-f16a3be4fda0-kube-api-access-4vlwq\") pod \"dnsmasq-dns-6bb684768f-wlpl8\" (UID: \"994b4766-5a87-44ae-b271-f16a3be4fda0\") " pod="openstack/dnsmasq-dns-6bb684768f-wlpl8"
Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.087304 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2-logs\") pod \"barbican-api-5d8b6458bb-fc8lw\" (UID: \"485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2\") " pod="openstack/barbican-api-5d8b6458bb-fc8lw"
Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.087387 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/994b4766-5a87-44ae-b271-f16a3be4fda0-dns-svc\") pod \"dnsmasq-dns-6bb684768f-wlpl8\" (UID: \"994b4766-5a87-44ae-b271-f16a3be4fda0\") " pod="openstack/dnsmasq-dns-6bb684768f-wlpl8"
Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.087460 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/994b4766-5a87-44ae-b271-f16a3be4fda0-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb684768f-wlpl8\" (UID: \"994b4766-5a87-44ae-b271-f16a3be4fda0\") " pod="openstack/dnsmasq-dns-6bb684768f-wlpl8"
Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.087559 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/994b4766-5a87-44ae-b271-f16a3be4fda0-config\") pod \"dnsmasq-dns-6bb684768f-wlpl8\" (UID: \"994b4766-5a87-44ae-b271-f16a3be4fda0\") " pod="openstack/dnsmasq-dns-6bb684768f-wlpl8"
Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.087637 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2-combined-ca-bundle\") pod \"barbican-api-5d8b6458bb-fc8lw\" (UID: \"485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2\") " pod="openstack/barbican-api-5d8b6458bb-fc8lw"
Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.087784 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2-config-data\") pod \"barbican-api-5d8b6458bb-fc8lw\" (UID: \"485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2\") " pod="openstack/barbican-api-5d8b6458bb-fc8lw"
Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.087873 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2-config-data-custom\") pod \"barbican-api-5d8b6458bb-fc8lw\" (UID: \"485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2\") " pod="openstack/barbican-api-5d8b6458bb-fc8lw"
Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.088052 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8bj7\" (UniqueName: \"kubernetes.io/projected/485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2-kube-api-access-h8bj7\") pod \"barbican-api-5d8b6458bb-fc8lw\" (UID: \"485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2\") " pod="openstack/barbican-api-5d8b6458bb-fc8lw"
Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.088900 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/994b4766-5a87-44ae-b271-f16a3be4fda0-ovsdbserver-nb\") pod \"dnsmasq-dns-6bb684768f-wlpl8\" (UID: \"994b4766-5a87-44ae-b271-f16a3be4fda0\") " pod="openstack/dnsmasq-dns-6bb684768f-wlpl8"
Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.089506 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/994b4766-5a87-44ae-b271-f16a3be4fda0-ovsdbserver-sb\") pod \"dnsmasq-dns-6bb684768f-wlpl8\" (UID: \"994b4766-5a87-44ae-b271-f16a3be4fda0\") " pod="openstack/dnsmasq-dns-6bb684768f-wlpl8"
Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.090156 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/994b4766-5a87-44ae-b271-f16a3be4fda0-config\") pod \"dnsmasq-dns-6bb684768f-wlpl8\" (UID: \"994b4766-5a87-44ae-b271-f16a3be4fda0\") " pod="openstack/dnsmasq-dns-6bb684768f-wlpl8"
Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.095373 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/994b4766-5a87-44ae-b271-f16a3be4fda0-dns-svc\") pod \"dnsmasq-dns-6bb684768f-wlpl8\" (UID: \"994b4766-5a87-44ae-b271-f16a3be4fda0\") " pod="openstack/dnsmasq-dns-6bb684768f-wlpl8"
Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.105303 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-69f4d99ff7-gmlhl"
Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.112405 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vlwq\" (UniqueName: \"kubernetes.io/projected/994b4766-5a87-44ae-b271-f16a3be4fda0-kube-api-access-4vlwq\") pod \"dnsmasq-dns-6bb684768f-wlpl8\" (UID: \"994b4766-5a87-44ae-b271-f16a3be4fda0\") " pod="openstack/dnsmasq-dns-6bb684768f-wlpl8"
Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.198986 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2-config-data\") pod \"barbican-api-5d8b6458bb-fc8lw\" (UID: \"485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2\") " pod="openstack/barbican-api-5d8b6458bb-fc8lw"
Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.199075 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2-config-data-custom\") pod \"barbican-api-5d8b6458bb-fc8lw\" (UID: \"485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2\") " pod="openstack/barbican-api-5d8b6458bb-fc8lw"
Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.199107 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8bj7\" (UniqueName: \"kubernetes.io/projected/485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2-kube-api-access-h8bj7\") pod \"barbican-api-5d8b6458bb-fc8lw\" (UID: \"485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2\") " pod="openstack/barbican-api-5d8b6458bb-fc8lw"
Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.199235 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2-logs\") pod \"barbican-api-5d8b6458bb-fc8lw\" (UID: \"485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2\") " pod="openstack/barbican-api-5d8b6458bb-fc8lw"
Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.199295 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2-combined-ca-bundle\") pod \"barbican-api-5d8b6458bb-fc8lw\" (UID: \"485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2\") " pod="openstack/barbican-api-5d8b6458bb-fc8lw"
Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.207134 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2-logs\") pod \"barbican-api-5d8b6458bb-fc8lw\" (UID: \"485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2\") " pod="openstack/barbican-api-5d8b6458bb-fc8lw"
Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.215664 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2-config-data-custom\") pod \"barbican-api-5d8b6458bb-fc8lw\" (UID: \"485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2\") " pod="openstack/barbican-api-5d8b6458bb-fc8lw"
Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.220961 4773 operation_generator.go:637] "MountVolume.SetUp
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2-combined-ca-bundle\") pod \"barbican-api-5d8b6458bb-fc8lw\" (UID: \"485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2\") " pod="openstack/barbican-api-5d8b6458bb-fc8lw" Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.231162 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2-config-data\") pod \"barbican-api-5d8b6458bb-fc8lw\" (UID: \"485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2\") " pod="openstack/barbican-api-5d8b6458bb-fc8lw" Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.242529 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-7854d7cd94-r9cm7" Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.250611 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8bj7\" (UniqueName: \"kubernetes.io/projected/485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2-kube-api-access-h8bj7\") pod \"barbican-api-5d8b6458bb-fc8lw\" (UID: \"485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2\") " pod="openstack/barbican-api-5d8b6458bb-fc8lw" Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.273854 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-5d8b6458bb-fc8lw" Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.286830 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-22fkv" Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.289184 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6bb684768f-wlpl8" Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.299990 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b-combined-ca-bundle\") pod \"3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b\" (UID: \"3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b\") " Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.300135 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tdgkq\" (UniqueName: \"kubernetes.io/projected/3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b-kube-api-access-tdgkq\") pod \"3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b\" (UID: \"3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b\") " Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.300253 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b-db-sync-config-data\") pod \"3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b\" (UID: \"3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b\") " Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.300308 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b-config-data\") pod \"3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b\" (UID: \"3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b\") " Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.300346 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b-scripts\") pod \"3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b\" (UID: \"3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b\") " Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.300379 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b-etc-machine-id\") pod \"3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b\" (UID: \"3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b\") " Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.301053 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b" (UID: "3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.311127 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b" (UID: "3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.322086 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b-scripts" (OuterVolumeSpecName: "scripts") pod "3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b" (UID: "3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.332170 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b-kube-api-access-tdgkq" (OuterVolumeSpecName: "kube-api-access-tdgkq") pod "3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b" (UID: "3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b"). InnerVolumeSpecName "kube-api-access-tdgkq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.353003 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b" (UID: "3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.389248 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b-config-data" (OuterVolumeSpecName: "config-data") pod "3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b" (UID: "3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.411601 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.411643 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tdgkq\" (UniqueName: \"kubernetes.io/projected/3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b-kube-api-access-tdgkq\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.411853 4773 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.411865 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b-config-data\") on node 
\"crc\" DevicePath \"\"" Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.411875 4773 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.411886 4773 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.475254 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f667f5ef-cefc-40c0-a282-5d502cd45cd2" path="/var/lib/kubelet/pods/f667f5ef-cefc-40c0-a282-5d502cd45cd2/volumes" Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.645893 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-22fkv" event={"ID":"3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b","Type":"ContainerDied","Data":"2518423b901342e321aef89bcd042955aa2c984bc05c23caf50e31eea1f65a99"} Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.646235 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2518423b901342e321aef89bcd042955aa2c984bc05c23caf50e31eea1f65a99" Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.646156 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-22fkv" Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.892390 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-7854d7cd94-r9cm7"] Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.939989 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-5d8b6458bb-fc8lw"] Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.967045 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-69f4d99ff7-gmlhl"] Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.989825 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Jan 20 18:49:39 crc kubenswrapper[4773]: E0120 18:49:39.990316 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b" containerName="cinder-db-sync" Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.990356 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b" containerName="cinder-db-sync" Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.990570 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b" containerName="cinder-db-sync" Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.991705 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 20 18:49:39 crc kubenswrapper[4773]: I0120 18:49:39.998924 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.000732 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.001863 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-fjv2n" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.002130 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.022219 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.023170 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfe8dc2b-eac6-4606-9b32-848e3a273eef-config-data\") pod \"cinder-scheduler-0\" (UID: \"dfe8dc2b-eac6-4606-9b32-848e3a273eef\") " pod="openstack/cinder-scheduler-0" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.023208 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbtjz\" (UniqueName: \"kubernetes.io/projected/dfe8dc2b-eac6-4606-9b32-848e3a273eef-kube-api-access-qbtjz\") pod \"cinder-scheduler-0\" (UID: \"dfe8dc2b-eac6-4606-9b32-848e3a273eef\") " pod="openstack/cinder-scheduler-0" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.023235 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dfe8dc2b-eac6-4606-9b32-848e3a273eef-config-data-custom\") pod \"cinder-scheduler-0\" (UID: 
\"dfe8dc2b-eac6-4606-9b32-848e3a273eef\") " pod="openstack/cinder-scheduler-0" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.023269 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dfe8dc2b-eac6-4606-9b32-848e3a273eef-scripts\") pod \"cinder-scheduler-0\" (UID: \"dfe8dc2b-eac6-4606-9b32-848e3a273eef\") " pod="openstack/cinder-scheduler-0" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.023330 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dfe8dc2b-eac6-4606-9b32-848e3a273eef-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"dfe8dc2b-eac6-4606-9b32-848e3a273eef\") " pod="openstack/cinder-scheduler-0" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.023454 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfe8dc2b-eac6-4606-9b32-848e3a273eef-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"dfe8dc2b-eac6-4606-9b32-848e3a273eef\") " pod="openstack/cinder-scheduler-0" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.038912 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6bb684768f-wlpl8"] Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.058733 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bb684768f-wlpl8"] Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.114736 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6d97fcdd8f-fjqmj"] Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.116127 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6d97fcdd8f-fjqmj" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.136056 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfe8dc2b-eac6-4606-9b32-848e3a273eef-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"dfe8dc2b-eac6-4606-9b32-848e3a273eef\") " pod="openstack/cinder-scheduler-0" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.136121 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfe8dc2b-eac6-4606-9b32-848e3a273eef-config-data\") pod \"cinder-scheduler-0\" (UID: \"dfe8dc2b-eac6-4606-9b32-848e3a273eef\") " pod="openstack/cinder-scheduler-0" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.136177 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qbtjz\" (UniqueName: \"kubernetes.io/projected/dfe8dc2b-eac6-4606-9b32-848e3a273eef-kube-api-access-qbtjz\") pod \"cinder-scheduler-0\" (UID: \"dfe8dc2b-eac6-4606-9b32-848e3a273eef\") " pod="openstack/cinder-scheduler-0" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.136215 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dfe8dc2b-eac6-4606-9b32-848e3a273eef-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"dfe8dc2b-eac6-4606-9b32-848e3a273eef\") " pod="openstack/cinder-scheduler-0" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.136377 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dfe8dc2b-eac6-4606-9b32-848e3a273eef-scripts\") pod \"cinder-scheduler-0\" (UID: \"dfe8dc2b-eac6-4606-9b32-848e3a273eef\") " pod="openstack/cinder-scheduler-0" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.136476 4773 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dfe8dc2b-eac6-4606-9b32-848e3a273eef-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"dfe8dc2b-eac6-4606-9b32-848e3a273eef\") " pod="openstack/cinder-scheduler-0" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.136587 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dfe8dc2b-eac6-4606-9b32-848e3a273eef-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"dfe8dc2b-eac6-4606-9b32-848e3a273eef\") " pod="openstack/cinder-scheduler-0" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.146086 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfe8dc2b-eac6-4606-9b32-848e3a273eef-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"dfe8dc2b-eac6-4606-9b32-848e3a273eef\") " pod="openstack/cinder-scheduler-0" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.149887 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dfe8dc2b-eac6-4606-9b32-848e3a273eef-scripts\") pod \"cinder-scheduler-0\" (UID: \"dfe8dc2b-eac6-4606-9b32-848e3a273eef\") " pod="openstack/cinder-scheduler-0" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.157984 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfe8dc2b-eac6-4606-9b32-848e3a273eef-config-data\") pod \"cinder-scheduler-0\" (UID: \"dfe8dc2b-eac6-4606-9b32-848e3a273eef\") " pod="openstack/cinder-scheduler-0" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.173263 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qbtjz\" (UniqueName: \"kubernetes.io/projected/dfe8dc2b-eac6-4606-9b32-848e3a273eef-kube-api-access-qbtjz\") pod 
\"cinder-scheduler-0\" (UID: \"dfe8dc2b-eac6-4606-9b32-848e3a273eef\") " pod="openstack/cinder-scheduler-0" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.173332 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6d97fcdd8f-fjqmj"] Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.173745 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dfe8dc2b-eac6-4606-9b32-848e3a273eef-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"dfe8dc2b-eac6-4606-9b32-848e3a273eef\") " pod="openstack/cinder-scheduler-0" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.201981 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.203256 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.205327 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.217653 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.238519 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/25154706-fb3d-45e9-b041-a925b21cf99e-logs\") pod \"cinder-api-0\" (UID: \"25154706-fb3d-45e9-b041-a925b21cf99e\") " pod="openstack/cinder-api-0" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.238571 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/25154706-fb3d-45e9-b041-a925b21cf99e-config-data-custom\") pod \"cinder-api-0\" (UID: \"25154706-fb3d-45e9-b041-a925b21cf99e\") " 
pod="openstack/cinder-api-0" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.238595 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25154706-fb3d-45e9-b041-a925b21cf99e-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"25154706-fb3d-45e9-b041-a925b21cf99e\") " pod="openstack/cinder-api-0" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.238652 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/93029dbe-6bb4-45aa-a72a-13e4ffc2537e-ovsdbserver-sb\") pod \"dnsmasq-dns-6d97fcdd8f-fjqmj\" (UID: \"93029dbe-6bb4-45aa-a72a-13e4ffc2537e\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-fjqmj" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.238697 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93029dbe-6bb4-45aa-a72a-13e4ffc2537e-config\") pod \"dnsmasq-dns-6d97fcdd8f-fjqmj\" (UID: \"93029dbe-6bb4-45aa-a72a-13e4ffc2537e\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-fjqmj" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.238724 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/93029dbe-6bb4-45aa-a72a-13e4ffc2537e-dns-svc\") pod \"dnsmasq-dns-6d97fcdd8f-fjqmj\" (UID: \"93029dbe-6bb4-45aa-a72a-13e4ffc2537e\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-fjqmj" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.238761 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/93029dbe-6bb4-45aa-a72a-13e4ffc2537e-ovsdbserver-nb\") pod \"dnsmasq-dns-6d97fcdd8f-fjqmj\" (UID: \"93029dbe-6bb4-45aa-a72a-13e4ffc2537e\") " 
pod="openstack/dnsmasq-dns-6d97fcdd8f-fjqmj" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.238783 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99tvf\" (UniqueName: \"kubernetes.io/projected/93029dbe-6bb4-45aa-a72a-13e4ffc2537e-kube-api-access-99tvf\") pod \"dnsmasq-dns-6d97fcdd8f-fjqmj\" (UID: \"93029dbe-6bb4-45aa-a72a-13e4ffc2537e\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-fjqmj" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.238842 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25154706-fb3d-45e9-b041-a925b21cf99e-config-data\") pod \"cinder-api-0\" (UID: \"25154706-fb3d-45e9-b041-a925b21cf99e\") " pod="openstack/cinder-api-0" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.238876 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/25154706-fb3d-45e9-b041-a925b21cf99e-etc-machine-id\") pod \"cinder-api-0\" (UID: \"25154706-fb3d-45e9-b041-a925b21cf99e\") " pod="openstack/cinder-api-0" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.238903 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25154706-fb3d-45e9-b041-a925b21cf99e-scripts\") pod \"cinder-api-0\" (UID: \"25154706-fb3d-45e9-b041-a925b21cf99e\") " pod="openstack/cinder-api-0" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.238970 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t458l\" (UniqueName: \"kubernetes.io/projected/25154706-fb3d-45e9-b041-a925b21cf99e-kube-api-access-t458l\") pod \"cinder-api-0\" (UID: \"25154706-fb3d-45e9-b041-a925b21cf99e\") " pod="openstack/cinder-api-0" Jan 20 18:49:40 crc kubenswrapper[4773]: 
I0120 18:49:40.339005 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.341098 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/93029dbe-6bb4-45aa-a72a-13e4ffc2537e-ovsdbserver-sb\") pod \"dnsmasq-dns-6d97fcdd8f-fjqmj\" (UID: \"93029dbe-6bb4-45aa-a72a-13e4ffc2537e\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-fjqmj" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.341162 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93029dbe-6bb4-45aa-a72a-13e4ffc2537e-config\") pod \"dnsmasq-dns-6d97fcdd8f-fjqmj\" (UID: \"93029dbe-6bb4-45aa-a72a-13e4ffc2537e\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-fjqmj" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.341190 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/93029dbe-6bb4-45aa-a72a-13e4ffc2537e-dns-svc\") pod \"dnsmasq-dns-6d97fcdd8f-fjqmj\" (UID: \"93029dbe-6bb4-45aa-a72a-13e4ffc2537e\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-fjqmj" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.341221 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/93029dbe-6bb4-45aa-a72a-13e4ffc2537e-ovsdbserver-nb\") pod \"dnsmasq-dns-6d97fcdd8f-fjqmj\" (UID: \"93029dbe-6bb4-45aa-a72a-13e4ffc2537e\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-fjqmj" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.341240 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99tvf\" (UniqueName: \"kubernetes.io/projected/93029dbe-6bb4-45aa-a72a-13e4ffc2537e-kube-api-access-99tvf\") pod \"dnsmasq-dns-6d97fcdd8f-fjqmj\" (UID: 
\"93029dbe-6bb4-45aa-a72a-13e4ffc2537e\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-fjqmj" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.341289 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25154706-fb3d-45e9-b041-a925b21cf99e-config-data\") pod \"cinder-api-0\" (UID: \"25154706-fb3d-45e9-b041-a925b21cf99e\") " pod="openstack/cinder-api-0" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.341313 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/25154706-fb3d-45e9-b041-a925b21cf99e-etc-machine-id\") pod \"cinder-api-0\" (UID: \"25154706-fb3d-45e9-b041-a925b21cf99e\") " pod="openstack/cinder-api-0" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.341336 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25154706-fb3d-45e9-b041-a925b21cf99e-scripts\") pod \"cinder-api-0\" (UID: \"25154706-fb3d-45e9-b041-a925b21cf99e\") " pod="openstack/cinder-api-0" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.341351 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t458l\" (UniqueName: \"kubernetes.io/projected/25154706-fb3d-45e9-b041-a925b21cf99e-kube-api-access-t458l\") pod \"cinder-api-0\" (UID: \"25154706-fb3d-45e9-b041-a925b21cf99e\") " pod="openstack/cinder-api-0" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.341380 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/25154706-fb3d-45e9-b041-a925b21cf99e-logs\") pod \"cinder-api-0\" (UID: \"25154706-fb3d-45e9-b041-a925b21cf99e\") " pod="openstack/cinder-api-0" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.341398 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/25154706-fb3d-45e9-b041-a925b21cf99e-config-data-custom\") pod \"cinder-api-0\" (UID: \"25154706-fb3d-45e9-b041-a925b21cf99e\") " pod="openstack/cinder-api-0" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.341414 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25154706-fb3d-45e9-b041-a925b21cf99e-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"25154706-fb3d-45e9-b041-a925b21cf99e\") " pod="openstack/cinder-api-0" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.342027 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/93029dbe-6bb4-45aa-a72a-13e4ffc2537e-ovsdbserver-sb\") pod \"dnsmasq-dns-6d97fcdd8f-fjqmj\" (UID: \"93029dbe-6bb4-45aa-a72a-13e4ffc2537e\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-fjqmj" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.342563 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/25154706-fb3d-45e9-b041-a925b21cf99e-etc-machine-id\") pod \"cinder-api-0\" (UID: \"25154706-fb3d-45e9-b041-a925b21cf99e\") " pod="openstack/cinder-api-0" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.342896 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/93029dbe-6bb4-45aa-a72a-13e4ffc2537e-dns-svc\") pod \"dnsmasq-dns-6d97fcdd8f-fjqmj\" (UID: \"93029dbe-6bb4-45aa-a72a-13e4ffc2537e\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-fjqmj" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.343559 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93029dbe-6bb4-45aa-a72a-13e4ffc2537e-config\") pod \"dnsmasq-dns-6d97fcdd8f-fjqmj\" (UID: \"93029dbe-6bb4-45aa-a72a-13e4ffc2537e\") " 
pod="openstack/dnsmasq-dns-6d97fcdd8f-fjqmj" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.343889 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/93029dbe-6bb4-45aa-a72a-13e4ffc2537e-ovsdbserver-nb\") pod \"dnsmasq-dns-6d97fcdd8f-fjqmj\" (UID: \"93029dbe-6bb4-45aa-a72a-13e4ffc2537e\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-fjqmj" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.344101 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/25154706-fb3d-45e9-b041-a925b21cf99e-logs\") pod \"cinder-api-0\" (UID: \"25154706-fb3d-45e9-b041-a925b21cf99e\") " pod="openstack/cinder-api-0" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.358414 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25154706-fb3d-45e9-b041-a925b21cf99e-scripts\") pod \"cinder-api-0\" (UID: \"25154706-fb3d-45e9-b041-a925b21cf99e\") " pod="openstack/cinder-api-0" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.371300 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25154706-fb3d-45e9-b041-a925b21cf99e-config-data\") pod \"cinder-api-0\" (UID: \"25154706-fb3d-45e9-b041-a925b21cf99e\") " pod="openstack/cinder-api-0" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.387358 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25154706-fb3d-45e9-b041-a925b21cf99e-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"25154706-fb3d-45e9-b041-a925b21cf99e\") " pod="openstack/cinder-api-0" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.387971 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/25154706-fb3d-45e9-b041-a925b21cf99e-config-data-custom\") pod \"cinder-api-0\" (UID: \"25154706-fb3d-45e9-b041-a925b21cf99e\") " pod="openstack/cinder-api-0" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.393652 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99tvf\" (UniqueName: \"kubernetes.io/projected/93029dbe-6bb4-45aa-a72a-13e4ffc2537e-kube-api-access-99tvf\") pod \"dnsmasq-dns-6d97fcdd8f-fjqmj\" (UID: \"93029dbe-6bb4-45aa-a72a-13e4ffc2537e\") " pod="openstack/dnsmasq-dns-6d97fcdd8f-fjqmj" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.410733 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t458l\" (UniqueName: \"kubernetes.io/projected/25154706-fb3d-45e9-b041-a925b21cf99e-kube-api-access-t458l\") pod \"cinder-api-0\" (UID: \"25154706-fb3d-45e9-b041-a925b21cf99e\") " pod="openstack/cinder-api-0" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.560214 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d97fcdd8f-fjqmj" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.586338 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.671969 4773 generic.go:334] "Generic (PLEG): container finished" podID="994b4766-5a87-44ae-b271-f16a3be4fda0" containerID="401b96609dc8783cb66cb086019d1922425f49f14cc9bffc456fd7c95238a026" exitCode=0 Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.672395 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb684768f-wlpl8" event={"ID":"994b4766-5a87-44ae-b271-f16a3be4fda0","Type":"ContainerDied","Data":"401b96609dc8783cb66cb086019d1922425f49f14cc9bffc456fd7c95238a026"} Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.672424 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb684768f-wlpl8" event={"ID":"994b4766-5a87-44ae-b271-f16a3be4fda0","Type":"ContainerStarted","Data":"5aba09a2ca6f3a6f60e7c18e4f45334529029f8eca1484896a33d9404a7b35ce"} Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.678217 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7854d7cd94-r9cm7" event={"ID":"8839acb4-5db9-4b47-a075-8798d8a01c6b","Type":"ContainerStarted","Data":"f52ae7a3134779398162e2cc2833f6ac6a66739dae8079844ccb344871f07b8a"} Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.679544 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-69f4d99ff7-gmlhl" event={"ID":"52aa4dd5-c8c1-4de9-b7e1-9c17fb6ae782","Type":"ContainerStarted","Data":"6bf46e026e5abc132b665eb31417655ab9af52f70306a4f98061d935daaa2c93"} Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.713798 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5d8b6458bb-fc8lw" event={"ID":"485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2","Type":"ContainerStarted","Data":"1b3005990fb52a6d7e54c28e9d103e42d1709f72de49da9ca04b2c6d3bc968c5"} Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.713842 4773 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/barbican-api-5d8b6458bb-fc8lw" event={"ID":"485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2","Type":"ContainerStarted","Data":"1ec65c4d727460ec393332a2c46d1ecf2c32fcbc331c29099e6ae31f70e9f10d"} Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.714950 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5d8b6458bb-fc8lw" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.714974 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-5d8b6458bb-fc8lw" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.747232 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-5d8b6458bb-fc8lw" podStartSLOduration=2.747210794 podStartE2EDuration="2.747210794s" podCreationTimestamp="2026-01-20 18:49:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:49:40.738471725 +0000 UTC m=+1173.660284759" watchObservedRunningTime="2026-01-20 18:49:40.747210794 +0000 UTC m=+1173.669023828" Jan 20 18:49:40 crc kubenswrapper[4773]: I0120 18:49:40.883106 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 20 18:49:40 crc kubenswrapper[4773]: W0120 18:49:40.904158 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddfe8dc2b_eac6_4606_9b32_848e3a273eef.slice/crio-b33bd0d0bb49059043d7ae03c783cf24e29ed3294a97bf332d9d93097e591133 WatchSource:0}: Error finding container b33bd0d0bb49059043d7ae03c783cf24e29ed3294a97bf332d9d93097e591133: Status 404 returned error can't find the container with id b33bd0d0bb49059043d7ae03c783cf24e29ed3294a97bf332d9d93097e591133 Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.352014 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/dnsmasq-dns-6d97fcdd8f-fjqmj"] Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.399272 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.488649 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bb684768f-wlpl8" Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.500073 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.577668 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/077faa57-a75d-4f1a-b01e-3fc69ddb5761-config-data\") pod \"077faa57-a75d-4f1a-b01e-3fc69ddb5761\" (UID: \"077faa57-a75d-4f1a-b01e-3fc69ddb5761\") " Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.577744 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9mrxd\" (UniqueName: \"kubernetes.io/projected/077faa57-a75d-4f1a-b01e-3fc69ddb5761-kube-api-access-9mrxd\") pod \"077faa57-a75d-4f1a-b01e-3fc69ddb5761\" (UID: \"077faa57-a75d-4f1a-b01e-3fc69ddb5761\") " Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.577779 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/077faa57-a75d-4f1a-b01e-3fc69ddb5761-sg-core-conf-yaml\") pod \"077faa57-a75d-4f1a-b01e-3fc69ddb5761\" (UID: \"077faa57-a75d-4f1a-b01e-3fc69ddb5761\") " Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.577878 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4vlwq\" (UniqueName: \"kubernetes.io/projected/994b4766-5a87-44ae-b271-f16a3be4fda0-kube-api-access-4vlwq\") pod \"994b4766-5a87-44ae-b271-f16a3be4fda0\" (UID: \"994b4766-5a87-44ae-b271-f16a3be4fda0\") " 
Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.577979 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/077faa57-a75d-4f1a-b01e-3fc69ddb5761-log-httpd\") pod \"077faa57-a75d-4f1a-b01e-3fc69ddb5761\" (UID: \"077faa57-a75d-4f1a-b01e-3fc69ddb5761\") " Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.578030 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/077faa57-a75d-4f1a-b01e-3fc69ddb5761-run-httpd\") pod \"077faa57-a75d-4f1a-b01e-3fc69ddb5761\" (UID: \"077faa57-a75d-4f1a-b01e-3fc69ddb5761\") " Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.578057 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/077faa57-a75d-4f1a-b01e-3fc69ddb5761-scripts\") pod \"077faa57-a75d-4f1a-b01e-3fc69ddb5761\" (UID: \"077faa57-a75d-4f1a-b01e-3fc69ddb5761\") " Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.578116 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/077faa57-a75d-4f1a-b01e-3fc69ddb5761-combined-ca-bundle\") pod \"077faa57-a75d-4f1a-b01e-3fc69ddb5761\" (UID: \"077faa57-a75d-4f1a-b01e-3fc69ddb5761\") " Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.578153 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/994b4766-5a87-44ae-b271-f16a3be4fda0-ovsdbserver-nb\") pod \"994b4766-5a87-44ae-b271-f16a3be4fda0\" (UID: \"994b4766-5a87-44ae-b271-f16a3be4fda0\") " Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.578170 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/994b4766-5a87-44ae-b271-f16a3be4fda0-ovsdbserver-sb\") 
pod \"994b4766-5a87-44ae-b271-f16a3be4fda0\" (UID: \"994b4766-5a87-44ae-b271-f16a3be4fda0\") " Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.578196 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/994b4766-5a87-44ae-b271-f16a3be4fda0-dns-svc\") pod \"994b4766-5a87-44ae-b271-f16a3be4fda0\" (UID: \"994b4766-5a87-44ae-b271-f16a3be4fda0\") " Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.578214 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/994b4766-5a87-44ae-b271-f16a3be4fda0-config\") pod \"994b4766-5a87-44ae-b271-f16a3be4fda0\" (UID: \"994b4766-5a87-44ae-b271-f16a3be4fda0\") " Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.578844 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/077faa57-a75d-4f1a-b01e-3fc69ddb5761-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "077faa57-a75d-4f1a-b01e-3fc69ddb5761" (UID: "077faa57-a75d-4f1a-b01e-3fc69ddb5761"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.580809 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/077faa57-a75d-4f1a-b01e-3fc69ddb5761-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "077faa57-a75d-4f1a-b01e-3fc69ddb5761" (UID: "077faa57-a75d-4f1a-b01e-3fc69ddb5761"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.582646 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/077faa57-a75d-4f1a-b01e-3fc69ddb5761-kube-api-access-9mrxd" (OuterVolumeSpecName: "kube-api-access-9mrxd") pod "077faa57-a75d-4f1a-b01e-3fc69ddb5761" (UID: "077faa57-a75d-4f1a-b01e-3fc69ddb5761"). 
InnerVolumeSpecName "kube-api-access-9mrxd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.584411 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/994b4766-5a87-44ae-b271-f16a3be4fda0-kube-api-access-4vlwq" (OuterVolumeSpecName: "kube-api-access-4vlwq") pod "994b4766-5a87-44ae-b271-f16a3be4fda0" (UID: "994b4766-5a87-44ae-b271-f16a3be4fda0"). InnerVolumeSpecName "kube-api-access-4vlwq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.587142 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/077faa57-a75d-4f1a-b01e-3fc69ddb5761-scripts" (OuterVolumeSpecName: "scripts") pod "077faa57-a75d-4f1a-b01e-3fc69ddb5761" (UID: "077faa57-a75d-4f1a-b01e-3fc69ddb5761"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.604576 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/994b4766-5a87-44ae-b271-f16a3be4fda0-config" (OuterVolumeSpecName: "config") pod "994b4766-5a87-44ae-b271-f16a3be4fda0" (UID: "994b4766-5a87-44ae-b271-f16a3be4fda0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.605399 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/077faa57-a75d-4f1a-b01e-3fc69ddb5761-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "077faa57-a75d-4f1a-b01e-3fc69ddb5761" (UID: "077faa57-a75d-4f1a-b01e-3fc69ddb5761"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.607572 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/994b4766-5a87-44ae-b271-f16a3be4fda0-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "994b4766-5a87-44ae-b271-f16a3be4fda0" (UID: "994b4766-5a87-44ae-b271-f16a3be4fda0"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.610301 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/994b4766-5a87-44ae-b271-f16a3be4fda0-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "994b4766-5a87-44ae-b271-f16a3be4fda0" (UID: "994b4766-5a87-44ae-b271-f16a3be4fda0"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.612653 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/994b4766-5a87-44ae-b271-f16a3be4fda0-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "994b4766-5a87-44ae-b271-f16a3be4fda0" (UID: "994b4766-5a87-44ae-b271-f16a3be4fda0"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.634448 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/077faa57-a75d-4f1a-b01e-3fc69ddb5761-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "077faa57-a75d-4f1a-b01e-3fc69ddb5761" (UID: "077faa57-a75d-4f1a-b01e-3fc69ddb5761"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.656922 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/077faa57-a75d-4f1a-b01e-3fc69ddb5761-config-data" (OuterVolumeSpecName: "config-data") pod "077faa57-a75d-4f1a-b01e-3fc69ddb5761" (UID: "077faa57-a75d-4f1a-b01e-3fc69ddb5761"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.680682 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4vlwq\" (UniqueName: \"kubernetes.io/projected/994b4766-5a87-44ae-b271-f16a3be4fda0-kube-api-access-4vlwq\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.681774 4773 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/077faa57-a75d-4f1a-b01e-3fc69ddb5761-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.681904 4773 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/077faa57-a75d-4f1a-b01e-3fc69ddb5761-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.682003 4773 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/077faa57-a75d-4f1a-b01e-3fc69ddb5761-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.682071 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/077faa57-a75d-4f1a-b01e-3fc69ddb5761-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.682133 4773 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/994b4766-5a87-44ae-b271-f16a3be4fda0-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.682193 4773 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/994b4766-5a87-44ae-b271-f16a3be4fda0-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.682281 4773 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/994b4766-5a87-44ae-b271-f16a3be4fda0-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.682347 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/994b4766-5a87-44ae-b271-f16a3be4fda0-config\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.682415 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/077faa57-a75d-4f1a-b01e-3fc69ddb5761-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.682471 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9mrxd\" (UniqueName: \"kubernetes.io/projected/077faa57-a75d-4f1a-b01e-3fc69ddb5761-kube-api-access-9mrxd\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.682550 4773 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/077faa57-a75d-4f1a-b01e-3fc69ddb5761-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.723948 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" 
event={"ID":"dfe8dc2b-eac6-4606-9b32-848e3a273eef","Type":"ContainerStarted","Data":"b33bd0d0bb49059043d7ae03c783cf24e29ed3294a97bf332d9d93097e591133"} Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.726396 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5d8b6458bb-fc8lw" event={"ID":"485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2","Type":"ContainerStarted","Data":"165fff499b13269da6c835d3382e1e806f1797529e27322b787f786ebed87645"} Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.728831 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6bb684768f-wlpl8" Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.728824 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6bb684768f-wlpl8" event={"ID":"994b4766-5a87-44ae-b271-f16a3be4fda0","Type":"ContainerDied","Data":"5aba09a2ca6f3a6f60e7c18e4f45334529029f8eca1484896a33d9404a7b35ce"} Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.729009 4773 scope.go:117] "RemoveContainer" containerID="401b96609dc8783cb66cb086019d1922425f49f14cc9bffc456fd7c95238a026" Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.731439 4773 generic.go:334] "Generic (PLEG): container finished" podID="077faa57-a75d-4f1a-b01e-3fc69ddb5761" containerID="2e60f5b100091f51d40096ed81ead834e3a8b767169abe81d9bad772de6d4ab6" exitCode=0 Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.731551 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"077faa57-a75d-4f1a-b01e-3fc69ddb5761","Type":"ContainerDied","Data":"2e60f5b100091f51d40096ed81ead834e3a8b767169abe81d9bad772de6d4ab6"} Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.731592 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"077faa57-a75d-4f1a-b01e-3fc69ddb5761","Type":"ContainerDied","Data":"d0e8ddb6dbdcbfbf1e26c6a891d80cf5f965501af473ae07fda9dcc295cac646"} 
Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.731516 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.733394 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d97fcdd8f-fjqmj" event={"ID":"93029dbe-6bb4-45aa-a72a-13e4ffc2537e","Type":"ContainerStarted","Data":"61e0e72231f57915e14eb84dde43eba2a10986a7eb1f548177c2131ae5e71eff"} Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.830911 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.853671 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.876104 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 20 18:49:41 crc kubenswrapper[4773]: E0120 18:49:41.876526 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="077faa57-a75d-4f1a-b01e-3fc69ddb5761" containerName="sg-core" Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.876545 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="077faa57-a75d-4f1a-b01e-3fc69ddb5761" containerName="sg-core" Jan 20 18:49:41 crc kubenswrapper[4773]: E0120 18:49:41.876576 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="077faa57-a75d-4f1a-b01e-3fc69ddb5761" containerName="ceilometer-notification-agent" Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.876583 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="077faa57-a75d-4f1a-b01e-3fc69ddb5761" containerName="ceilometer-notification-agent" Jan 20 18:49:41 crc kubenswrapper[4773]: E0120 18:49:41.876598 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="994b4766-5a87-44ae-b271-f16a3be4fda0" containerName="init" Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.876604 4773 
state_mem.go:107] "Deleted CPUSet assignment" podUID="994b4766-5a87-44ae-b271-f16a3be4fda0" containerName="init" Jan 20 18:49:41 crc kubenswrapper[4773]: E0120 18:49:41.876621 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="077faa57-a75d-4f1a-b01e-3fc69ddb5761" containerName="proxy-httpd" Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.876627 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="077faa57-a75d-4f1a-b01e-3fc69ddb5761" containerName="proxy-httpd" Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.876804 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="077faa57-a75d-4f1a-b01e-3fc69ddb5761" containerName="proxy-httpd" Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.876820 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="077faa57-a75d-4f1a-b01e-3fc69ddb5761" containerName="sg-core" Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.876827 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="994b4766-5a87-44ae-b271-f16a3be4fda0" containerName="init" Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.876840 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="077faa57-a75d-4f1a-b01e-3fc69ddb5761" containerName="ceilometer-notification-agent" Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.880678 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.886157 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.886429 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.889745 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6bb684768f-wlpl8"] Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.897162 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6bb684768f-wlpl8"] Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.903681 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.989189 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f5814cea-a704-4de4-9205-d65cde58c777-log-httpd\") pod \"ceilometer-0\" (UID: \"f5814cea-a704-4de4-9205-d65cde58c777\") " pod="openstack/ceilometer-0" Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.989243 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f5814cea-a704-4de4-9205-d65cde58c777-run-httpd\") pod \"ceilometer-0\" (UID: \"f5814cea-a704-4de4-9205-d65cde58c777\") " pod="openstack/ceilometer-0" Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.989385 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckpjr\" (UniqueName: \"kubernetes.io/projected/f5814cea-a704-4de4-9205-d65cde58c777-kube-api-access-ckpjr\") pod \"ceilometer-0\" (UID: \"f5814cea-a704-4de4-9205-d65cde58c777\") " pod="openstack/ceilometer-0" Jan 20 18:49:41 
crc kubenswrapper[4773]: I0120 18:49:41.989495 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f5814cea-a704-4de4-9205-d65cde58c777-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f5814cea-a704-4de4-9205-d65cde58c777\") " pod="openstack/ceilometer-0" Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.989555 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f5814cea-a704-4de4-9205-d65cde58c777-scripts\") pod \"ceilometer-0\" (UID: \"f5814cea-a704-4de4-9205-d65cde58c777\") " pod="openstack/ceilometer-0" Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.989685 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5814cea-a704-4de4-9205-d65cde58c777-config-data\") pod \"ceilometer-0\" (UID: \"f5814cea-a704-4de4-9205-d65cde58c777\") " pod="openstack/ceilometer-0" Jan 20 18:49:41 crc kubenswrapper[4773]: I0120 18:49:41.989780 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5814cea-a704-4de4-9205-d65cde58c777-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f5814cea-a704-4de4-9205-d65cde58c777\") " pod="openstack/ceilometer-0" Jan 20 18:49:42 crc kubenswrapper[4773]: I0120 18:49:42.092181 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f5814cea-a704-4de4-9205-d65cde58c777-log-httpd\") pod \"ceilometer-0\" (UID: \"f5814cea-a704-4de4-9205-d65cde58c777\") " pod="openstack/ceilometer-0" Jan 20 18:49:42 crc kubenswrapper[4773]: I0120 18:49:42.092239 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/f5814cea-a704-4de4-9205-d65cde58c777-run-httpd\") pod \"ceilometer-0\" (UID: \"f5814cea-a704-4de4-9205-d65cde58c777\") " pod="openstack/ceilometer-0" Jan 20 18:49:42 crc kubenswrapper[4773]: I0120 18:49:42.092284 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckpjr\" (UniqueName: \"kubernetes.io/projected/f5814cea-a704-4de4-9205-d65cde58c777-kube-api-access-ckpjr\") pod \"ceilometer-0\" (UID: \"f5814cea-a704-4de4-9205-d65cde58c777\") " pod="openstack/ceilometer-0" Jan 20 18:49:42 crc kubenswrapper[4773]: I0120 18:49:42.092328 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f5814cea-a704-4de4-9205-d65cde58c777-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f5814cea-a704-4de4-9205-d65cde58c777\") " pod="openstack/ceilometer-0" Jan 20 18:49:42 crc kubenswrapper[4773]: I0120 18:49:42.092358 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f5814cea-a704-4de4-9205-d65cde58c777-scripts\") pod \"ceilometer-0\" (UID: \"f5814cea-a704-4de4-9205-d65cde58c777\") " pod="openstack/ceilometer-0" Jan 20 18:49:42 crc kubenswrapper[4773]: I0120 18:49:42.092420 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5814cea-a704-4de4-9205-d65cde58c777-config-data\") pod \"ceilometer-0\" (UID: \"f5814cea-a704-4de4-9205-d65cde58c777\") " pod="openstack/ceilometer-0" Jan 20 18:49:42 crc kubenswrapper[4773]: I0120 18:49:42.092449 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5814cea-a704-4de4-9205-d65cde58c777-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f5814cea-a704-4de4-9205-d65cde58c777\") " pod="openstack/ceilometer-0" Jan 20 18:49:42 crc kubenswrapper[4773]: 
I0120 18:49:42.093877 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f5814cea-a704-4de4-9205-d65cde58c777-log-httpd\") pod \"ceilometer-0\" (UID: \"f5814cea-a704-4de4-9205-d65cde58c777\") " pod="openstack/ceilometer-0" Jan 20 18:49:42 crc kubenswrapper[4773]: I0120 18:49:42.093911 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f5814cea-a704-4de4-9205-d65cde58c777-run-httpd\") pod \"ceilometer-0\" (UID: \"f5814cea-a704-4de4-9205-d65cde58c777\") " pod="openstack/ceilometer-0" Jan 20 18:49:42 crc kubenswrapper[4773]: I0120 18:49:42.098512 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f5814cea-a704-4de4-9205-d65cde58c777-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"f5814cea-a704-4de4-9205-d65cde58c777\") " pod="openstack/ceilometer-0" Jan 20 18:49:42 crc kubenswrapper[4773]: I0120 18:49:42.101385 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5814cea-a704-4de4-9205-d65cde58c777-config-data\") pod \"ceilometer-0\" (UID: \"f5814cea-a704-4de4-9205-d65cde58c777\") " pod="openstack/ceilometer-0" Jan 20 18:49:42 crc kubenswrapper[4773]: I0120 18:49:42.105792 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5814cea-a704-4de4-9205-d65cde58c777-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"f5814cea-a704-4de4-9205-d65cde58c777\") " pod="openstack/ceilometer-0" Jan 20 18:49:42 crc kubenswrapper[4773]: I0120 18:49:42.109631 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f5814cea-a704-4de4-9205-d65cde58c777-scripts\") pod \"ceilometer-0\" (UID: \"f5814cea-a704-4de4-9205-d65cde58c777\") " 
pod="openstack/ceilometer-0" Jan 20 18:49:42 crc kubenswrapper[4773]: I0120 18:49:42.114521 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckpjr\" (UniqueName: \"kubernetes.io/projected/f5814cea-a704-4de4-9205-d65cde58c777-kube-api-access-ckpjr\") pod \"ceilometer-0\" (UID: \"f5814cea-a704-4de4-9205-d65cde58c777\") " pod="openstack/ceilometer-0" Jan 20 18:49:42 crc kubenswrapper[4773]: I0120 18:49:42.124321 4773 scope.go:117] "RemoveContainer" containerID="3c965e43ccdac97428ce352b09d0620843cca3f25e74ef924d54e1c23f72ff4f" Jan 20 18:49:42 crc kubenswrapper[4773]: I0120 18:49:42.202855 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 20 18:49:42 crc kubenswrapper[4773]: I0120 18:49:42.627143 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Jan 20 18:49:42 crc kubenswrapper[4773]: I0120 18:49:42.743814 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"25154706-fb3d-45e9-b041-a925b21cf99e","Type":"ContainerStarted","Data":"7bc4da783a4581936a2d2574fb4406eecf3d9e58a1b1e68a3dbf5ed59a34cc62"} Jan 20 18:49:43 crc kubenswrapper[4773]: I0120 18:49:43.124630 4773 scope.go:117] "RemoveContainer" containerID="fd2c8acd5b6ce4e2c62657068d76e1f3c65f9b1f2c3ff9b693acaab84e837b21" Jan 20 18:49:43 crc kubenswrapper[4773]: I0120 18:49:43.348195 4773 scope.go:117] "RemoveContainer" containerID="2e60f5b100091f51d40096ed81ead834e3a8b767169abe81d9bad772de6d4ab6" Jan 20 18:49:43 crc kubenswrapper[4773]: I0120 18:49:43.418789 4773 scope.go:117] "RemoveContainer" containerID="3c965e43ccdac97428ce352b09d0620843cca3f25e74ef924d54e1c23f72ff4f" Jan 20 18:49:43 crc kubenswrapper[4773]: E0120 18:49:43.419478 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c965e43ccdac97428ce352b09d0620843cca3f25e74ef924d54e1c23f72ff4f\": container with ID 
starting with 3c965e43ccdac97428ce352b09d0620843cca3f25e74ef924d54e1c23f72ff4f not found: ID does not exist" containerID="3c965e43ccdac97428ce352b09d0620843cca3f25e74ef924d54e1c23f72ff4f" Jan 20 18:49:43 crc kubenswrapper[4773]: I0120 18:49:43.419523 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c965e43ccdac97428ce352b09d0620843cca3f25e74ef924d54e1c23f72ff4f"} err="failed to get container status \"3c965e43ccdac97428ce352b09d0620843cca3f25e74ef924d54e1c23f72ff4f\": rpc error: code = NotFound desc = could not find container \"3c965e43ccdac97428ce352b09d0620843cca3f25e74ef924d54e1c23f72ff4f\": container with ID starting with 3c965e43ccdac97428ce352b09d0620843cca3f25e74ef924d54e1c23f72ff4f not found: ID does not exist" Jan 20 18:49:43 crc kubenswrapper[4773]: I0120 18:49:43.419549 4773 scope.go:117] "RemoveContainer" containerID="fd2c8acd5b6ce4e2c62657068d76e1f3c65f9b1f2c3ff9b693acaab84e837b21" Jan 20 18:49:43 crc kubenswrapper[4773]: E0120 18:49:43.419847 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd2c8acd5b6ce4e2c62657068d76e1f3c65f9b1f2c3ff9b693acaab84e837b21\": container with ID starting with fd2c8acd5b6ce4e2c62657068d76e1f3c65f9b1f2c3ff9b693acaab84e837b21 not found: ID does not exist" containerID="fd2c8acd5b6ce4e2c62657068d76e1f3c65f9b1f2c3ff9b693acaab84e837b21" Jan 20 18:49:43 crc kubenswrapper[4773]: I0120 18:49:43.419880 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd2c8acd5b6ce4e2c62657068d76e1f3c65f9b1f2c3ff9b693acaab84e837b21"} err="failed to get container status \"fd2c8acd5b6ce4e2c62657068d76e1f3c65f9b1f2c3ff9b693acaab84e837b21\": rpc error: code = NotFound desc = could not find container \"fd2c8acd5b6ce4e2c62657068d76e1f3c65f9b1f2c3ff9b693acaab84e837b21\": container with ID starting with fd2c8acd5b6ce4e2c62657068d76e1f3c65f9b1f2c3ff9b693acaab84e837b21 not found: 
ID does not exist" Jan 20 18:49:43 crc kubenswrapper[4773]: I0120 18:49:43.419902 4773 scope.go:117] "RemoveContainer" containerID="2e60f5b100091f51d40096ed81ead834e3a8b767169abe81d9bad772de6d4ab6" Jan 20 18:49:43 crc kubenswrapper[4773]: E0120 18:49:43.422103 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e60f5b100091f51d40096ed81ead834e3a8b767169abe81d9bad772de6d4ab6\": container with ID starting with 2e60f5b100091f51d40096ed81ead834e3a8b767169abe81d9bad772de6d4ab6 not found: ID does not exist" containerID="2e60f5b100091f51d40096ed81ead834e3a8b767169abe81d9bad772de6d4ab6" Jan 20 18:49:43 crc kubenswrapper[4773]: I0120 18:49:43.422128 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e60f5b100091f51d40096ed81ead834e3a8b767169abe81d9bad772de6d4ab6"} err="failed to get container status \"2e60f5b100091f51d40096ed81ead834e3a8b767169abe81d9bad772de6d4ab6\": rpc error: code = NotFound desc = could not find container \"2e60f5b100091f51d40096ed81ead834e3a8b767169abe81d9bad772de6d4ab6\": container with ID starting with 2e60f5b100091f51d40096ed81ead834e3a8b767169abe81d9bad772de6d4ab6 not found: ID does not exist" Jan 20 18:49:43 crc kubenswrapper[4773]: I0120 18:49:43.459703 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="077faa57-a75d-4f1a-b01e-3fc69ddb5761" path="/var/lib/kubelet/pods/077faa57-a75d-4f1a-b01e-3fc69ddb5761/volumes" Jan 20 18:49:43 crc kubenswrapper[4773]: I0120 18:49:43.460548 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="994b4766-5a87-44ae-b271-f16a3be4fda0" path="/var/lib/kubelet/pods/994b4766-5a87-44ae-b271-f16a3be4fda0/volumes" Jan 20 18:49:43 crc kubenswrapper[4773]: I0120 18:49:43.593168 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 20 18:49:43 crc kubenswrapper[4773]: W0120 18:49:43.614178 4773 manager.go:1169] Failed to 
process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5814cea_a704_4de4_9205_d65cde58c777.slice/crio-88338d24506cf39cc9754f352bf93432e909359610b5305d799ad52d2dc0901c WatchSource:0}: Error finding container 88338d24506cf39cc9754f352bf93432e909359610b5305d799ad52d2dc0901c: Status 404 returned error can't find the container with id 88338d24506cf39cc9754f352bf93432e909359610b5305d799ad52d2dc0901c Jan 20 18:49:43 crc kubenswrapper[4773]: I0120 18:49:43.766233 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f5814cea-a704-4de4-9205-d65cde58c777","Type":"ContainerStarted","Data":"88338d24506cf39cc9754f352bf93432e909359610b5305d799ad52d2dc0901c"} Jan 20 18:49:43 crc kubenswrapper[4773]: I0120 18:49:43.772966 4773 generic.go:334] "Generic (PLEG): container finished" podID="93029dbe-6bb4-45aa-a72a-13e4ffc2537e" containerID="5b0075b2e498f8ecabb538d3ba4a3dfe5c7da84ee028f9b1a8729de23849abd7" exitCode=0 Jan 20 18:49:43 crc kubenswrapper[4773]: I0120 18:49:43.773026 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d97fcdd8f-fjqmj" event={"ID":"93029dbe-6bb4-45aa-a72a-13e4ffc2537e","Type":"ContainerDied","Data":"5b0075b2e498f8ecabb538d3ba4a3dfe5c7da84ee028f9b1a8729de23849abd7"} Jan 20 18:49:43 crc kubenswrapper[4773]: I0120 18:49:43.781264 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7854d7cd94-r9cm7" event={"ID":"8839acb4-5db9-4b47-a075-8798d8a01c6b","Type":"ContainerStarted","Data":"0751dc63fe825a3449a0eceb64da4673614dadccbb05d1a7ccd9706d5c0fd15f"} Jan 20 18:49:43 crc kubenswrapper[4773]: I0120 18:49:43.781312 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-7854d7cd94-r9cm7" event={"ID":"8839acb4-5db9-4b47-a075-8798d8a01c6b","Type":"ContainerStarted","Data":"01410284b151c46472fe04706a119e67d310c83c3c265ae4de03940b556a1357"} Jan 20 18:49:43 crc 
kubenswrapper[4773]: I0120 18:49:43.794705 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-69f4d99ff7-gmlhl" event={"ID":"52aa4dd5-c8c1-4de9-b7e1-9c17fb6ae782","Type":"ContainerStarted","Data":"ce9ee5db281cb21f70d3c2098c9e982b9c7d1d729753eb35415a8a3ebb63c256"} Jan 20 18:49:43 crc kubenswrapper[4773]: I0120 18:49:43.794758 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-69f4d99ff7-gmlhl" event={"ID":"52aa4dd5-c8c1-4de9-b7e1-9c17fb6ae782","Type":"ContainerStarted","Data":"7872eba03b0aed7130a58d220a89f78ab5744de8fc73b616bc3a4d5cca5c396d"} Jan 20 18:49:43 crc kubenswrapper[4773]: I0120 18:49:43.821071 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-69f4d99ff7-gmlhl" podStartSLOduration=2.674613618 podStartE2EDuration="5.821051787s" podCreationTimestamp="2026-01-20 18:49:38 +0000 UTC" firstStartedPulling="2026-01-20 18:49:39.978544035 +0000 UTC m=+1172.900357059" lastFinishedPulling="2026-01-20 18:49:43.124982204 +0000 UTC m=+1176.046795228" observedRunningTime="2026-01-20 18:49:43.810505545 +0000 UTC m=+1176.732318569" watchObservedRunningTime="2026-01-20 18:49:43.821051787 +0000 UTC m=+1176.742864811" Jan 20 18:49:43 crc kubenswrapper[4773]: I0120 18:49:43.841610 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-7854d7cd94-r9cm7" podStartSLOduration=2.610208797 podStartE2EDuration="5.841590288s" podCreationTimestamp="2026-01-20 18:49:38 +0000 UTC" firstStartedPulling="2026-01-20 18:49:39.907528587 +0000 UTC m=+1172.829341611" lastFinishedPulling="2026-01-20 18:49:43.138910078 +0000 UTC m=+1176.060723102" observedRunningTime="2026-01-20 18:49:43.830655327 +0000 UTC m=+1176.752468361" watchObservedRunningTime="2026-01-20 18:49:43.841590288 +0000 UTC m=+1176.763403312" Jan 20 18:49:44 crc kubenswrapper[4773]: I0120 18:49:44.806261 4773 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/ceilometer-0" event={"ID":"f5814cea-a704-4de4-9205-d65cde58c777","Type":"ContainerStarted","Data":"4dfec247bd1a7c1b2b638007722809007c65536e0d65164425d8cbabf5efbd77"} Jan 20 18:49:44 crc kubenswrapper[4773]: I0120 18:49:44.808943 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"25154706-fb3d-45e9-b041-a925b21cf99e","Type":"ContainerStarted","Data":"3f943e097fa810b3897bd9ae41c4035dc93cd244494d95920a4c04a210f1040c"} Jan 20 18:49:44 crc kubenswrapper[4773]: I0120 18:49:44.808986 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"25154706-fb3d-45e9-b041-a925b21cf99e","Type":"ContainerStarted","Data":"5feaddc7a80a9cd682823dab4b45ded5d8506dd50a192b4ddd215f92f9af4fe4"} Jan 20 18:49:44 crc kubenswrapper[4773]: I0120 18:49:44.810302 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"dfe8dc2b-eac6-4606-9b32-848e3a273eef","Type":"ContainerStarted","Data":"68c83c6be4036bcce359a81115c68f082df5c191d32d4f4fd5fd81b5476cc0b7"} Jan 20 18:49:44 crc kubenswrapper[4773]: I0120 18:49:44.812088 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d97fcdd8f-fjqmj" event={"ID":"93029dbe-6bb4-45aa-a72a-13e4ffc2537e","Type":"ContainerStarted","Data":"9e423ba2df684bfe239dce19b30ffd19312d8fa87386a3c9a43e0fb199ccc323"} Jan 20 18:49:44 crc kubenswrapper[4773]: I0120 18:49:44.813364 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6d97fcdd8f-fjqmj" Jan 20 18:49:44 crc kubenswrapper[4773]: I0120 18:49:44.832404 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6d97fcdd8f-fjqmj" podStartSLOduration=4.832380498 podStartE2EDuration="4.832380498s" podCreationTimestamp="2026-01-20 18:49:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-01-20 18:49:44.829594711 +0000 UTC m=+1177.751407745" watchObservedRunningTime="2026-01-20 18:49:44.832380498 +0000 UTC m=+1177.754193522" Jan 20 18:49:45 crc kubenswrapper[4773]: I0120 18:49:45.845250 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"dfe8dc2b-eac6-4606-9b32-848e3a273eef","Type":"ContainerStarted","Data":"2547f7957a36ac52b73e0306b0c716ec107a69cbf1dfb1609ce1a97df9bb6d4b"} Jan 20 18:49:45 crc kubenswrapper[4773]: I0120 18:49:45.856818 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f5814cea-a704-4de4-9205-d65cde58c777","Type":"ContainerStarted","Data":"7a3ab5105ad590a6668a70d5db1e361f420e137f18ebd3b24303e69e5a972b7e"} Jan 20 18:49:45 crc kubenswrapper[4773]: I0120 18:49:45.856974 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="25154706-fb3d-45e9-b041-a925b21cf99e" containerName="cinder-api-log" containerID="cri-o://5feaddc7a80a9cd682823dab4b45ded5d8506dd50a192b4ddd215f92f9af4fe4" gracePeriod=30 Jan 20 18:49:45 crc kubenswrapper[4773]: I0120 18:49:45.857088 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Jan 20 18:49:45 crc kubenswrapper[4773]: I0120 18:49:45.857133 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="25154706-fb3d-45e9-b041-a925b21cf99e" containerName="cinder-api" containerID="cri-o://3f943e097fa810b3897bd9ae41c4035dc93cd244494d95920a4c04a210f1040c" gracePeriod=30 Jan 20 18:49:45 crc kubenswrapper[4773]: I0120 18:49:45.878739 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-7574cb8f94-wwkgd"] Jan 20 18:49:45 crc kubenswrapper[4773]: I0120 18:49:45.886579 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-7574cb8f94-wwkgd" Jan 20 18:49:45 crc kubenswrapper[4773]: I0120 18:49:45.891048 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Jan 20 18:49:45 crc kubenswrapper[4773]: I0120 18:49:45.891691 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Jan 20 18:49:45 crc kubenswrapper[4773]: I0120 18:49:45.894725 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.453429887 podStartE2EDuration="6.894707007s" podCreationTimestamp="2026-01-20 18:49:39 +0000 UTC" firstStartedPulling="2026-01-20 18:49:40.906907712 +0000 UTC m=+1173.828720736" lastFinishedPulling="2026-01-20 18:49:43.348184832 +0000 UTC m=+1176.269997856" observedRunningTime="2026-01-20 18:49:45.875802715 +0000 UTC m=+1178.797615749" watchObservedRunningTime="2026-01-20 18:49:45.894707007 +0000 UTC m=+1178.816520031" Jan 20 18:49:45 crc kubenswrapper[4773]: I0120 18:49:45.923450 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=5.923424164 podStartE2EDuration="5.923424164s" podCreationTimestamp="2026-01-20 18:49:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:49:45.895385854 +0000 UTC m=+1178.817198878" watchObservedRunningTime="2026-01-20 18:49:45.923424164 +0000 UTC m=+1178.845237188" Jan 20 18:49:45 crc kubenswrapper[4773]: I0120 18:49:45.928073 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7574cb8f94-wwkgd"] Jan 20 18:49:45 crc kubenswrapper[4773]: I0120 18:49:45.982421 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7dmfd\" (UniqueName: 
\"kubernetes.io/projected/436dcd32-51a0-4a9e-8a0a-fb852a5de1f0-kube-api-access-7dmfd\") pod \"barbican-api-7574cb8f94-wwkgd\" (UID: \"436dcd32-51a0-4a9e-8a0a-fb852a5de1f0\") " pod="openstack/barbican-api-7574cb8f94-wwkgd" Jan 20 18:49:45 crc kubenswrapper[4773]: I0120 18:49:45.982500 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/436dcd32-51a0-4a9e-8a0a-fb852a5de1f0-internal-tls-certs\") pod \"barbican-api-7574cb8f94-wwkgd\" (UID: \"436dcd32-51a0-4a9e-8a0a-fb852a5de1f0\") " pod="openstack/barbican-api-7574cb8f94-wwkgd" Jan 20 18:49:45 crc kubenswrapper[4773]: I0120 18:49:45.982571 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/436dcd32-51a0-4a9e-8a0a-fb852a5de1f0-logs\") pod \"barbican-api-7574cb8f94-wwkgd\" (UID: \"436dcd32-51a0-4a9e-8a0a-fb852a5de1f0\") " pod="openstack/barbican-api-7574cb8f94-wwkgd" Jan 20 18:49:45 crc kubenswrapper[4773]: I0120 18:49:45.982622 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/436dcd32-51a0-4a9e-8a0a-fb852a5de1f0-combined-ca-bundle\") pod \"barbican-api-7574cb8f94-wwkgd\" (UID: \"436dcd32-51a0-4a9e-8a0a-fb852a5de1f0\") " pod="openstack/barbican-api-7574cb8f94-wwkgd" Jan 20 18:49:45 crc kubenswrapper[4773]: I0120 18:49:45.982661 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/436dcd32-51a0-4a9e-8a0a-fb852a5de1f0-public-tls-certs\") pod \"barbican-api-7574cb8f94-wwkgd\" (UID: \"436dcd32-51a0-4a9e-8a0a-fb852a5de1f0\") " pod="openstack/barbican-api-7574cb8f94-wwkgd" Jan 20 18:49:45 crc kubenswrapper[4773]: I0120 18:49:45.982682 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/436dcd32-51a0-4a9e-8a0a-fb852a5de1f0-config-data-custom\") pod \"barbican-api-7574cb8f94-wwkgd\" (UID: \"436dcd32-51a0-4a9e-8a0a-fb852a5de1f0\") " pod="openstack/barbican-api-7574cb8f94-wwkgd" Jan 20 18:49:45 crc kubenswrapper[4773]: I0120 18:49:45.982718 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/436dcd32-51a0-4a9e-8a0a-fb852a5de1f0-config-data\") pod \"barbican-api-7574cb8f94-wwkgd\" (UID: \"436dcd32-51a0-4a9e-8a0a-fb852a5de1f0\") " pod="openstack/barbican-api-7574cb8f94-wwkgd" Jan 20 18:49:46 crc kubenswrapper[4773]: I0120 18:49:46.084682 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/436dcd32-51a0-4a9e-8a0a-fb852a5de1f0-config-data-custom\") pod \"barbican-api-7574cb8f94-wwkgd\" (UID: \"436dcd32-51a0-4a9e-8a0a-fb852a5de1f0\") " pod="openstack/barbican-api-7574cb8f94-wwkgd" Jan 20 18:49:46 crc kubenswrapper[4773]: I0120 18:49:46.084746 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/436dcd32-51a0-4a9e-8a0a-fb852a5de1f0-config-data\") pod \"barbican-api-7574cb8f94-wwkgd\" (UID: \"436dcd32-51a0-4a9e-8a0a-fb852a5de1f0\") " pod="openstack/barbican-api-7574cb8f94-wwkgd" Jan 20 18:49:46 crc kubenswrapper[4773]: I0120 18:49:46.084835 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7dmfd\" (UniqueName: \"kubernetes.io/projected/436dcd32-51a0-4a9e-8a0a-fb852a5de1f0-kube-api-access-7dmfd\") pod \"barbican-api-7574cb8f94-wwkgd\" (UID: \"436dcd32-51a0-4a9e-8a0a-fb852a5de1f0\") " pod="openstack/barbican-api-7574cb8f94-wwkgd" Jan 20 18:49:46 crc kubenswrapper[4773]: I0120 18:49:46.084896 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/436dcd32-51a0-4a9e-8a0a-fb852a5de1f0-internal-tls-certs\") pod \"barbican-api-7574cb8f94-wwkgd\" (UID: \"436dcd32-51a0-4a9e-8a0a-fb852a5de1f0\") " pod="openstack/barbican-api-7574cb8f94-wwkgd" Jan 20 18:49:46 crc kubenswrapper[4773]: I0120 18:49:46.084964 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/436dcd32-51a0-4a9e-8a0a-fb852a5de1f0-logs\") pod \"barbican-api-7574cb8f94-wwkgd\" (UID: \"436dcd32-51a0-4a9e-8a0a-fb852a5de1f0\") " pod="openstack/barbican-api-7574cb8f94-wwkgd" Jan 20 18:49:46 crc kubenswrapper[4773]: I0120 18:49:46.085004 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/436dcd32-51a0-4a9e-8a0a-fb852a5de1f0-combined-ca-bundle\") pod \"barbican-api-7574cb8f94-wwkgd\" (UID: \"436dcd32-51a0-4a9e-8a0a-fb852a5de1f0\") " pod="openstack/barbican-api-7574cb8f94-wwkgd" Jan 20 18:49:46 crc kubenswrapper[4773]: I0120 18:49:46.085032 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/436dcd32-51a0-4a9e-8a0a-fb852a5de1f0-public-tls-certs\") pod \"barbican-api-7574cb8f94-wwkgd\" (UID: \"436dcd32-51a0-4a9e-8a0a-fb852a5de1f0\") " pod="openstack/barbican-api-7574cb8f94-wwkgd" Jan 20 18:49:46 crc kubenswrapper[4773]: I0120 18:49:46.085922 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/436dcd32-51a0-4a9e-8a0a-fb852a5de1f0-logs\") pod \"barbican-api-7574cb8f94-wwkgd\" (UID: \"436dcd32-51a0-4a9e-8a0a-fb852a5de1f0\") " pod="openstack/barbican-api-7574cb8f94-wwkgd" Jan 20 18:49:46 crc kubenswrapper[4773]: I0120 18:49:46.090337 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/436dcd32-51a0-4a9e-8a0a-fb852a5de1f0-config-data-custom\") pod \"barbican-api-7574cb8f94-wwkgd\" (UID: \"436dcd32-51a0-4a9e-8a0a-fb852a5de1f0\") " pod="openstack/barbican-api-7574cb8f94-wwkgd" Jan 20 18:49:46 crc kubenswrapper[4773]: I0120 18:49:46.091122 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/436dcd32-51a0-4a9e-8a0a-fb852a5de1f0-internal-tls-certs\") pod \"barbican-api-7574cb8f94-wwkgd\" (UID: \"436dcd32-51a0-4a9e-8a0a-fb852a5de1f0\") " pod="openstack/barbican-api-7574cb8f94-wwkgd" Jan 20 18:49:46 crc kubenswrapper[4773]: I0120 18:49:46.091470 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/436dcd32-51a0-4a9e-8a0a-fb852a5de1f0-combined-ca-bundle\") pod \"barbican-api-7574cb8f94-wwkgd\" (UID: \"436dcd32-51a0-4a9e-8a0a-fb852a5de1f0\") " pod="openstack/barbican-api-7574cb8f94-wwkgd" Jan 20 18:49:46 crc kubenswrapper[4773]: I0120 18:49:46.094652 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/436dcd32-51a0-4a9e-8a0a-fb852a5de1f0-public-tls-certs\") pod \"barbican-api-7574cb8f94-wwkgd\" (UID: \"436dcd32-51a0-4a9e-8a0a-fb852a5de1f0\") " pod="openstack/barbican-api-7574cb8f94-wwkgd" Jan 20 18:49:46 crc kubenswrapper[4773]: I0120 18:49:46.095631 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/436dcd32-51a0-4a9e-8a0a-fb852a5de1f0-config-data\") pod \"barbican-api-7574cb8f94-wwkgd\" (UID: \"436dcd32-51a0-4a9e-8a0a-fb852a5de1f0\") " pod="openstack/barbican-api-7574cb8f94-wwkgd" Jan 20 18:49:46 crc kubenswrapper[4773]: I0120 18:49:46.110538 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7dmfd\" (UniqueName: 
\"kubernetes.io/projected/436dcd32-51a0-4a9e-8a0a-fb852a5de1f0-kube-api-access-7dmfd\") pod \"barbican-api-7574cb8f94-wwkgd\" (UID: \"436dcd32-51a0-4a9e-8a0a-fb852a5de1f0\") " pod="openstack/barbican-api-7574cb8f94-wwkgd" Jan 20 18:49:46 crc kubenswrapper[4773]: I0120 18:49:46.206490 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-7574cb8f94-wwkgd" Jan 20 18:49:46 crc kubenswrapper[4773]: I0120 18:49:46.685168 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-7574cb8f94-wwkgd"] Jan 20 18:49:46 crc kubenswrapper[4773]: W0120 18:49:46.688314 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod436dcd32_51a0_4a9e_8a0a_fb852a5de1f0.slice/crio-74347585b555b25673c65774dad8c78e19bdd025e2d4214c8a3275d49d8501dd WatchSource:0}: Error finding container 74347585b555b25673c65774dad8c78e19bdd025e2d4214c8a3275d49d8501dd: Status 404 returned error can't find the container with id 74347585b555b25673c65774dad8c78e19bdd025e2d4214c8a3275d49d8501dd Jan 20 18:49:46 crc kubenswrapper[4773]: I0120 18:49:46.865454 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7574cb8f94-wwkgd" event={"ID":"436dcd32-51a0-4a9e-8a0a-fb852a5de1f0","Type":"ContainerStarted","Data":"74347585b555b25673c65774dad8c78e19bdd025e2d4214c8a3275d49d8501dd"} Jan 20 18:49:46 crc kubenswrapper[4773]: I0120 18:49:46.868034 4773 generic.go:334] "Generic (PLEG): container finished" podID="25154706-fb3d-45e9-b041-a925b21cf99e" containerID="3f943e097fa810b3897bd9ae41c4035dc93cd244494d95920a4c04a210f1040c" exitCode=0 Jan 20 18:49:46 crc kubenswrapper[4773]: I0120 18:49:46.868069 4773 generic.go:334] "Generic (PLEG): container finished" podID="25154706-fb3d-45e9-b041-a925b21cf99e" containerID="5feaddc7a80a9cd682823dab4b45ded5d8506dd50a192b4ddd215f92f9af4fe4" exitCode=143 Jan 20 18:49:46 crc kubenswrapper[4773]: I0120 
18:49:46.868083 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"25154706-fb3d-45e9-b041-a925b21cf99e","Type":"ContainerDied","Data":"3f943e097fa810b3897bd9ae41c4035dc93cd244494d95920a4c04a210f1040c"} Jan 20 18:49:46 crc kubenswrapper[4773]: I0120 18:49:46.868123 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"25154706-fb3d-45e9-b041-a925b21cf99e","Type":"ContainerDied","Data":"5feaddc7a80a9cd682823dab4b45ded5d8506dd50a192b4ddd215f92f9af4fe4"} Jan 20 18:49:47 crc kubenswrapper[4773]: I0120 18:49:47.863919 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Jan 20 18:49:47 crc kubenswrapper[4773]: I0120 18:49:47.920561 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t458l\" (UniqueName: \"kubernetes.io/projected/25154706-fb3d-45e9-b041-a925b21cf99e-kube-api-access-t458l\") pod \"25154706-fb3d-45e9-b041-a925b21cf99e\" (UID: \"25154706-fb3d-45e9-b041-a925b21cf99e\") " Jan 20 18:49:47 crc kubenswrapper[4773]: I0120 18:49:47.920621 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25154706-fb3d-45e9-b041-a925b21cf99e-scripts\") pod \"25154706-fb3d-45e9-b041-a925b21cf99e\" (UID: \"25154706-fb3d-45e9-b041-a925b21cf99e\") " Jan 20 18:49:47 crc kubenswrapper[4773]: I0120 18:49:47.920697 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/25154706-fb3d-45e9-b041-a925b21cf99e-etc-machine-id\") pod \"25154706-fb3d-45e9-b041-a925b21cf99e\" (UID: \"25154706-fb3d-45e9-b041-a925b21cf99e\") " Jan 20 18:49:47 crc kubenswrapper[4773]: I0120 18:49:47.920738 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/25154706-fb3d-45e9-b041-a925b21cf99e-config-data\") pod \"25154706-fb3d-45e9-b041-a925b21cf99e\" (UID: \"25154706-fb3d-45e9-b041-a925b21cf99e\") " Jan 20 18:49:47 crc kubenswrapper[4773]: I0120 18:49:47.920847 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/25154706-fb3d-45e9-b041-a925b21cf99e-config-data-custom\") pod \"25154706-fb3d-45e9-b041-a925b21cf99e\" (UID: \"25154706-fb3d-45e9-b041-a925b21cf99e\") " Jan 20 18:49:47 crc kubenswrapper[4773]: I0120 18:49:47.920884 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/25154706-fb3d-45e9-b041-a925b21cf99e-logs\") pod \"25154706-fb3d-45e9-b041-a925b21cf99e\" (UID: \"25154706-fb3d-45e9-b041-a925b21cf99e\") " Jan 20 18:49:47 crc kubenswrapper[4773]: I0120 18:49:47.920901 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25154706-fb3d-45e9-b041-a925b21cf99e-combined-ca-bundle\") pod \"25154706-fb3d-45e9-b041-a925b21cf99e\" (UID: \"25154706-fb3d-45e9-b041-a925b21cf99e\") " Jan 20 18:49:47 crc kubenswrapper[4773]: I0120 18:49:47.926326 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/25154706-fb3d-45e9-b041-a925b21cf99e-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "25154706-fb3d-45e9-b041-a925b21cf99e" (UID: "25154706-fb3d-45e9-b041-a925b21cf99e"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 18:49:47 crc kubenswrapper[4773]: I0120 18:49:47.926815 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Jan 20 18:49:47 crc kubenswrapper[4773]: I0120 18:49:47.927279 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"25154706-fb3d-45e9-b041-a925b21cf99e","Type":"ContainerDied","Data":"7bc4da783a4581936a2d2574fb4406eecf3d9e58a1b1e68a3dbf5ed59a34cc62"} Jan 20 18:49:47 crc kubenswrapper[4773]: I0120 18:49:47.927322 4773 scope.go:117] "RemoveContainer" containerID="3f943e097fa810b3897bd9ae41c4035dc93cd244494d95920a4c04a210f1040c" Jan 20 18:49:47 crc kubenswrapper[4773]: I0120 18:49:47.927720 4773 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/25154706-fb3d-45e9-b041-a925b21cf99e-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:47 crc kubenswrapper[4773]: I0120 18:49:47.928096 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25154706-fb3d-45e9-b041-a925b21cf99e-logs" (OuterVolumeSpecName: "logs") pod "25154706-fb3d-45e9-b041-a925b21cf99e" (UID: "25154706-fb3d-45e9-b041-a925b21cf99e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:49:47 crc kubenswrapper[4773]: I0120 18:49:47.938178 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25154706-fb3d-45e9-b041-a925b21cf99e-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "25154706-fb3d-45e9-b041-a925b21cf99e" (UID: "25154706-fb3d-45e9-b041-a925b21cf99e"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:49:47 crc kubenswrapper[4773]: I0120 18:49:47.938252 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25154706-fb3d-45e9-b041-a925b21cf99e-kube-api-access-t458l" (OuterVolumeSpecName: "kube-api-access-t458l") pod "25154706-fb3d-45e9-b041-a925b21cf99e" (UID: "25154706-fb3d-45e9-b041-a925b21cf99e"). InnerVolumeSpecName "kube-api-access-t458l". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:49:47 crc kubenswrapper[4773]: I0120 18:49:47.938349 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25154706-fb3d-45e9-b041-a925b21cf99e-scripts" (OuterVolumeSpecName: "scripts") pod "25154706-fb3d-45e9-b041-a925b21cf99e" (UID: "25154706-fb3d-45e9-b041-a925b21cf99e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:49:47 crc kubenswrapper[4773]: I0120 18:49:47.938431 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7574cb8f94-wwkgd" event={"ID":"436dcd32-51a0-4a9e-8a0a-fb852a5de1f0","Type":"ContainerStarted","Data":"15c3b05a3fb5adde12caec644eef97734f0e971ada82d6d248d8ec5391ee4505"} Jan 20 18:49:47 crc kubenswrapper[4773]: I0120 18:49:47.938463 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-7574cb8f94-wwkgd" event={"ID":"436dcd32-51a0-4a9e-8a0a-fb852a5de1f0","Type":"ContainerStarted","Data":"152e8d76edca114872034ad661c3ace516bc072a3a7ac62c230215e5836ad322"} Jan 20 18:49:47 crc kubenswrapper[4773]: I0120 18:49:47.938799 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7574cb8f94-wwkgd" Jan 20 18:49:47 crc kubenswrapper[4773]: I0120 18:49:47.938838 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-7574cb8f94-wwkgd" Jan 20 18:49:47 crc kubenswrapper[4773]: I0120 18:49:47.940805 4773 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f5814cea-a704-4de4-9205-d65cde58c777","Type":"ContainerStarted","Data":"2848aa1c2648bf46c2ea04e44fc541b3472a774f4d87ca163f256a1a307e1862"} Jan 20 18:49:47 crc kubenswrapper[4773]: I0120 18:49:47.969515 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-7574cb8f94-wwkgd" podStartSLOduration=2.969494874 podStartE2EDuration="2.969494874s" podCreationTimestamp="2026-01-20 18:49:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:49:47.956716588 +0000 UTC m=+1180.878529602" watchObservedRunningTime="2026-01-20 18:49:47.969494874 +0000 UTC m=+1180.891307898" Jan 20 18:49:47 crc kubenswrapper[4773]: I0120 18:49:47.971194 4773 scope.go:117] "RemoveContainer" containerID="5feaddc7a80a9cd682823dab4b45ded5d8506dd50a192b4ddd215f92f9af4fe4" Jan 20 18:49:47 crc kubenswrapper[4773]: I0120 18:49:47.977086 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25154706-fb3d-45e9-b041-a925b21cf99e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "25154706-fb3d-45e9-b041-a925b21cf99e" (UID: "25154706-fb3d-45e9-b041-a925b21cf99e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:49:47 crc kubenswrapper[4773]: I0120 18:49:47.980883 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25154706-fb3d-45e9-b041-a925b21cf99e-config-data" (OuterVolumeSpecName: "config-data") pod "25154706-fb3d-45e9-b041-a925b21cf99e" (UID: "25154706-fb3d-45e9-b041-a925b21cf99e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:49:48 crc kubenswrapper[4773]: I0120 18:49:48.029289 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/25154706-fb3d-45e9-b041-a925b21cf99e-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:48 crc kubenswrapper[4773]: I0120 18:49:48.029327 4773 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/25154706-fb3d-45e9-b041-a925b21cf99e-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:48 crc kubenswrapper[4773]: I0120 18:49:48.029338 4773 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/25154706-fb3d-45e9-b041-a925b21cf99e-logs\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:48 crc kubenswrapper[4773]: I0120 18:49:48.029346 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/25154706-fb3d-45e9-b041-a925b21cf99e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:48 crc kubenswrapper[4773]: I0120 18:49:48.029355 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t458l\" (UniqueName: \"kubernetes.io/projected/25154706-fb3d-45e9-b041-a925b21cf99e-kube-api-access-t458l\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:48 crc kubenswrapper[4773]: I0120 18:49:48.029364 4773 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/25154706-fb3d-45e9-b041-a925b21cf99e-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:48 crc kubenswrapper[4773]: I0120 18:49:48.271801 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Jan 20 18:49:48 crc kubenswrapper[4773]: I0120 18:49:48.286009 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Jan 20 18:49:48 crc kubenswrapper[4773]: I0120 18:49:48.294954 
4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Jan 20 18:49:48 crc kubenswrapper[4773]: E0120 18:49:48.295361 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25154706-fb3d-45e9-b041-a925b21cf99e" containerName="cinder-api-log" Jan 20 18:49:48 crc kubenswrapper[4773]: I0120 18:49:48.295378 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="25154706-fb3d-45e9-b041-a925b21cf99e" containerName="cinder-api-log" Jan 20 18:49:48 crc kubenswrapper[4773]: E0120 18:49:48.295402 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="25154706-fb3d-45e9-b041-a925b21cf99e" containerName="cinder-api" Jan 20 18:49:48 crc kubenswrapper[4773]: I0120 18:49:48.295410 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="25154706-fb3d-45e9-b041-a925b21cf99e" containerName="cinder-api" Jan 20 18:49:48 crc kubenswrapper[4773]: I0120 18:49:48.295584 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="25154706-fb3d-45e9-b041-a925b21cf99e" containerName="cinder-api-log" Jan 20 18:49:48 crc kubenswrapper[4773]: I0120 18:49:48.295603 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="25154706-fb3d-45e9-b041-a925b21cf99e" containerName="cinder-api" Jan 20 18:49:48 crc kubenswrapper[4773]: I0120 18:49:48.296478 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Jan 20 18:49:48 crc kubenswrapper[4773]: I0120 18:49:48.300619 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Jan 20 18:49:48 crc kubenswrapper[4773]: I0120 18:49:48.300843 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Jan 20 18:49:48 crc kubenswrapper[4773]: I0120 18:49:48.303883 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Jan 20 18:49:48 crc kubenswrapper[4773]: I0120 18:49:48.309559 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Jan 20 18:49:48 crc kubenswrapper[4773]: I0120 18:49:48.333351 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d4d69bee-fde2-4fb6-95f6-74e35b8d5db5-logs\") pod \"cinder-api-0\" (UID: \"d4d69bee-fde2-4fb6-95f6-74e35b8d5db5\") " pod="openstack/cinder-api-0" Jan 20 18:49:48 crc kubenswrapper[4773]: I0120 18:49:48.333422 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpw6k\" (UniqueName: \"kubernetes.io/projected/d4d69bee-fde2-4fb6-95f6-74e35b8d5db5-kube-api-access-zpw6k\") pod \"cinder-api-0\" (UID: \"d4d69bee-fde2-4fb6-95f6-74e35b8d5db5\") " pod="openstack/cinder-api-0" Jan 20 18:49:48 crc kubenswrapper[4773]: I0120 18:49:48.333447 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d4d69bee-fde2-4fb6-95f6-74e35b8d5db5-config-data-custom\") pod \"cinder-api-0\" (UID: \"d4d69bee-fde2-4fb6-95f6-74e35b8d5db5\") " pod="openstack/cinder-api-0" Jan 20 18:49:48 crc kubenswrapper[4773]: I0120 18:49:48.333614 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"scripts\" (UniqueName: \"kubernetes.io/secret/d4d69bee-fde2-4fb6-95f6-74e35b8d5db5-scripts\") pod \"cinder-api-0\" (UID: \"d4d69bee-fde2-4fb6-95f6-74e35b8d5db5\") " pod="openstack/cinder-api-0" Jan 20 18:49:48 crc kubenswrapper[4773]: I0120 18:49:48.333710 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4d69bee-fde2-4fb6-95f6-74e35b8d5db5-public-tls-certs\") pod \"cinder-api-0\" (UID: \"d4d69bee-fde2-4fb6-95f6-74e35b8d5db5\") " pod="openstack/cinder-api-0" Jan 20 18:49:48 crc kubenswrapper[4773]: I0120 18:49:48.333870 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d4d69bee-fde2-4fb6-95f6-74e35b8d5db5-etc-machine-id\") pod \"cinder-api-0\" (UID: \"d4d69bee-fde2-4fb6-95f6-74e35b8d5db5\") " pod="openstack/cinder-api-0" Jan 20 18:49:48 crc kubenswrapper[4773]: I0120 18:49:48.333892 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4d69bee-fde2-4fb6-95f6-74e35b8d5db5-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"d4d69bee-fde2-4fb6-95f6-74e35b8d5db5\") " pod="openstack/cinder-api-0" Jan 20 18:49:48 crc kubenswrapper[4773]: I0120 18:49:48.333973 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4d69bee-fde2-4fb6-95f6-74e35b8d5db5-config-data\") pod \"cinder-api-0\" (UID: \"d4d69bee-fde2-4fb6-95f6-74e35b8d5db5\") " pod="openstack/cinder-api-0" Jan 20 18:49:48 crc kubenswrapper[4773]: I0120 18:49:48.334066 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4d69bee-fde2-4fb6-95f6-74e35b8d5db5-internal-tls-certs\") pod \"cinder-api-0\" 
(UID: \"d4d69bee-fde2-4fb6-95f6-74e35b8d5db5\") " pod="openstack/cinder-api-0" Jan 20 18:49:48 crc kubenswrapper[4773]: I0120 18:49:48.436097 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d4d69bee-fde2-4fb6-95f6-74e35b8d5db5-etc-machine-id\") pod \"cinder-api-0\" (UID: \"d4d69bee-fde2-4fb6-95f6-74e35b8d5db5\") " pod="openstack/cinder-api-0" Jan 20 18:49:48 crc kubenswrapper[4773]: I0120 18:49:48.436140 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4d69bee-fde2-4fb6-95f6-74e35b8d5db5-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"d4d69bee-fde2-4fb6-95f6-74e35b8d5db5\") " pod="openstack/cinder-api-0" Jan 20 18:49:48 crc kubenswrapper[4773]: I0120 18:49:48.436171 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4d69bee-fde2-4fb6-95f6-74e35b8d5db5-config-data\") pod \"cinder-api-0\" (UID: \"d4d69bee-fde2-4fb6-95f6-74e35b8d5db5\") " pod="openstack/cinder-api-0" Jan 20 18:49:48 crc kubenswrapper[4773]: I0120 18:49:48.436224 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d4d69bee-fde2-4fb6-95f6-74e35b8d5db5-etc-machine-id\") pod \"cinder-api-0\" (UID: \"d4d69bee-fde2-4fb6-95f6-74e35b8d5db5\") " pod="openstack/cinder-api-0" Jan 20 18:49:48 crc kubenswrapper[4773]: I0120 18:49:48.436249 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4d69bee-fde2-4fb6-95f6-74e35b8d5db5-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"d4d69bee-fde2-4fb6-95f6-74e35b8d5db5\") " pod="openstack/cinder-api-0" Jan 20 18:49:48 crc kubenswrapper[4773]: I0120 18:49:48.436308 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d4d69bee-fde2-4fb6-95f6-74e35b8d5db5-logs\") pod \"cinder-api-0\" (UID: \"d4d69bee-fde2-4fb6-95f6-74e35b8d5db5\") " pod="openstack/cinder-api-0" Jan 20 18:49:48 crc kubenswrapper[4773]: I0120 18:49:48.436791 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d4d69bee-fde2-4fb6-95f6-74e35b8d5db5-logs\") pod \"cinder-api-0\" (UID: \"d4d69bee-fde2-4fb6-95f6-74e35b8d5db5\") " pod="openstack/cinder-api-0" Jan 20 18:49:48 crc kubenswrapper[4773]: I0120 18:49:48.436802 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zpw6k\" (UniqueName: \"kubernetes.io/projected/d4d69bee-fde2-4fb6-95f6-74e35b8d5db5-kube-api-access-zpw6k\") pod \"cinder-api-0\" (UID: \"d4d69bee-fde2-4fb6-95f6-74e35b8d5db5\") " pod="openstack/cinder-api-0" Jan 20 18:49:48 crc kubenswrapper[4773]: I0120 18:49:48.436993 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d4d69bee-fde2-4fb6-95f6-74e35b8d5db5-config-data-custom\") pod \"cinder-api-0\" (UID: \"d4d69bee-fde2-4fb6-95f6-74e35b8d5db5\") " pod="openstack/cinder-api-0" Jan 20 18:49:48 crc kubenswrapper[4773]: I0120 18:49:48.437077 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4d69bee-fde2-4fb6-95f6-74e35b8d5db5-scripts\") pod \"cinder-api-0\" (UID: \"d4d69bee-fde2-4fb6-95f6-74e35b8d5db5\") " pod="openstack/cinder-api-0" Jan 20 18:49:48 crc kubenswrapper[4773]: I0120 18:49:48.437111 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4d69bee-fde2-4fb6-95f6-74e35b8d5db5-public-tls-certs\") pod \"cinder-api-0\" (UID: \"d4d69bee-fde2-4fb6-95f6-74e35b8d5db5\") " pod="openstack/cinder-api-0" Jan 20 18:49:48 crc 
kubenswrapper[4773]: I0120 18:49:48.440820 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4d69bee-fde2-4fb6-95f6-74e35b8d5db5-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"d4d69bee-fde2-4fb6-95f6-74e35b8d5db5\") " pod="openstack/cinder-api-0" Jan 20 18:49:48 crc kubenswrapper[4773]: I0120 18:49:48.441753 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d4d69bee-fde2-4fb6-95f6-74e35b8d5db5-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"d4d69bee-fde2-4fb6-95f6-74e35b8d5db5\") " pod="openstack/cinder-api-0" Jan 20 18:49:48 crc kubenswrapper[4773]: I0120 18:49:48.441911 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/d4d69bee-fde2-4fb6-95f6-74e35b8d5db5-public-tls-certs\") pod \"cinder-api-0\" (UID: \"d4d69bee-fde2-4fb6-95f6-74e35b8d5db5\") " pod="openstack/cinder-api-0" Jan 20 18:49:48 crc kubenswrapper[4773]: I0120 18:49:48.443053 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d4d69bee-fde2-4fb6-95f6-74e35b8d5db5-scripts\") pod \"cinder-api-0\" (UID: \"d4d69bee-fde2-4fb6-95f6-74e35b8d5db5\") " pod="openstack/cinder-api-0" Jan 20 18:49:48 crc kubenswrapper[4773]: I0120 18:49:48.443666 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d4d69bee-fde2-4fb6-95f6-74e35b8d5db5-config-data-custom\") pod \"cinder-api-0\" (UID: \"d4d69bee-fde2-4fb6-95f6-74e35b8d5db5\") " pod="openstack/cinder-api-0" Jan 20 18:49:48 crc kubenswrapper[4773]: I0120 18:49:48.443966 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d4d69bee-fde2-4fb6-95f6-74e35b8d5db5-config-data\") pod \"cinder-api-0\" (UID: 
\"d4d69bee-fde2-4fb6-95f6-74e35b8d5db5\") " pod="openstack/cinder-api-0" Jan 20 18:49:48 crc kubenswrapper[4773]: I0120 18:49:48.452326 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpw6k\" (UniqueName: \"kubernetes.io/projected/d4d69bee-fde2-4fb6-95f6-74e35b8d5db5-kube-api-access-zpw6k\") pod \"cinder-api-0\" (UID: \"d4d69bee-fde2-4fb6-95f6-74e35b8d5db5\") " pod="openstack/cinder-api-0" Jan 20 18:49:48 crc kubenswrapper[4773]: I0120 18:49:48.656843 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Jan 20 18:49:48 crc kubenswrapper[4773]: I0120 18:49:48.677855 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-9b66d8476-cqhrd" Jan 20 18:49:51 crc kubenswrapper[4773]: I0120 18:49:48.800894 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-68fb89f56b-287lx" Jan 20 18:49:51 crc kubenswrapper[4773]: I0120 18:49:48.949968 4773 generic.go:334] "Generic (PLEG): container finished" podID="69d10de9-a03e-4020-8219-25cb3d9520a5" containerID="2d5d679feaca700190bea6a965fb83e9b9e97a2c3cc8d0cf71e39aeec304eb27" exitCode=137 Jan 20 18:49:51 crc kubenswrapper[4773]: I0120 18:49:48.950001 4773 generic.go:334] "Generic (PLEG): container finished" podID="69d10de9-a03e-4020-8219-25cb3d9520a5" containerID="25c305e514555ed0d81791fef19992b8b93de8b3633a72153010568146bc67e2" exitCode=137 Jan 20 18:49:51 crc kubenswrapper[4773]: I0120 18:49:48.950037 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-586bf65fdf-tqctk" event={"ID":"69d10de9-a03e-4020-8219-25cb3d9520a5","Type":"ContainerDied","Data":"2d5d679feaca700190bea6a965fb83e9b9e97a2c3cc8d0cf71e39aeec304eb27"} Jan 20 18:49:51 crc kubenswrapper[4773]: I0120 18:49:48.950060 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-586bf65fdf-tqctk" 
event={"ID":"69d10de9-a03e-4020-8219-25cb3d9520a5","Type":"ContainerDied","Data":"25c305e514555ed0d81791fef19992b8b93de8b3633a72153010568146bc67e2"} Jan 20 18:49:51 crc kubenswrapper[4773]: I0120 18:49:49.457414 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25154706-fb3d-45e9-b041-a925b21cf99e" path="/var/lib/kubelet/pods/25154706-fb3d-45e9-b041-a925b21cf99e/volumes" Jan 20 18:49:51 crc kubenswrapper[4773]: I0120 18:49:50.340310 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Jan 20 18:49:51 crc kubenswrapper[4773]: I0120 18:49:50.433578 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-9b66d8476-cqhrd" Jan 20 18:49:51 crc kubenswrapper[4773]: I0120 18:49:50.529270 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-68fb89f56b-287lx" Jan 20 18:49:51 crc kubenswrapper[4773]: I0120 18:49:50.563687 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6d97fcdd8f-fjqmj" Jan 20 18:49:51 crc kubenswrapper[4773]: I0120 18:49:50.588956 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-9b66d8476-cqhrd"] Jan 20 18:49:51 crc kubenswrapper[4773]: I0120 18:49:50.620782 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Jan 20 18:49:51 crc kubenswrapper[4773]: I0120 18:49:50.633672 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5d8b6458bb-fc8lw" Jan 20 18:49:51 crc kubenswrapper[4773]: I0120 18:49:50.684470 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b946d459c-vd86k"] Jan 20 18:49:51 crc kubenswrapper[4773]: I0120 18:49:50.684779 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7b946d459c-vd86k" 
podUID="fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6" containerName="dnsmasq-dns" containerID="cri-o://9d2dd544dda815f8509ae7c395de9245112061e0bafdb50324cd6086b476d3dc" gracePeriod=10 Jan 20 18:49:51 crc kubenswrapper[4773]: I0120 18:49:50.970167 4773 generic.go:334] "Generic (PLEG): container finished" podID="fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6" containerID="9d2dd544dda815f8509ae7c395de9245112061e0bafdb50324cd6086b476d3dc" exitCode=0 Jan 20 18:49:51 crc kubenswrapper[4773]: I0120 18:49:50.970242 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b946d459c-vd86k" event={"ID":"fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6","Type":"ContainerDied","Data":"9d2dd544dda815f8509ae7c395de9245112061e0bafdb50324cd6086b476d3dc"} Jan 20 18:49:51 crc kubenswrapper[4773]: I0120 18:49:50.970562 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-9b66d8476-cqhrd" podUID="49df8cea-026f-497b-baae-a6a09452aa3d" containerName="horizon-log" containerID="cri-o://7ccb7f05b32cc6a8cf92a861d7cf4258f107127e8f0f1d25df715a6c2f51b909" gracePeriod=30 Jan 20 18:49:51 crc kubenswrapper[4773]: I0120 18:49:50.970582 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-9b66d8476-cqhrd" podUID="49df8cea-026f-497b-baae-a6a09452aa3d" containerName="horizon" containerID="cri-o://aab110865e342c13c6753a15789694fc55cd9167805325bf24c74b2765a8d8e4" gracePeriod=30 Jan 20 18:49:51 crc kubenswrapper[4773]: I0120 18:49:51.002246 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-5d8b6458bb-fc8lw" Jan 20 18:49:51 crc kubenswrapper[4773]: I0120 18:49:51.027465 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 20 18:49:51 crc kubenswrapper[4773]: I0120 18:49:51.920627 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-586bf65fdf-tqctk" Jan 20 18:49:51 crc kubenswrapper[4773]: I0120 18:49:51.927156 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7b946d459c-vd86k" Jan 20 18:49:51 crc kubenswrapper[4773]: I0120 18:49:51.985495 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f5814cea-a704-4de4-9205-d65cde58c777","Type":"ContainerStarted","Data":"cb50e976a0c654a7216030978727b5d69177f7ab4404e98e26a34b6e55333be1"} Jan 20 18:49:51 crc kubenswrapper[4773]: I0120 18:49:51.986901 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 20 18:49:51 crc kubenswrapper[4773]: I0120 18:49:51.988980 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-586bf65fdf-tqctk" event={"ID":"69d10de9-a03e-4020-8219-25cb3d9520a5","Type":"ContainerDied","Data":"0b60f30c0a7ab6c9d2a1ba89c628d4ad9f1438793f4c8b9c55bf5b64d977ae2b"} Jan 20 18:49:51 crc kubenswrapper[4773]: I0120 18:49:51.989030 4773 scope.go:117] "RemoveContainer" containerID="2d5d679feaca700190bea6a965fb83e9b9e97a2c3cc8d0cf71e39aeec304eb27" Jan 20 18:49:51 crc kubenswrapper[4773]: I0120 18:49:51.989131 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-586bf65fdf-tqctk" Jan 20 18:49:51 crc kubenswrapper[4773]: I0120 18:49:51.992344 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="dfe8dc2b-eac6-4606-9b32-848e3a273eef" containerName="cinder-scheduler" containerID="cri-o://68c83c6be4036bcce359a81115c68f082df5c191d32d4f4fd5fd81b5476cc0b7" gracePeriod=30 Jan 20 18:49:51 crc kubenswrapper[4773]: I0120 18:49:51.992670 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7b946d459c-vd86k" Jan 20 18:49:51 crc kubenswrapper[4773]: I0120 18:49:51.993012 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7b946d459c-vd86k" event={"ID":"fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6","Type":"ContainerDied","Data":"de64a54b4415566f010f514c19c1ed77dea98963569b16a8bef020db9b593d9c"} Jan 20 18:49:51 crc kubenswrapper[4773]: I0120 18:49:51.993072 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="dfe8dc2b-eac6-4606-9b32-848e3a273eef" containerName="probe" containerID="cri-o://2547f7957a36ac52b73e0306b0c716ec107a69cbf1dfb1609ce1a97df9bb6d4b" gracePeriod=30 Jan 20 18:49:52 crc kubenswrapper[4773]: I0120 18:49:52.011814 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=3.269670754 podStartE2EDuration="11.011795053s" podCreationTimestamp="2026-01-20 18:49:41 +0000 UTC" firstStartedPulling="2026-01-20 18:49:43.618371412 +0000 UTC m=+1176.540184436" lastFinishedPulling="2026-01-20 18:49:51.360495711 +0000 UTC m=+1184.282308735" observedRunningTime="2026-01-20 18:49:52.009662943 +0000 UTC m=+1184.931475987" watchObservedRunningTime="2026-01-20 18:49:52.011795053 +0000 UTC m=+1184.933608077" Jan 20 18:49:52 crc kubenswrapper[4773]: I0120 18:49:52.021762 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/69d10de9-a03e-4020-8219-25cb3d9520a5-horizon-secret-key\") pod \"69d10de9-a03e-4020-8219-25cb3d9520a5\" (UID: \"69d10de9-a03e-4020-8219-25cb3d9520a5\") " Jan 20 18:49:52 crc kubenswrapper[4773]: I0120 18:49:52.021884 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6-ovsdbserver-nb\") pod \"fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6\" 
(UID: \"fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6\") " Jan 20 18:49:52 crc kubenswrapper[4773]: I0120 18:49:52.022104 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dpcp4\" (UniqueName: \"kubernetes.io/projected/fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6-kube-api-access-dpcp4\") pod \"fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6\" (UID: \"fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6\") " Jan 20 18:49:52 crc kubenswrapper[4773]: I0120 18:49:52.022189 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6-ovsdbserver-sb\") pod \"fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6\" (UID: \"fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6\") " Jan 20 18:49:52 crc kubenswrapper[4773]: I0120 18:49:52.022270 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6-config\") pod \"fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6\" (UID: \"fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6\") " Jan 20 18:49:52 crc kubenswrapper[4773]: I0120 18:49:52.022298 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/69d10de9-a03e-4020-8219-25cb3d9520a5-config-data\") pod \"69d10de9-a03e-4020-8219-25cb3d9520a5\" (UID: \"69d10de9-a03e-4020-8219-25cb3d9520a5\") " Jan 20 18:49:52 crc kubenswrapper[4773]: I0120 18:49:52.022353 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/69d10de9-a03e-4020-8219-25cb3d9520a5-logs\") pod \"69d10de9-a03e-4020-8219-25cb3d9520a5\" (UID: \"69d10de9-a03e-4020-8219-25cb3d9520a5\") " Jan 20 18:49:52 crc kubenswrapper[4773]: I0120 18:49:52.022411 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6-dns-svc\") pod \"fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6\" (UID: \"fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6\") " Jan 20 18:49:52 crc kubenswrapper[4773]: I0120 18:49:52.022488 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z7plc\" (UniqueName: \"kubernetes.io/projected/69d10de9-a03e-4020-8219-25cb3d9520a5-kube-api-access-z7plc\") pod \"69d10de9-a03e-4020-8219-25cb3d9520a5\" (UID: \"69d10de9-a03e-4020-8219-25cb3d9520a5\") " Jan 20 18:49:52 crc kubenswrapper[4773]: I0120 18:49:52.022514 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/69d10de9-a03e-4020-8219-25cb3d9520a5-scripts\") pod \"69d10de9-a03e-4020-8219-25cb3d9520a5\" (UID: \"69d10de9-a03e-4020-8219-25cb3d9520a5\") " Jan 20 18:49:52 crc kubenswrapper[4773]: I0120 18:49:52.023634 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/69d10de9-a03e-4020-8219-25cb3d9520a5-logs" (OuterVolumeSpecName: "logs") pod "69d10de9-a03e-4020-8219-25cb3d9520a5" (UID: "69d10de9-a03e-4020-8219-25cb3d9520a5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:49:52 crc kubenswrapper[4773]: I0120 18:49:52.037101 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/69d10de9-a03e-4020-8219-25cb3d9520a5-kube-api-access-z7plc" (OuterVolumeSpecName: "kube-api-access-z7plc") pod "69d10de9-a03e-4020-8219-25cb3d9520a5" (UID: "69d10de9-a03e-4020-8219-25cb3d9520a5"). InnerVolumeSpecName "kube-api-access-z7plc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:49:52 crc kubenswrapper[4773]: I0120 18:49:52.041753 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-86d844bb6-6q8ms" Jan 20 18:49:52 crc kubenswrapper[4773]: I0120 18:49:52.042520 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6-kube-api-access-dpcp4" (OuterVolumeSpecName: "kube-api-access-dpcp4") pod "fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6" (UID: "fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6"). InnerVolumeSpecName "kube-api-access-dpcp4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:49:52 crc kubenswrapper[4773]: I0120 18:49:52.042903 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/69d10de9-a03e-4020-8219-25cb3d9520a5-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "69d10de9-a03e-4020-8219-25cb3d9520a5" (UID: "69d10de9-a03e-4020-8219-25cb3d9520a5"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:49:52 crc kubenswrapper[4773]: I0120 18:49:52.107562 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69d10de9-a03e-4020-8219-25cb3d9520a5-scripts" (OuterVolumeSpecName: "scripts") pod "69d10de9-a03e-4020-8219-25cb3d9520a5" (UID: "69d10de9-a03e-4020-8219-25cb3d9520a5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:49:52 crc kubenswrapper[4773]: I0120 18:49:52.112630 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/69d10de9-a03e-4020-8219-25cb3d9520a5-config-data" (OuterVolumeSpecName: "config-data") pod "69d10de9-a03e-4020-8219-25cb3d9520a5" (UID: "69d10de9-a03e-4020-8219-25cb3d9520a5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:49:52 crc kubenswrapper[4773]: I0120 18:49:52.147863 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/69d10de9-a03e-4020-8219-25cb3d9520a5-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:52 crc kubenswrapper[4773]: I0120 18:49:52.147901 4773 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/69d10de9-a03e-4020-8219-25cb3d9520a5-logs\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:52 crc kubenswrapper[4773]: I0120 18:49:52.147911 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z7plc\" (UniqueName: \"kubernetes.io/projected/69d10de9-a03e-4020-8219-25cb3d9520a5-kube-api-access-z7plc\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:52 crc kubenswrapper[4773]: I0120 18:49:52.147921 4773 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/69d10de9-a03e-4020-8219-25cb3d9520a5-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:52 crc kubenswrapper[4773]: I0120 18:49:52.147944 4773 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/69d10de9-a03e-4020-8219-25cb3d9520a5-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:52 crc kubenswrapper[4773]: I0120 18:49:52.147953 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dpcp4\" (UniqueName: \"kubernetes.io/projected/fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6-kube-api-access-dpcp4\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:52 crc kubenswrapper[4773]: I0120 18:49:52.150961 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6" (UID: 
"fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:49:52 crc kubenswrapper[4773]: I0120 18:49:52.182603 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6-config" (OuterVolumeSpecName: "config") pod "fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6" (UID: "fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:49:52 crc kubenswrapper[4773]: I0120 18:49:52.182620 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6" (UID: "fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:49:52 crc kubenswrapper[4773]: I0120 18:49:52.185351 4773 scope.go:117] "RemoveContainer" containerID="25c305e514555ed0d81791fef19992b8b93de8b3633a72153010568146bc67e2" Jan 20 18:49:52 crc kubenswrapper[4773]: I0120 18:49:52.185899 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6" (UID: "fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:49:52 crc kubenswrapper[4773]: I0120 18:49:52.207662 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Jan 20 18:49:52 crc kubenswrapper[4773]: I0120 18:49:52.212918 4773 scope.go:117] "RemoveContainer" containerID="9d2dd544dda815f8509ae7c395de9245112061e0bafdb50324cd6086b476d3dc" Jan 20 18:49:52 crc kubenswrapper[4773]: I0120 18:49:52.250198 4773 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:52 crc kubenswrapper[4773]: I0120 18:49:52.250235 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6-config\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:52 crc kubenswrapper[4773]: I0120 18:49:52.250248 4773 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:52 crc kubenswrapper[4773]: I0120 18:49:52.250258 4773 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:52 crc kubenswrapper[4773]: I0120 18:49:52.265109 4773 scope.go:117] "RemoveContainer" containerID="37609d6f106c18914801b3d94dcdadbe56b2aa2be4306121dfd04a92610b45bb" Jan 20 18:49:52 crc kubenswrapper[4773]: I0120 18:49:52.336468 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-586bf65fdf-tqctk"] Jan 20 18:49:52 crc kubenswrapper[4773]: I0120 18:49:52.347533 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-586bf65fdf-tqctk"] Jan 20 18:49:52 crc kubenswrapper[4773]: I0120 18:49:52.355768 4773 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7b946d459c-vd86k"] Jan 20 18:49:52 crc kubenswrapper[4773]: I0120 18:49:52.370099 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7b946d459c-vd86k"] Jan 20 18:49:52 crc kubenswrapper[4773]: I0120 18:49:52.894296 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7574cb8f94-wwkgd" Jan 20 18:49:53 crc kubenswrapper[4773]: I0120 18:49:53.012731 4773 generic.go:334] "Generic (PLEG): container finished" podID="dfe8dc2b-eac6-4606-9b32-848e3a273eef" containerID="2547f7957a36ac52b73e0306b0c716ec107a69cbf1dfb1609ce1a97df9bb6d4b" exitCode=0 Jan 20 18:49:53 crc kubenswrapper[4773]: I0120 18:49:53.012848 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"dfe8dc2b-eac6-4606-9b32-848e3a273eef","Type":"ContainerDied","Data":"2547f7957a36ac52b73e0306b0c716ec107a69cbf1dfb1609ce1a97df9bb6d4b"} Jan 20 18:49:53 crc kubenswrapper[4773]: I0120 18:49:53.022975 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d4d69bee-fde2-4fb6-95f6-74e35b8d5db5","Type":"ContainerStarted","Data":"6ca3fa65c2bdb7d05f881279c32ca3a29ca5b5106b54c1b47b632b5dd3f883a7"} Jan 20 18:49:53 crc kubenswrapper[4773]: I0120 18:49:53.472629 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="69d10de9-a03e-4020-8219-25cb3d9520a5" path="/var/lib/kubelet/pods/69d10de9-a03e-4020-8219-25cb3d9520a5/volumes" Jan 20 18:49:53 crc kubenswrapper[4773]: I0120 18:49:53.473607 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6" path="/var/lib/kubelet/pods/fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6/volumes" Jan 20 18:49:53 crc kubenswrapper[4773]: I0120 18:49:53.894101 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-76cffc5d9-m6wn7" Jan 20 18:49:53 crc 
kubenswrapper[4773]: I0120 18:49:53.957899 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-86d844bb6-6q8ms"] Jan 20 18:49:53 crc kubenswrapper[4773]: I0120 18:49:53.958122 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-86d844bb6-6q8ms" podUID="a81115d7-0fb0-4319-9705-0fae198ad70b" containerName="neutron-api" containerID="cri-o://e48d8470ac4794d65c703528edfbd96a6097216611310b92998ca2d3d1878dfe" gracePeriod=30 Jan 20 18:49:53 crc kubenswrapper[4773]: I0120 18:49:53.958209 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-86d844bb6-6q8ms" podUID="a81115d7-0fb0-4319-9705-0fae198ad70b" containerName="neutron-httpd" containerID="cri-o://60603e41f66c2ee2986bc21b5a42dcb1715cf77af700eb0e4234d5cc16918f16" gracePeriod=30 Jan 20 18:49:54 crc kubenswrapper[4773]: I0120 18:49:54.048190 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d4d69bee-fde2-4fb6-95f6-74e35b8d5db5","Type":"ContainerStarted","Data":"0fbaaf3aba900c46337cdd72555a99aceefc82267b577ce70b1b81d2fc4b22c0"} Jan 20 18:49:54 crc kubenswrapper[4773]: I0120 18:49:54.048229 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"d4d69bee-fde2-4fb6-95f6-74e35b8d5db5","Type":"ContainerStarted","Data":"830274cb68640abc5cf8a072cf14bb3cb42b8a184d951d9f70e649ddc6f24086"} Jan 20 18:49:54 crc kubenswrapper[4773]: I0120 18:49:54.048251 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Jan 20 18:49:54 crc kubenswrapper[4773]: I0120 18:49:54.089268 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=6.089248844 podStartE2EDuration="6.089248844s" podCreationTimestamp="2026-01-20 18:49:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-01-20 18:49:54.074072491 +0000 UTC m=+1186.995885515" watchObservedRunningTime="2026-01-20 18:49:54.089248844 +0000 UTC m=+1187.011061868" Jan 20 18:49:54 crc kubenswrapper[4773]: I0120 18:49:54.574226 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-7574cb8f94-wwkgd" Jan 20 18:49:54 crc kubenswrapper[4773]: I0120 18:49:54.625396 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5d8b6458bb-fc8lw"] Jan 20 18:49:54 crc kubenswrapper[4773]: I0120 18:49:54.625614 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-5d8b6458bb-fc8lw" podUID="485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2" containerName="barbican-api-log" containerID="cri-o://1b3005990fb52a6d7e54c28e9d103e42d1709f72de49da9ca04b2c6d3bc968c5" gracePeriod=30 Jan 20 18:49:54 crc kubenswrapper[4773]: I0120 18:49:54.625715 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-5d8b6458bb-fc8lw" podUID="485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2" containerName="barbican-api" containerID="cri-o://165fff499b13269da6c835d3382e1e806f1797529e27322b787f786ebed87645" gracePeriod=30 Jan 20 18:49:55 crc kubenswrapper[4773]: I0120 18:49:55.057186 4773 generic.go:334] "Generic (PLEG): container finished" podID="a81115d7-0fb0-4319-9705-0fae198ad70b" containerID="60603e41f66c2ee2986bc21b5a42dcb1715cf77af700eb0e4234d5cc16918f16" exitCode=0 Jan 20 18:49:55 crc kubenswrapper[4773]: I0120 18:49:55.057250 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-86d844bb6-6q8ms" event={"ID":"a81115d7-0fb0-4319-9705-0fae198ad70b","Type":"ContainerDied","Data":"60603e41f66c2ee2986bc21b5a42dcb1715cf77af700eb0e4234d5cc16918f16"} Jan 20 18:49:55 crc kubenswrapper[4773]: I0120 18:49:55.059081 4773 generic.go:334] "Generic (PLEG): container finished" podID="49df8cea-026f-497b-baae-a6a09452aa3d" 
containerID="aab110865e342c13c6753a15789694fc55cd9167805325bf24c74b2765a8d8e4" exitCode=0 Jan 20 18:49:55 crc kubenswrapper[4773]: I0120 18:49:55.059135 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-9b66d8476-cqhrd" event={"ID":"49df8cea-026f-497b-baae-a6a09452aa3d","Type":"ContainerDied","Data":"aab110865e342c13c6753a15789694fc55cd9167805325bf24c74b2765a8d8e4"} Jan 20 18:49:55 crc kubenswrapper[4773]: I0120 18:49:55.060501 4773 generic.go:334] "Generic (PLEG): container finished" podID="485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2" containerID="1b3005990fb52a6d7e54c28e9d103e42d1709f72de49da9ca04b2c6d3bc968c5" exitCode=143 Jan 20 18:49:55 crc kubenswrapper[4773]: I0120 18:49:55.060557 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5d8b6458bb-fc8lw" event={"ID":"485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2","Type":"ContainerDied","Data":"1b3005990fb52a6d7e54c28e9d103e42d1709f72de49da9ca04b2c6d3bc968c5"} Jan 20 18:49:56 crc kubenswrapper[4773]: I0120 18:49:56.567252 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-9b66d8476-cqhrd" podUID="49df8cea-026f-497b-baae-a6a09452aa3d" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.143:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.143:8443: connect: connection refused" Jan 20 18:49:56 crc kubenswrapper[4773]: I0120 18:49:56.879052 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-7b946d459c-vd86k" podUID="fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.146:5353: i/o timeout" Jan 20 18:49:57 crc kubenswrapper[4773]: I0120 18:49:57.083244 4773 generic.go:334] "Generic (PLEG): container finished" podID="dfe8dc2b-eac6-4606-9b32-848e3a273eef" containerID="68c83c6be4036bcce359a81115c68f082df5c191d32d4f4fd5fd81b5476cc0b7" exitCode=0 Jan 20 18:49:57 crc kubenswrapper[4773]: I0120 18:49:57.083300 4773 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"dfe8dc2b-eac6-4606-9b32-848e3a273eef","Type":"ContainerDied","Data":"68c83c6be4036bcce359a81115c68f082df5c191d32d4f4fd5fd81b5476cc0b7"} Jan 20 18:49:57 crc kubenswrapper[4773]: I0120 18:49:57.264086 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 20 18:49:57 crc kubenswrapper[4773]: I0120 18:49:57.352527 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dfe8dc2b-eac6-4606-9b32-848e3a273eef-config-data-custom\") pod \"dfe8dc2b-eac6-4606-9b32-848e3a273eef\" (UID: \"dfe8dc2b-eac6-4606-9b32-848e3a273eef\") " Jan 20 18:49:57 crc kubenswrapper[4773]: I0120 18:49:57.352649 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfe8dc2b-eac6-4606-9b32-848e3a273eef-combined-ca-bundle\") pod \"dfe8dc2b-eac6-4606-9b32-848e3a273eef\" (UID: \"dfe8dc2b-eac6-4606-9b32-848e3a273eef\") " Jan 20 18:49:57 crc kubenswrapper[4773]: I0120 18:49:57.352673 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dfe8dc2b-eac6-4606-9b32-848e3a273eef-etc-machine-id\") pod \"dfe8dc2b-eac6-4606-9b32-848e3a273eef\" (UID: \"dfe8dc2b-eac6-4606-9b32-848e3a273eef\") " Jan 20 18:49:57 crc kubenswrapper[4773]: I0120 18:49:57.352740 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qbtjz\" (UniqueName: \"kubernetes.io/projected/dfe8dc2b-eac6-4606-9b32-848e3a273eef-kube-api-access-qbtjz\") pod \"dfe8dc2b-eac6-4606-9b32-848e3a273eef\" (UID: \"dfe8dc2b-eac6-4606-9b32-848e3a273eef\") " Jan 20 18:49:57 crc kubenswrapper[4773]: I0120 18:49:57.352761 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/dfe8dc2b-eac6-4606-9b32-848e3a273eef-config-data\") pod \"dfe8dc2b-eac6-4606-9b32-848e3a273eef\" (UID: \"dfe8dc2b-eac6-4606-9b32-848e3a273eef\") " Jan 20 18:49:57 crc kubenswrapper[4773]: I0120 18:49:57.352789 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dfe8dc2b-eac6-4606-9b32-848e3a273eef-scripts\") pod \"dfe8dc2b-eac6-4606-9b32-848e3a273eef\" (UID: \"dfe8dc2b-eac6-4606-9b32-848e3a273eef\") " Jan 20 18:49:57 crc kubenswrapper[4773]: I0120 18:49:57.353078 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/dfe8dc2b-eac6-4606-9b32-848e3a273eef-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "dfe8dc2b-eac6-4606-9b32-848e3a273eef" (UID: "dfe8dc2b-eac6-4606-9b32-848e3a273eef"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 18:49:57 crc kubenswrapper[4773]: I0120 18:49:57.354041 4773 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/dfe8dc2b-eac6-4606-9b32-848e3a273eef-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:57 crc kubenswrapper[4773]: I0120 18:49:57.358954 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfe8dc2b-eac6-4606-9b32-848e3a273eef-scripts" (OuterVolumeSpecName: "scripts") pod "dfe8dc2b-eac6-4606-9b32-848e3a273eef" (UID: "dfe8dc2b-eac6-4606-9b32-848e3a273eef"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:49:57 crc kubenswrapper[4773]: I0120 18:49:57.359445 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfe8dc2b-eac6-4606-9b32-848e3a273eef-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "dfe8dc2b-eac6-4606-9b32-848e3a273eef" (UID: "dfe8dc2b-eac6-4606-9b32-848e3a273eef"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:49:57 crc kubenswrapper[4773]: I0120 18:49:57.361115 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dfe8dc2b-eac6-4606-9b32-848e3a273eef-kube-api-access-qbtjz" (OuterVolumeSpecName: "kube-api-access-qbtjz") pod "dfe8dc2b-eac6-4606-9b32-848e3a273eef" (UID: "dfe8dc2b-eac6-4606-9b32-848e3a273eef"). InnerVolumeSpecName "kube-api-access-qbtjz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:49:57 crc kubenswrapper[4773]: I0120 18:49:57.428064 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfe8dc2b-eac6-4606-9b32-848e3a273eef-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dfe8dc2b-eac6-4606-9b32-848e3a273eef" (UID: "dfe8dc2b-eac6-4606-9b32-848e3a273eef"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:49:57 crc kubenswrapper[4773]: I0120 18:49:57.455442 4773 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/dfe8dc2b-eac6-4606-9b32-848e3a273eef-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:57 crc kubenswrapper[4773]: I0120 18:49:57.455579 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dfe8dc2b-eac6-4606-9b32-848e3a273eef-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:57 crc kubenswrapper[4773]: I0120 18:49:57.455589 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qbtjz\" (UniqueName: \"kubernetes.io/projected/dfe8dc2b-eac6-4606-9b32-848e3a273eef-kube-api-access-qbtjz\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:57 crc kubenswrapper[4773]: I0120 18:49:57.455599 4773 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dfe8dc2b-eac6-4606-9b32-848e3a273eef-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:57 crc kubenswrapper[4773]: I0120 18:49:57.464055 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dfe8dc2b-eac6-4606-9b32-848e3a273eef-config-data" (OuterVolumeSpecName: "config-data") pod "dfe8dc2b-eac6-4606-9b32-848e3a273eef" (UID: "dfe8dc2b-eac6-4606-9b32-848e3a273eef"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:49:57 crc kubenswrapper[4773]: I0120 18:49:57.557754 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dfe8dc2b-eac6-4606-9b32-848e3a273eef-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.086086 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5d8b6458bb-fc8lw" Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.145427 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"dfe8dc2b-eac6-4606-9b32-848e3a273eef","Type":"ContainerDied","Data":"b33bd0d0bb49059043d7ae03c783cf24e29ed3294a97bf332d9d93097e591133"} Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.145488 4773 scope.go:117] "RemoveContainer" containerID="2547f7957a36ac52b73e0306b0c716ec107a69cbf1dfb1609ce1a97df9bb6d4b" Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.145729 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.157362 4773 generic.go:334] "Generic (PLEG): container finished" podID="485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2" containerID="165fff499b13269da6c835d3382e1e806f1797529e27322b787f786ebed87645" exitCode=0 Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.157407 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5d8b6458bb-fc8lw" event={"ID":"485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2","Type":"ContainerDied","Data":"165fff499b13269da6c835d3382e1e806f1797529e27322b787f786ebed87645"} Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.157440 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-5d8b6458bb-fc8lw" event={"ID":"485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2","Type":"ContainerDied","Data":"1ec65c4d727460ec393332a2c46d1ecf2c32fcbc331c29099e6ae31f70e9f10d"} Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.157597 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-5d8b6458bb-fc8lw" Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.166877 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2-config-data-custom\") pod \"485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2\" (UID: \"485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2\") " Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.166945 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2-config-data\") pod \"485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2\" (UID: \"485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2\") " Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.167046 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2-logs\") pod \"485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2\" (UID: \"485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2\") " Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.167077 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2-combined-ca-bundle\") pod \"485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2\" (UID: \"485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2\") " Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.167122 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h8bj7\" (UniqueName: \"kubernetes.io/projected/485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2-kube-api-access-h8bj7\") pod \"485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2\" (UID: \"485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2\") " Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.168754 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2-logs" (OuterVolumeSpecName: "logs") pod "485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2" (UID: "485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.171199 4773 patch_prober.go:28] interesting pod/machine-config-daemon-sq4x7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.171239 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.171276 4773 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.171918 4773 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f170dc7a6e6cd01e0186e6b45c72b1bd89b3220f96cf1e35088901106c87b344"} pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.171999 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" 
containerID="cri-o://f170dc7a6e6cd01e0186e6b45c72b1bd89b3220f96cf1e35088901106c87b344" gracePeriod=600 Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.179381 4773 scope.go:117] "RemoveContainer" containerID="68c83c6be4036bcce359a81115c68f082df5c191d32d4f4fd5fd81b5476cc0b7" Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.183024 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.188081 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2" (UID: "485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.189909 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.191327 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2-kube-api-access-h8bj7" (OuterVolumeSpecName: "kube-api-access-h8bj7") pod "485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2" (UID: "485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2"). InnerVolumeSpecName "kube-api-access-h8bj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.207947 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Jan 20 18:49:58 crc kubenswrapper[4773]: E0120 18:49:58.208305 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69d10de9-a03e-4020-8219-25cb3d9520a5" containerName="horizon" Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.208316 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="69d10de9-a03e-4020-8219-25cb3d9520a5" containerName="horizon" Jan 20 18:49:58 crc kubenswrapper[4773]: E0120 18:49:58.208329 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2" containerName="barbican-api" Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.208336 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2" containerName="barbican-api" Jan 20 18:49:58 crc kubenswrapper[4773]: E0120 18:49:58.208345 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6" containerName="dnsmasq-dns" Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.208351 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6" containerName="dnsmasq-dns" Jan 20 18:49:58 crc kubenswrapper[4773]: E0120 18:49:58.208359 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6" containerName="init" Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.208365 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6" containerName="init" Jan 20 18:49:58 crc kubenswrapper[4773]: E0120 18:49:58.208377 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfe8dc2b-eac6-4606-9b32-848e3a273eef" containerName="cinder-scheduler" Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 
18:49:58.208382 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfe8dc2b-eac6-4606-9b32-848e3a273eef" containerName="cinder-scheduler" Jan 20 18:49:58 crc kubenswrapper[4773]: E0120 18:49:58.208399 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="69d10de9-a03e-4020-8219-25cb3d9520a5" containerName="horizon-log" Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.208405 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="69d10de9-a03e-4020-8219-25cb3d9520a5" containerName="horizon-log" Jan 20 18:49:58 crc kubenswrapper[4773]: E0120 18:49:58.208417 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dfe8dc2b-eac6-4606-9b32-848e3a273eef" containerName="probe" Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.208423 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="dfe8dc2b-eac6-4606-9b32-848e3a273eef" containerName="probe" Jan 20 18:49:58 crc kubenswrapper[4773]: E0120 18:49:58.208432 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2" containerName="barbican-api-log" Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.208438 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2" containerName="barbican-api-log" Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.208586 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb09b3e7-0fbe-4a65-b8a9-99ca1070a5f6" containerName="dnsmasq-dns" Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.208597 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2" containerName="barbican-api" Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.208613 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2" containerName="barbican-api-log" Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.208625 4773 
memory_manager.go:354] "RemoveStaleState removing state" podUID="69d10de9-a03e-4020-8219-25cb3d9520a5" containerName="horizon-log" Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.208634 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfe8dc2b-eac6-4606-9b32-848e3a273eef" containerName="cinder-scheduler" Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.208642 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="69d10de9-a03e-4020-8219-25cb3d9520a5" containerName="horizon" Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.208649 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="dfe8dc2b-eac6-4606-9b32-848e3a273eef" containerName="probe" Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.209522 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.212089 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.235109 4773 scope.go:117] "RemoveContainer" containerID="165fff499b13269da6c835d3382e1e806f1797529e27322b787f786ebed87645" Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.248386 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.249053 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2-config-data" (OuterVolumeSpecName: "config-data") pod "485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2" (UID: "485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.254040 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2" (UID: "485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.274096 4773 scope.go:117] "RemoveContainer" containerID="1b3005990fb52a6d7e54c28e9d103e42d1709f72de49da9ca04b2c6d3bc968c5" Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.278818 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e3e6b840-22c8-4add-b022-1ba197ca588c-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e3e6b840-22c8-4add-b022-1ba197ca588c\") " pod="openstack/cinder-scheduler-0" Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.278877 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3e6b840-22c8-4add-b022-1ba197ca588c-scripts\") pod \"cinder-scheduler-0\" (UID: \"e3e6b840-22c8-4add-b022-1ba197ca588c\") " pod="openstack/cinder-scheduler-0" Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.278918 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e3e6b840-22c8-4add-b022-1ba197ca588c-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e3e6b840-22c8-4add-b022-1ba197ca588c\") " pod="openstack/cinder-scheduler-0" Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.278975 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/e3e6b840-22c8-4add-b022-1ba197ca588c-config-data\") pod \"cinder-scheduler-0\" (UID: \"e3e6b840-22c8-4add-b022-1ba197ca588c\") " pod="openstack/cinder-scheduler-0" Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.279027 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3e6b840-22c8-4add-b022-1ba197ca588c-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e3e6b840-22c8-4add-b022-1ba197ca588c\") " pod="openstack/cinder-scheduler-0" Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.279055 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-579s7\" (UniqueName: \"kubernetes.io/projected/e3e6b840-22c8-4add-b022-1ba197ca588c-kube-api-access-579s7\") pod \"cinder-scheduler-0\" (UID: \"e3e6b840-22c8-4add-b022-1ba197ca588c\") " pod="openstack/cinder-scheduler-0" Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.279136 4773 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.279147 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.279156 4773 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2-logs\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.279165 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.279232 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h8bj7\" (UniqueName: \"kubernetes.io/projected/485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2-kube-api-access-h8bj7\") on node \"crc\" DevicePath \"\"" Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.313658 4773 scope.go:117] "RemoveContainer" containerID="165fff499b13269da6c835d3382e1e806f1797529e27322b787f786ebed87645" Jan 20 18:49:58 crc kubenswrapper[4773]: E0120 18:49:58.316284 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"165fff499b13269da6c835d3382e1e806f1797529e27322b787f786ebed87645\": container with ID starting with 165fff499b13269da6c835d3382e1e806f1797529e27322b787f786ebed87645 not found: ID does not exist" containerID="165fff499b13269da6c835d3382e1e806f1797529e27322b787f786ebed87645" Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.316315 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"165fff499b13269da6c835d3382e1e806f1797529e27322b787f786ebed87645"} err="failed to get container status \"165fff499b13269da6c835d3382e1e806f1797529e27322b787f786ebed87645\": rpc error: code = NotFound desc = could not find container \"165fff499b13269da6c835d3382e1e806f1797529e27322b787f786ebed87645\": container with ID starting with 165fff499b13269da6c835d3382e1e806f1797529e27322b787f786ebed87645 not found: ID does not exist" Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.316334 4773 scope.go:117] "RemoveContainer" containerID="1b3005990fb52a6d7e54c28e9d103e42d1709f72de49da9ca04b2c6d3bc968c5" Jan 20 18:49:58 crc kubenswrapper[4773]: E0120 18:49:58.316628 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"1b3005990fb52a6d7e54c28e9d103e42d1709f72de49da9ca04b2c6d3bc968c5\": container with ID starting with 1b3005990fb52a6d7e54c28e9d103e42d1709f72de49da9ca04b2c6d3bc968c5 not found: ID does not exist" containerID="1b3005990fb52a6d7e54c28e9d103e42d1709f72de49da9ca04b2c6d3bc968c5" Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.316650 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b3005990fb52a6d7e54c28e9d103e42d1709f72de49da9ca04b2c6d3bc968c5"} err="failed to get container status \"1b3005990fb52a6d7e54c28e9d103e42d1709f72de49da9ca04b2c6d3bc968c5\": rpc error: code = NotFound desc = could not find container \"1b3005990fb52a6d7e54c28e9d103e42d1709f72de49da9ca04b2c6d3bc968c5\": container with ID starting with 1b3005990fb52a6d7e54c28e9d103e42d1709f72de49da9ca04b2c6d3bc968c5 not found: ID does not exist" Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.380996 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3e6b840-22c8-4add-b022-1ba197ca588c-scripts\") pod \"cinder-scheduler-0\" (UID: \"e3e6b840-22c8-4add-b022-1ba197ca588c\") " pod="openstack/cinder-scheduler-0" Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.381091 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e3e6b840-22c8-4add-b022-1ba197ca588c-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e3e6b840-22c8-4add-b022-1ba197ca588c\") " pod="openstack/cinder-scheduler-0" Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.381135 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3e6b840-22c8-4add-b022-1ba197ca588c-config-data\") pod \"cinder-scheduler-0\" (UID: \"e3e6b840-22c8-4add-b022-1ba197ca588c\") " pod="openstack/cinder-scheduler-0" Jan 20 18:49:58 crc 
kubenswrapper[4773]: I0120 18:49:58.381176 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3e6b840-22c8-4add-b022-1ba197ca588c-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e3e6b840-22c8-4add-b022-1ba197ca588c\") " pod="openstack/cinder-scheduler-0" Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.381199 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-579s7\" (UniqueName: \"kubernetes.io/projected/e3e6b840-22c8-4add-b022-1ba197ca588c-kube-api-access-579s7\") pod \"cinder-scheduler-0\" (UID: \"e3e6b840-22c8-4add-b022-1ba197ca588c\") " pod="openstack/cinder-scheduler-0" Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.381248 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e3e6b840-22c8-4add-b022-1ba197ca588c-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e3e6b840-22c8-4add-b022-1ba197ca588c\") " pod="openstack/cinder-scheduler-0" Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.381510 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e3e6b840-22c8-4add-b022-1ba197ca588c-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e3e6b840-22c8-4add-b022-1ba197ca588c\") " pod="openstack/cinder-scheduler-0" Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.387920 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e3e6b840-22c8-4add-b022-1ba197ca588c-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e3e6b840-22c8-4add-b022-1ba197ca588c\") " pod="openstack/cinder-scheduler-0" Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.388405 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" 
(UniqueName: \"kubernetes.io/secret/e3e6b840-22c8-4add-b022-1ba197ca588c-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e3e6b840-22c8-4add-b022-1ba197ca588c\") " pod="openstack/cinder-scheduler-0" Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.389211 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e3e6b840-22c8-4add-b022-1ba197ca588c-config-data\") pod \"cinder-scheduler-0\" (UID: \"e3e6b840-22c8-4add-b022-1ba197ca588c\") " pod="openstack/cinder-scheduler-0" Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.392366 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e3e6b840-22c8-4add-b022-1ba197ca588c-scripts\") pod \"cinder-scheduler-0\" (UID: \"e3e6b840-22c8-4add-b022-1ba197ca588c\") " pod="openstack/cinder-scheduler-0" Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.399846 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-579s7\" (UniqueName: \"kubernetes.io/projected/e3e6b840-22c8-4add-b022-1ba197ca588c-kube-api-access-579s7\") pod \"cinder-scheduler-0\" (UID: \"e3e6b840-22c8-4add-b022-1ba197ca588c\") " pod="openstack/cinder-scheduler-0" Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.524109 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-5d8b6458bb-fc8lw"] Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.530917 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-5d8b6458bb-fc8lw"] Jan 20 18:49:58 crc kubenswrapper[4773]: I0120 18:49:58.565164 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Jan 20 18:49:59 crc kubenswrapper[4773]: I0120 18:49:59.055850 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Jan 20 18:49:59 crc kubenswrapper[4773]: I0120 18:49:59.181091 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-668885694d-2br7g" Jan 20 18:49:59 crc kubenswrapper[4773]: I0120 18:49:59.196291 4773 generic.go:334] "Generic (PLEG): container finished" podID="1ddd934f-f012-4083-b5e6-b99711071621" containerID="f170dc7a6e6cd01e0186e6b45c72b1bd89b3220f96cf1e35088901106c87b344" exitCode=0 Jan 20 18:49:59 crc kubenswrapper[4773]: I0120 18:49:59.196369 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" event={"ID":"1ddd934f-f012-4083-b5e6-b99711071621","Type":"ContainerDied","Data":"f170dc7a6e6cd01e0186e6b45c72b1bd89b3220f96cf1e35088901106c87b344"} Jan 20 18:49:59 crc kubenswrapper[4773]: I0120 18:49:59.196395 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" event={"ID":"1ddd934f-f012-4083-b5e6-b99711071621","Type":"ContainerStarted","Data":"14437c4854d46fdc109569c8299e656b31c4aa4992133183b5fa2d3fd5cee7bb"} Jan 20 18:49:59 crc kubenswrapper[4773]: I0120 18:49:59.196411 4773 scope.go:117] "RemoveContainer" containerID="89086664f3aacadd154bb5a0e821ec93e674c41f0d2d3c8a5f423e5e3f0c2f57" Jan 20 18:49:59 crc kubenswrapper[4773]: I0120 18:49:59.204301 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e3e6b840-22c8-4add-b022-1ba197ca588c","Type":"ContainerStarted","Data":"ae6d3e6d193bf50920cd851e6b7d9f5fc1799d295f41e9ca412ec695742c3a76"} Jan 20 18:49:59 crc kubenswrapper[4773]: I0120 18:49:59.250351 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-668885694d-2br7g" Jan 20 
18:49:59 crc kubenswrapper[4773]: I0120 18:49:59.459693 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2" path="/var/lib/kubelet/pods/485c9b9a-7286-46fb-bfa6-d2ee45f9d3e2/volumes" Jan 20 18:49:59 crc kubenswrapper[4773]: I0120 18:49:59.460381 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dfe8dc2b-eac6-4606-9b32-848e3a273eef" path="/var/lib/kubelet/pods/dfe8dc2b-eac6-4606-9b32-848e3a273eef/volumes" Jan 20 18:50:00 crc kubenswrapper[4773]: I0120 18:50:00.000924 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-5bdd8cdbd7-xhf92" Jan 20 18:50:00 crc kubenswrapper[4773]: I0120 18:50:00.238090 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e3e6b840-22c8-4add-b022-1ba197ca588c","Type":"ContainerStarted","Data":"ac0034e4bd44890d2711e86b526b656e18e87e3e4243bd6d835e8690bc7448b2"} Jan 20 18:50:00 crc kubenswrapper[4773]: I0120 18:50:00.844955 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Jan 20 18:50:01 crc kubenswrapper[4773]: I0120 18:50:01.116711 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Jan 20 18:50:01 crc kubenswrapper[4773]: I0120 18:50:01.117773 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstackclient" Jan 20 18:50:01 crc kubenswrapper[4773]: I0120 18:50:01.120978 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Jan 20 18:50:01 crc kubenswrapper[4773]: I0120 18:50:01.123470 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-kxfsr" Jan 20 18:50:01 crc kubenswrapper[4773]: I0120 18:50:01.123532 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Jan 20 18:50:01 crc kubenswrapper[4773]: I0120 18:50:01.129345 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Jan 20 18:50:01 crc kubenswrapper[4773]: I0120 18:50:01.238216 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f040c75f-a2cb-4bfe-9fd1-0105887fa6b4-openstack-config\") pod \"openstackclient\" (UID: \"f040c75f-a2cb-4bfe-9fd1-0105887fa6b4\") " pod="openstack/openstackclient" Jan 20 18:50:01 crc kubenswrapper[4773]: I0120 18:50:01.238289 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f040c75f-a2cb-4bfe-9fd1-0105887fa6b4-combined-ca-bundle\") pod \"openstackclient\" (UID: \"f040c75f-a2cb-4bfe-9fd1-0105887fa6b4\") " pod="openstack/openstackclient" Jan 20 18:50:01 crc kubenswrapper[4773]: I0120 18:50:01.238329 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lgft\" (UniqueName: \"kubernetes.io/projected/f040c75f-a2cb-4bfe-9fd1-0105887fa6b4-kube-api-access-7lgft\") pod \"openstackclient\" (UID: \"f040c75f-a2cb-4bfe-9fd1-0105887fa6b4\") " pod="openstack/openstackclient" Jan 20 18:50:01 crc kubenswrapper[4773]: I0120 18:50:01.238495 4773 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f040c75f-a2cb-4bfe-9fd1-0105887fa6b4-openstack-config-secret\") pod \"openstackclient\" (UID: \"f040c75f-a2cb-4bfe-9fd1-0105887fa6b4\") " pod="openstack/openstackclient" Jan 20 18:50:01 crc kubenswrapper[4773]: I0120 18:50:01.255463 4773 generic.go:334] "Generic (PLEG): container finished" podID="a81115d7-0fb0-4319-9705-0fae198ad70b" containerID="e48d8470ac4794d65c703528edfbd96a6097216611310b92998ca2d3d1878dfe" exitCode=0 Jan 20 18:50:01 crc kubenswrapper[4773]: I0120 18:50:01.255531 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-86d844bb6-6q8ms" event={"ID":"a81115d7-0fb0-4319-9705-0fae198ad70b","Type":"ContainerDied","Data":"e48d8470ac4794d65c703528edfbd96a6097216611310b92998ca2d3d1878dfe"} Jan 20 18:50:01 crc kubenswrapper[4773]: I0120 18:50:01.261251 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e3e6b840-22c8-4add-b022-1ba197ca588c","Type":"ContainerStarted","Data":"f8df994e3e0d9cc59d1919154bacdb40970a242eae5691909d9c3a996738eb17"} Jan 20 18:50:01 crc kubenswrapper[4773]: I0120 18:50:01.340835 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f040c75f-a2cb-4bfe-9fd1-0105887fa6b4-openstack-config-secret\") pod \"openstackclient\" (UID: \"f040c75f-a2cb-4bfe-9fd1-0105887fa6b4\") " pod="openstack/openstackclient" Jan 20 18:50:01 crc kubenswrapper[4773]: I0120 18:50:01.341064 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f040c75f-a2cb-4bfe-9fd1-0105887fa6b4-openstack-config\") pod \"openstackclient\" (UID: \"f040c75f-a2cb-4bfe-9fd1-0105887fa6b4\") " pod="openstack/openstackclient" Jan 20 18:50:01 crc kubenswrapper[4773]: I0120 18:50:01.341106 4773 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f040c75f-a2cb-4bfe-9fd1-0105887fa6b4-combined-ca-bundle\") pod \"openstackclient\" (UID: \"f040c75f-a2cb-4bfe-9fd1-0105887fa6b4\") " pod="openstack/openstackclient" Jan 20 18:50:01 crc kubenswrapper[4773]: I0120 18:50:01.341140 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7lgft\" (UniqueName: \"kubernetes.io/projected/f040c75f-a2cb-4bfe-9fd1-0105887fa6b4-kube-api-access-7lgft\") pod \"openstackclient\" (UID: \"f040c75f-a2cb-4bfe-9fd1-0105887fa6b4\") " pod="openstack/openstackclient" Jan 20 18:50:01 crc kubenswrapper[4773]: I0120 18:50:01.342657 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/f040c75f-a2cb-4bfe-9fd1-0105887fa6b4-openstack-config\") pod \"openstackclient\" (UID: \"f040c75f-a2cb-4bfe-9fd1-0105887fa6b4\") " pod="openstack/openstackclient" Jan 20 18:50:01 crc kubenswrapper[4773]: I0120 18:50:01.352460 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/f040c75f-a2cb-4bfe-9fd1-0105887fa6b4-openstack-config-secret\") pod \"openstackclient\" (UID: \"f040c75f-a2cb-4bfe-9fd1-0105887fa6b4\") " pod="openstack/openstackclient" Jan 20 18:50:01 crc kubenswrapper[4773]: I0120 18:50:01.363483 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f040c75f-a2cb-4bfe-9fd1-0105887fa6b4-combined-ca-bundle\") pod \"openstackclient\" (UID: \"f040c75f-a2cb-4bfe-9fd1-0105887fa6b4\") " pod="openstack/openstackclient" Jan 20 18:50:01 crc kubenswrapper[4773]: I0120 18:50:01.364177 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lgft\" (UniqueName: 
\"kubernetes.io/projected/f040c75f-a2cb-4bfe-9fd1-0105887fa6b4-kube-api-access-7lgft\") pod \"openstackclient\" (UID: \"f040c75f-a2cb-4bfe-9fd1-0105887fa6b4\") " pod="openstack/openstackclient" Jan 20 18:50:01 crc kubenswrapper[4773]: I0120 18:50:01.438532 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Jan 20 18:50:01 crc kubenswrapper[4773]: I0120 18:50:01.574348 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-86d844bb6-6q8ms" Jan 20 18:50:01 crc kubenswrapper[4773]: I0120 18:50:01.604114 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.6040921089999998 podStartE2EDuration="3.604092109s" podCreationTimestamp="2026-01-20 18:49:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:50:01.284674282 +0000 UTC m=+1194.206487316" watchObservedRunningTime="2026-01-20 18:50:01.604092109 +0000 UTC m=+1194.525905133" Jan 20 18:50:01 crc kubenswrapper[4773]: I0120 18:50:01.647505 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a81115d7-0fb0-4319-9705-0fae198ad70b-combined-ca-bundle\") pod \"a81115d7-0fb0-4319-9705-0fae198ad70b\" (UID: \"a81115d7-0fb0-4319-9705-0fae198ad70b\") " Jan 20 18:50:01 crc kubenswrapper[4773]: I0120 18:50:01.647577 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a81115d7-0fb0-4319-9705-0fae198ad70b-httpd-config\") pod \"a81115d7-0fb0-4319-9705-0fae198ad70b\" (UID: \"a81115d7-0fb0-4319-9705-0fae198ad70b\") " Jan 20 18:50:01 crc kubenswrapper[4773]: I0120 18:50:01.647671 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j2hnc\" 
(UniqueName: \"kubernetes.io/projected/a81115d7-0fb0-4319-9705-0fae198ad70b-kube-api-access-j2hnc\") pod \"a81115d7-0fb0-4319-9705-0fae198ad70b\" (UID: \"a81115d7-0fb0-4319-9705-0fae198ad70b\") " Jan 20 18:50:01 crc kubenswrapper[4773]: I0120 18:50:01.647755 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a81115d7-0fb0-4319-9705-0fae198ad70b-ovndb-tls-certs\") pod \"a81115d7-0fb0-4319-9705-0fae198ad70b\" (UID: \"a81115d7-0fb0-4319-9705-0fae198ad70b\") " Jan 20 18:50:01 crc kubenswrapper[4773]: I0120 18:50:01.647837 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a81115d7-0fb0-4319-9705-0fae198ad70b-config\") pod \"a81115d7-0fb0-4319-9705-0fae198ad70b\" (UID: \"a81115d7-0fb0-4319-9705-0fae198ad70b\") " Jan 20 18:50:01 crc kubenswrapper[4773]: I0120 18:50:01.656412 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a81115d7-0fb0-4319-9705-0fae198ad70b-kube-api-access-j2hnc" (OuterVolumeSpecName: "kube-api-access-j2hnc") pod "a81115d7-0fb0-4319-9705-0fae198ad70b" (UID: "a81115d7-0fb0-4319-9705-0fae198ad70b"). InnerVolumeSpecName "kube-api-access-j2hnc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:50:01 crc kubenswrapper[4773]: I0120 18:50:01.660639 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a81115d7-0fb0-4319-9705-0fae198ad70b-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "a81115d7-0fb0-4319-9705-0fae198ad70b" (UID: "a81115d7-0fb0-4319-9705-0fae198ad70b"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:50:01 crc kubenswrapper[4773]: I0120 18:50:01.700492 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a81115d7-0fb0-4319-9705-0fae198ad70b-config" (OuterVolumeSpecName: "config") pod "a81115d7-0fb0-4319-9705-0fae198ad70b" (UID: "a81115d7-0fb0-4319-9705-0fae198ad70b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:50:01 crc kubenswrapper[4773]: I0120 18:50:01.717031 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a81115d7-0fb0-4319-9705-0fae198ad70b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a81115d7-0fb0-4319-9705-0fae198ad70b" (UID: "a81115d7-0fb0-4319-9705-0fae198ad70b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:50:01 crc kubenswrapper[4773]: I0120 18:50:01.751752 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a81115d7-0fb0-4319-9705-0fae198ad70b-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 18:50:01 crc kubenswrapper[4773]: I0120 18:50:01.751785 4773 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/a81115d7-0fb0-4319-9705-0fae198ad70b-httpd-config\") on node \"crc\" DevicePath \"\"" Jan 20 18:50:01 crc kubenswrapper[4773]: I0120 18:50:01.751799 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j2hnc\" (UniqueName: \"kubernetes.io/projected/a81115d7-0fb0-4319-9705-0fae198ad70b-kube-api-access-j2hnc\") on node \"crc\" DevicePath \"\"" Jan 20 18:50:01 crc kubenswrapper[4773]: I0120 18:50:01.751818 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/a81115d7-0fb0-4319-9705-0fae198ad70b-config\") on node \"crc\" DevicePath \"\"" Jan 20 18:50:01 crc 
kubenswrapper[4773]: I0120 18:50:01.755543 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a81115d7-0fb0-4319-9705-0fae198ad70b-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "a81115d7-0fb0-4319-9705-0fae198ad70b" (UID: "a81115d7-0fb0-4319-9705-0fae198ad70b"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:50:01 crc kubenswrapper[4773]: I0120 18:50:01.854372 4773 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/a81115d7-0fb0-4319-9705-0fae198ad70b-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 18:50:01 crc kubenswrapper[4773]: I0120 18:50:01.969479 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Jan 20 18:50:01 crc kubenswrapper[4773]: W0120 18:50:01.970763 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf040c75f_a2cb_4bfe_9fd1_0105887fa6b4.slice/crio-db90180ef300acd66a3ffa9cf3c8fd1b285ebb49952c4b984cafb9f681302ce2 WatchSource:0}: Error finding container db90180ef300acd66a3ffa9cf3c8fd1b285ebb49952c4b984cafb9f681302ce2: Status 404 returned error can't find the container with id db90180ef300acd66a3ffa9cf3c8fd1b285ebb49952c4b984cafb9f681302ce2 Jan 20 18:50:02 crc kubenswrapper[4773]: I0120 18:50:02.270594 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"f040c75f-a2cb-4bfe-9fd1-0105887fa6b4","Type":"ContainerStarted","Data":"db90180ef300acd66a3ffa9cf3c8fd1b285ebb49952c4b984cafb9f681302ce2"} Jan 20 18:50:02 crc kubenswrapper[4773]: I0120 18:50:02.273131 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-86d844bb6-6q8ms" event={"ID":"a81115d7-0fb0-4319-9705-0fae198ad70b","Type":"ContainerDied","Data":"4e34f5d6513de50dbefb964db35642a2f245e6de8a45b2992d0938120feea1ea"} Jan 20 
18:50:02 crc kubenswrapper[4773]: I0120 18:50:02.273196 4773 scope.go:117] "RemoveContainer" containerID="60603e41f66c2ee2986bc21b5a42dcb1715cf77af700eb0e4234d5cc16918f16" Jan 20 18:50:02 crc kubenswrapper[4773]: I0120 18:50:02.273208 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-86d844bb6-6q8ms" Jan 20 18:50:02 crc kubenswrapper[4773]: I0120 18:50:02.293762 4773 scope.go:117] "RemoveContainer" containerID="e48d8470ac4794d65c703528edfbd96a6097216611310b92998ca2d3d1878dfe" Jan 20 18:50:02 crc kubenswrapper[4773]: I0120 18:50:02.306747 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-86d844bb6-6q8ms"] Jan 20 18:50:02 crc kubenswrapper[4773]: I0120 18:50:02.314081 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-86d844bb6-6q8ms"] Jan 20 18:50:03 crc kubenswrapper[4773]: I0120 18:50:03.461613 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a81115d7-0fb0-4319-9705-0fae198ad70b" path="/var/lib/kubelet/pods/a81115d7-0fb0-4319-9705-0fae198ad70b/volumes" Jan 20 18:50:03 crc kubenswrapper[4773]: I0120 18:50:03.565789 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Jan 20 18:50:06 crc kubenswrapper[4773]: I0120 18:50:06.567312 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-9b66d8476-cqhrd" podUID="49df8cea-026f-497b-baae-a6a09452aa3d" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.143:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.143:8443: connect: connection refused" Jan 20 18:50:07 crc kubenswrapper[4773]: I0120 18:50:07.999134 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-vphtt"] Jan 20 18:50:08 crc kubenswrapper[4773]: E0120 18:50:07.999727 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a81115d7-0fb0-4319-9705-0fae198ad70b" 
containerName="neutron-api" Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:07.999740 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="a81115d7-0fb0-4319-9705-0fae198ad70b" containerName="neutron-api" Jan 20 18:50:08 crc kubenswrapper[4773]: E0120 18:50:07.999754 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a81115d7-0fb0-4319-9705-0fae198ad70b" containerName="neutron-httpd" Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:07.999759 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="a81115d7-0fb0-4319-9705-0fae198ad70b" containerName="neutron-httpd" Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:07.999953 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="a81115d7-0fb0-4319-9705-0fae198ad70b" containerName="neutron-api" Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:07.999976 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="a81115d7-0fb0-4319-9705-0fae198ad70b" containerName="neutron-httpd" Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.000496 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-vphtt" Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.015494 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-vphtt"] Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.117266 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-jgwdl"] Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.118332 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-jgwdl" Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.127899 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-jgwdl"] Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.188984 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/86ae6a8c-2043-4e0f-a23f-43c998d3d9d7-operator-scripts\") pod \"nova-api-db-create-vphtt\" (UID: \"86ae6a8c-2043-4e0f-a23f-43c998d3d9d7\") " pod="openstack/nova-api-db-create-vphtt" Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.189050 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kmxm\" (UniqueName: \"kubernetes.io/projected/86ae6a8c-2043-4e0f-a23f-43c998d3d9d7-kube-api-access-8kmxm\") pod \"nova-api-db-create-vphtt\" (UID: \"86ae6a8c-2043-4e0f-a23f-43c998d3d9d7\") " pod="openstack/nova-api-db-create-vphtt" Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.224393 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-da16-account-create-update-nsb2n"] Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.225484 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-da16-account-create-update-nsb2n" Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.227994 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.233638 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-da16-account-create-update-nsb2n"] Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.297342 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/47dcb7c9-ffa7-46bc-b695-02aea6e679a1-operator-scripts\") pod \"nova-cell0-db-create-jgwdl\" (UID: \"47dcb7c9-ffa7-46bc-b695-02aea6e679a1\") " pod="openstack/nova-cell0-db-create-jgwdl" Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.297389 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7455911e-a1ad-442b-97b9-362496066bbf-operator-scripts\") pod \"nova-api-da16-account-create-update-nsb2n\" (UID: \"7455911e-a1ad-442b-97b9-362496066bbf\") " pod="openstack/nova-api-da16-account-create-update-nsb2n" Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.297449 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fp7t\" (UniqueName: \"kubernetes.io/projected/7455911e-a1ad-442b-97b9-362496066bbf-kube-api-access-4fp7t\") pod \"nova-api-da16-account-create-update-nsb2n\" (UID: \"7455911e-a1ad-442b-97b9-362496066bbf\") " pod="openstack/nova-api-da16-account-create-update-nsb2n" Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.297469 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42xgs\" (UniqueName: 
\"kubernetes.io/projected/47dcb7c9-ffa7-46bc-b695-02aea6e679a1-kube-api-access-42xgs\") pod \"nova-cell0-db-create-jgwdl\" (UID: \"47dcb7c9-ffa7-46bc-b695-02aea6e679a1\") " pod="openstack/nova-cell0-db-create-jgwdl" Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.297519 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/86ae6a8c-2043-4e0f-a23f-43c998d3d9d7-operator-scripts\") pod \"nova-api-db-create-vphtt\" (UID: \"86ae6a8c-2043-4e0f-a23f-43c998d3d9d7\") " pod="openstack/nova-api-db-create-vphtt" Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.297561 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kmxm\" (UniqueName: \"kubernetes.io/projected/86ae6a8c-2043-4e0f-a23f-43c998d3d9d7-kube-api-access-8kmxm\") pod \"nova-api-db-create-vphtt\" (UID: \"86ae6a8c-2043-4e0f-a23f-43c998d3d9d7\") " pod="openstack/nova-api-db-create-vphtt" Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.298789 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/86ae6a8c-2043-4e0f-a23f-43c998d3d9d7-operator-scripts\") pod \"nova-api-db-create-vphtt\" (UID: \"86ae6a8c-2043-4e0f-a23f-43c998d3d9d7\") " pod="openstack/nova-api-db-create-vphtt" Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.349080 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-f2d2j"] Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.351668 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-f2d2j" Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.359741 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kmxm\" (UniqueName: \"kubernetes.io/projected/86ae6a8c-2043-4e0f-a23f-43c998d3d9d7-kube-api-access-8kmxm\") pod \"nova-api-db-create-vphtt\" (UID: \"86ae6a8c-2043-4e0f-a23f-43c998d3d9d7\") " pod="openstack/nova-api-db-create-vphtt" Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.372601 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-f2d2j"] Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.399243 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/833eac91-4269-4e1e-9923-8dd8ed2276dc-operator-scripts\") pod \"nova-cell1-db-create-f2d2j\" (UID: \"833eac91-4269-4e1e-9923-8dd8ed2276dc\") " pod="openstack/nova-cell1-db-create-f2d2j" Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.399329 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clxwl\" (UniqueName: \"kubernetes.io/projected/833eac91-4269-4e1e-9923-8dd8ed2276dc-kube-api-access-clxwl\") pod \"nova-cell1-db-create-f2d2j\" (UID: \"833eac91-4269-4e1e-9923-8dd8ed2276dc\") " pod="openstack/nova-cell1-db-create-f2d2j" Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.399361 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fp7t\" (UniqueName: \"kubernetes.io/projected/7455911e-a1ad-442b-97b9-362496066bbf-kube-api-access-4fp7t\") pod \"nova-api-da16-account-create-update-nsb2n\" (UID: \"7455911e-a1ad-442b-97b9-362496066bbf\") " pod="openstack/nova-api-da16-account-create-update-nsb2n" Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.399388 4773 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-42xgs\" (UniqueName: \"kubernetes.io/projected/47dcb7c9-ffa7-46bc-b695-02aea6e679a1-kube-api-access-42xgs\") pod \"nova-cell0-db-create-jgwdl\" (UID: \"47dcb7c9-ffa7-46bc-b695-02aea6e679a1\") " pod="openstack/nova-cell0-db-create-jgwdl" Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.399551 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/47dcb7c9-ffa7-46bc-b695-02aea6e679a1-operator-scripts\") pod \"nova-cell0-db-create-jgwdl\" (UID: \"47dcb7c9-ffa7-46bc-b695-02aea6e679a1\") " pod="openstack/nova-cell0-db-create-jgwdl" Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.399578 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7455911e-a1ad-442b-97b9-362496066bbf-operator-scripts\") pod \"nova-api-da16-account-create-update-nsb2n\" (UID: \"7455911e-a1ad-442b-97b9-362496066bbf\") " pod="openstack/nova-api-da16-account-create-update-nsb2n" Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.400356 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7455911e-a1ad-442b-97b9-362496066bbf-operator-scripts\") pod \"nova-api-da16-account-create-update-nsb2n\" (UID: \"7455911e-a1ad-442b-97b9-362496066bbf\") " pod="openstack/nova-api-da16-account-create-update-nsb2n" Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.400919 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/47dcb7c9-ffa7-46bc-b695-02aea6e679a1-operator-scripts\") pod \"nova-cell0-db-create-jgwdl\" (UID: \"47dcb7c9-ffa7-46bc-b695-02aea6e679a1\") " pod="openstack/nova-cell0-db-create-jgwdl" Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.424391 4773 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/nova-cell0-79ae-account-create-update-kgx64"] Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.425682 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-79ae-account-create-update-kgx64" Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.428384 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.434219 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-79ae-account-create-update-kgx64"] Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.437099 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42xgs\" (UniqueName: \"kubernetes.io/projected/47dcb7c9-ffa7-46bc-b695-02aea6e679a1-kube-api-access-42xgs\") pod \"nova-cell0-db-create-jgwdl\" (UID: \"47dcb7c9-ffa7-46bc-b695-02aea6e679a1\") " pod="openstack/nova-cell0-db-create-jgwdl" Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.438171 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-jgwdl" Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.437029 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fp7t\" (UniqueName: \"kubernetes.io/projected/7455911e-a1ad-442b-97b9-362496066bbf-kube-api-access-4fp7t\") pod \"nova-api-da16-account-create-update-nsb2n\" (UID: \"7455911e-a1ad-442b-97b9-362496066bbf\") " pod="openstack/nova-api-da16-account-create-update-nsb2n" Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.501089 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/833eac91-4269-4e1e-9923-8dd8ed2276dc-operator-scripts\") pod \"nova-cell1-db-create-f2d2j\" (UID: \"833eac91-4269-4e1e-9923-8dd8ed2276dc\") " pod="openstack/nova-cell1-db-create-f2d2j" Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.501150 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-clxwl\" (UniqueName: \"kubernetes.io/projected/833eac91-4269-4e1e-9923-8dd8ed2276dc-kube-api-access-clxwl\") pod \"nova-cell1-db-create-f2d2j\" (UID: \"833eac91-4269-4e1e-9923-8dd8ed2276dc\") " pod="openstack/nova-cell1-db-create-f2d2j" Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.501198 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2bd3a449-dc14-46ca-8e19-64d0a282483e-operator-scripts\") pod \"nova-cell0-79ae-account-create-update-kgx64\" (UID: \"2bd3a449-dc14-46ca-8e19-64d0a282483e\") " pod="openstack/nova-cell0-79ae-account-create-update-kgx64" Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.501252 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5m2hv\" (UniqueName: 
\"kubernetes.io/projected/2bd3a449-dc14-46ca-8e19-64d0a282483e-kube-api-access-5m2hv\") pod \"nova-cell0-79ae-account-create-update-kgx64\" (UID: \"2bd3a449-dc14-46ca-8e19-64d0a282483e\") " pod="openstack/nova-cell0-79ae-account-create-update-kgx64" Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.501843 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/833eac91-4269-4e1e-9923-8dd8ed2276dc-operator-scripts\") pod \"nova-cell1-db-create-f2d2j\" (UID: \"833eac91-4269-4e1e-9923-8dd8ed2276dc\") " pod="openstack/nova-cell1-db-create-f2d2j" Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.521825 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-clxwl\" (UniqueName: \"kubernetes.io/projected/833eac91-4269-4e1e-9923-8dd8ed2276dc-kube-api-access-clxwl\") pod \"nova-cell1-db-create-f2d2j\" (UID: \"833eac91-4269-4e1e-9923-8dd8ed2276dc\") " pod="openstack/nova-cell1-db-create-f2d2j" Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.545881 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-da16-account-create-update-nsb2n" Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.602661 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2bd3a449-dc14-46ca-8e19-64d0a282483e-operator-scripts\") pod \"nova-cell0-79ae-account-create-update-kgx64\" (UID: \"2bd3a449-dc14-46ca-8e19-64d0a282483e\") " pod="openstack/nova-cell0-79ae-account-create-update-kgx64" Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.602713 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5m2hv\" (UniqueName: \"kubernetes.io/projected/2bd3a449-dc14-46ca-8e19-64d0a282483e-kube-api-access-5m2hv\") pod \"nova-cell0-79ae-account-create-update-kgx64\" (UID: \"2bd3a449-dc14-46ca-8e19-64d0a282483e\") " pod="openstack/nova-cell0-79ae-account-create-update-kgx64" Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.608433 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2bd3a449-dc14-46ca-8e19-64d0a282483e-operator-scripts\") pod \"nova-cell0-79ae-account-create-update-kgx64\" (UID: \"2bd3a449-dc14-46ca-8e19-64d0a282483e\") " pod="openstack/nova-cell0-79ae-account-create-update-kgx64" Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.615561 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-vphtt" Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.629202 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-33c7-account-create-update-ck7lk"] Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.630608 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-33c7-account-create-update-ck7lk" Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.632443 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.637439 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-33c7-account-create-update-ck7lk"] Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.637886 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5m2hv\" (UniqueName: \"kubernetes.io/projected/2bd3a449-dc14-46ca-8e19-64d0a282483e-kube-api-access-5m2hv\") pod \"nova-cell0-79ae-account-create-update-kgx64\" (UID: \"2bd3a449-dc14-46ca-8e19-64d0a282483e\") " pod="openstack/nova-cell0-79ae-account-create-update-kgx64" Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.704215 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8t78\" (UniqueName: \"kubernetes.io/projected/f4f47b18-303f-415d-8bf8-c1f7a075b747-kube-api-access-w8t78\") pod \"nova-cell1-33c7-account-create-update-ck7lk\" (UID: \"f4f47b18-303f-415d-8bf8-c1f7a075b747\") " pod="openstack/nova-cell1-33c7-account-create-update-ck7lk" Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.704344 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f4f47b18-303f-415d-8bf8-c1f7a075b747-operator-scripts\") pod \"nova-cell1-33c7-account-create-update-ck7lk\" (UID: \"f4f47b18-303f-415d-8bf8-c1f7a075b747\") " pod="openstack/nova-cell1-33c7-account-create-update-ck7lk" Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.715018 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-f2d2j" Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.806902 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8t78\" (UniqueName: \"kubernetes.io/projected/f4f47b18-303f-415d-8bf8-c1f7a075b747-kube-api-access-w8t78\") pod \"nova-cell1-33c7-account-create-update-ck7lk\" (UID: \"f4f47b18-303f-415d-8bf8-c1f7a075b747\") " pod="openstack/nova-cell1-33c7-account-create-update-ck7lk" Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.807076 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f4f47b18-303f-415d-8bf8-c1f7a075b747-operator-scripts\") pod \"nova-cell1-33c7-account-create-update-ck7lk\" (UID: \"f4f47b18-303f-415d-8bf8-c1f7a075b747\") " pod="openstack/nova-cell1-33c7-account-create-update-ck7lk" Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.807186 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-79ae-account-create-update-kgx64" Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.807806 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f4f47b18-303f-415d-8bf8-c1f7a075b747-operator-scripts\") pod \"nova-cell1-33c7-account-create-update-ck7lk\" (UID: \"f4f47b18-303f-415d-8bf8-c1f7a075b747\") " pod="openstack/nova-cell1-33c7-account-create-update-ck7lk" Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.827733 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8t78\" (UniqueName: \"kubernetes.io/projected/f4f47b18-303f-415d-8bf8-c1f7a075b747-kube-api-access-w8t78\") pod \"nova-cell1-33c7-account-create-update-ck7lk\" (UID: \"f4f47b18-303f-415d-8bf8-c1f7a075b747\") " pod="openstack/nova-cell1-33c7-account-create-update-ck7lk" Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.852771 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Jan 20 18:50:08 crc kubenswrapper[4773]: I0120 18:50:08.982306 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-33c7-account-create-update-ck7lk" Jan 20 18:50:12 crc kubenswrapper[4773]: I0120 18:50:12.209101 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Jan 20 18:50:13 crc kubenswrapper[4773]: I0120 18:50:13.880830 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-vphtt"] Jan 20 18:50:14 crc kubenswrapper[4773]: I0120 18:50:14.029830 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-jgwdl"] Jan 20 18:50:14 crc kubenswrapper[4773]: I0120 18:50:14.132694 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-f2d2j"] Jan 20 18:50:14 crc kubenswrapper[4773]: I0120 18:50:14.145297 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-79ae-account-create-update-kgx64"] Jan 20 18:50:14 crc kubenswrapper[4773]: I0120 18:50:14.152801 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-da16-account-create-update-nsb2n"] Jan 20 18:50:14 crc kubenswrapper[4773]: I0120 18:50:14.295983 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-33c7-account-create-update-ck7lk"] Jan 20 18:50:14 crc kubenswrapper[4773]: I0120 18:50:14.400794 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-jgwdl" event={"ID":"47dcb7c9-ffa7-46bc-b695-02aea6e679a1","Type":"ContainerStarted","Data":"2ccbf39de56ce437fa73601c164599a09e48be8a2f2534b1c75fa3d80b294c65"} Jan 20 18:50:14 crc kubenswrapper[4773]: I0120 18:50:14.400835 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-jgwdl" event={"ID":"47dcb7c9-ffa7-46bc-b695-02aea6e679a1","Type":"ContainerStarted","Data":"efbcae37c3969e8bca06ecc2eabe86724c6d9256d71d35fd4d834b35496de1dc"} Jan 20 18:50:14 crc kubenswrapper[4773]: I0120 18:50:14.402610 4773 generic.go:334] "Generic (PLEG): 
container finished" podID="86ae6a8c-2043-4e0f-a23f-43c998d3d9d7" containerID="33a44114454454a182e314a103e4daecf90ecb7caed9c7572f0056b58d9567e3" exitCode=0 Jan 20 18:50:14 crc kubenswrapper[4773]: I0120 18:50:14.402689 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-vphtt" event={"ID":"86ae6a8c-2043-4e0f-a23f-43c998d3d9d7","Type":"ContainerDied","Data":"33a44114454454a182e314a103e4daecf90ecb7caed9c7572f0056b58d9567e3"} Jan 20 18:50:14 crc kubenswrapper[4773]: I0120 18:50:14.402715 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-vphtt" event={"ID":"86ae6a8c-2043-4e0f-a23f-43c998d3d9d7","Type":"ContainerStarted","Data":"b3a20b02dca0f94d69c52d5cde14a0b17023d664d2e0f60c310212d2d641b2e3"} Jan 20 18:50:14 crc kubenswrapper[4773]: I0120 18:50:14.405127 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"f040c75f-a2cb-4bfe-9fd1-0105887fa6b4","Type":"ContainerStarted","Data":"2c66796b42027cddb54ebf14b42a3ad401508203b482b5599165d709fb2ff2f1"} Jan 20 18:50:14 crc kubenswrapper[4773]: I0120 18:50:14.407777 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-da16-account-create-update-nsb2n" event={"ID":"7455911e-a1ad-442b-97b9-362496066bbf","Type":"ContainerStarted","Data":"ef9d86fc8f98618790c525e7960f8bfe056b8e5e834ec2dbd2285ecb0f1d9ce3"} Jan 20 18:50:14 crc kubenswrapper[4773]: I0120 18:50:14.407813 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-da16-account-create-update-nsb2n" event={"ID":"7455911e-a1ad-442b-97b9-362496066bbf","Type":"ContainerStarted","Data":"7ac71ed6da12d2a976b878b9fa0faeca3a2bd030069ee0d9b4520b6b973abc08"} Jan 20 18:50:14 crc kubenswrapper[4773]: I0120 18:50:14.410131 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-79ae-account-create-update-kgx64" 
event={"ID":"2bd3a449-dc14-46ca-8e19-64d0a282483e","Type":"ContainerStarted","Data":"9a0076ae19fbe6544a21612cf495b50f269fad130f0414b9b8f4c443dba234b3"} Jan 20 18:50:14 crc kubenswrapper[4773]: I0120 18:50:14.410300 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-79ae-account-create-update-kgx64" event={"ID":"2bd3a449-dc14-46ca-8e19-64d0a282483e","Type":"ContainerStarted","Data":"17f0c29c50eec97b536a2eb64b28c40b7df6c9fba7563133e287c40d3178ec5b"} Jan 20 18:50:14 crc kubenswrapper[4773]: I0120 18:50:14.413184 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-33c7-account-create-update-ck7lk" event={"ID":"f4f47b18-303f-415d-8bf8-c1f7a075b747","Type":"ContainerStarted","Data":"9543425a6b98ecb5cb8e49dc0331eeeff3eaa05df6743d681a9e2302a8272c7c"} Jan 20 18:50:14 crc kubenswrapper[4773]: I0120 18:50:14.415125 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-f2d2j" event={"ID":"833eac91-4269-4e1e-9923-8dd8ed2276dc","Type":"ContainerStarted","Data":"7358a078ad55ead91e8865f0bf5d80f239dd741b49799df9b4c64f0d2143c92a"} Jan 20 18:50:14 crc kubenswrapper[4773]: I0120 18:50:14.415166 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-f2d2j" event={"ID":"833eac91-4269-4e1e-9923-8dd8ed2276dc","Type":"ContainerStarted","Data":"b755badae46a08f9cf38c4037c21c34427dd419272bb54ab07c468cc7dd674cf"} Jan 20 18:50:14 crc kubenswrapper[4773]: I0120 18:50:14.426495 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-db-create-jgwdl" podStartSLOduration=6.426477374 podStartE2EDuration="6.426477374s" podCreationTimestamp="2026-01-20 18:50:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:50:14.419261902 +0000 UTC m=+1207.341074926" watchObservedRunningTime="2026-01-20 18:50:14.426477374 +0000 UTC 
m=+1207.348290398" Jan 20 18:50:14 crc kubenswrapper[4773]: I0120 18:50:14.457219 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-f2d2j" podStartSLOduration=6.457198738 podStartE2EDuration="6.457198738s" podCreationTimestamp="2026-01-20 18:50:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:50:14.450192771 +0000 UTC m=+1207.372005795" watchObservedRunningTime="2026-01-20 18:50:14.457198738 +0000 UTC m=+1207.379011762" Jan 20 18:50:14 crc kubenswrapper[4773]: I0120 18:50:14.471886 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=1.864086634 podStartE2EDuration="13.47186401s" podCreationTimestamp="2026-01-20 18:50:01 +0000 UTC" firstStartedPulling="2026-01-20 18:50:01.973465391 +0000 UTC m=+1194.895278425" lastFinishedPulling="2026-01-20 18:50:13.581242777 +0000 UTC m=+1206.503055801" observedRunningTime="2026-01-20 18:50:14.466067501 +0000 UTC m=+1207.387880525" watchObservedRunningTime="2026-01-20 18:50:14.47186401 +0000 UTC m=+1207.393677034" Jan 20 18:50:14 crc kubenswrapper[4773]: I0120 18:50:14.493749 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-79ae-account-create-update-kgx64" podStartSLOduration=6.493732592 podStartE2EDuration="6.493732592s" podCreationTimestamp="2026-01-20 18:50:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:50:14.4836012 +0000 UTC m=+1207.405414234" watchObservedRunningTime="2026-01-20 18:50:14.493732592 +0000 UTC m=+1207.415545616" Jan 20 18:50:14 crc kubenswrapper[4773]: I0120 18:50:14.516499 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-da16-account-create-update-nsb2n" 
podStartSLOduration=6.516480127 podStartE2EDuration="6.516480127s" podCreationTimestamp="2026-01-20 18:50:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:50:14.506168399 +0000 UTC m=+1207.427981443" watchObservedRunningTime="2026-01-20 18:50:14.516480127 +0000 UTC m=+1207.438293151" Jan 20 18:50:15 crc kubenswrapper[4773]: I0120 18:50:15.426467 4773 generic.go:334] "Generic (PLEG): container finished" podID="7455911e-a1ad-442b-97b9-362496066bbf" containerID="ef9d86fc8f98618790c525e7960f8bfe056b8e5e834ec2dbd2285ecb0f1d9ce3" exitCode=0 Jan 20 18:50:15 crc kubenswrapper[4773]: I0120 18:50:15.426571 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-da16-account-create-update-nsb2n" event={"ID":"7455911e-a1ad-442b-97b9-362496066bbf","Type":"ContainerDied","Data":"ef9d86fc8f98618790c525e7960f8bfe056b8e5e834ec2dbd2285ecb0f1d9ce3"} Jan 20 18:50:15 crc kubenswrapper[4773]: I0120 18:50:15.437342 4773 generic.go:334] "Generic (PLEG): container finished" podID="2bd3a449-dc14-46ca-8e19-64d0a282483e" containerID="9a0076ae19fbe6544a21612cf495b50f269fad130f0414b9b8f4c443dba234b3" exitCode=0 Jan 20 18:50:15 crc kubenswrapper[4773]: I0120 18:50:15.437407 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-79ae-account-create-update-kgx64" event={"ID":"2bd3a449-dc14-46ca-8e19-64d0a282483e","Type":"ContainerDied","Data":"9a0076ae19fbe6544a21612cf495b50f269fad130f0414b9b8f4c443dba234b3"} Jan 20 18:50:15 crc kubenswrapper[4773]: I0120 18:50:15.439587 4773 generic.go:334] "Generic (PLEG): container finished" podID="f4f47b18-303f-415d-8bf8-c1f7a075b747" containerID="6bc940e28a0f00ff3974f58aa4d5ae733c405db0f07aec2ee2eeb84b82a418e2" exitCode=0 Jan 20 18:50:15 crc kubenswrapper[4773]: I0120 18:50:15.439626 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-33c7-account-create-update-ck7lk" 
event={"ID":"f4f47b18-303f-415d-8bf8-c1f7a075b747","Type":"ContainerDied","Data":"6bc940e28a0f00ff3974f58aa4d5ae733c405db0f07aec2ee2eeb84b82a418e2"} Jan 20 18:50:15 crc kubenswrapper[4773]: I0120 18:50:15.441522 4773 generic.go:334] "Generic (PLEG): container finished" podID="833eac91-4269-4e1e-9923-8dd8ed2276dc" containerID="7358a078ad55ead91e8865f0bf5d80f239dd741b49799df9b4c64f0d2143c92a" exitCode=0 Jan 20 18:50:15 crc kubenswrapper[4773]: I0120 18:50:15.441561 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-f2d2j" event={"ID":"833eac91-4269-4e1e-9923-8dd8ed2276dc","Type":"ContainerDied","Data":"7358a078ad55ead91e8865f0bf5d80f239dd741b49799df9b4c64f0d2143c92a"} Jan 20 18:50:15 crc kubenswrapper[4773]: I0120 18:50:15.451335 4773 generic.go:334] "Generic (PLEG): container finished" podID="47dcb7c9-ffa7-46bc-b695-02aea6e679a1" containerID="2ccbf39de56ce437fa73601c164599a09e48be8a2f2534b1c75fa3d80b294c65" exitCode=0 Jan 20 18:50:15 crc kubenswrapper[4773]: I0120 18:50:15.464080 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-jgwdl" event={"ID":"47dcb7c9-ffa7-46bc-b695-02aea6e679a1","Type":"ContainerDied","Data":"2ccbf39de56ce437fa73601c164599a09e48be8a2f2534b1c75fa3d80b294c65"} Jan 20 18:50:15 crc kubenswrapper[4773]: I0120 18:50:15.553381 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 20 18:50:15 crc kubenswrapper[4773]: I0120 18:50:15.553690 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f5814cea-a704-4de4-9205-d65cde58c777" containerName="ceilometer-central-agent" containerID="cri-o://4dfec247bd1a7c1b2b638007722809007c65536e0d65164425d8cbabf5efbd77" gracePeriod=30 Jan 20 18:50:15 crc kubenswrapper[4773]: I0120 18:50:15.554227 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f5814cea-a704-4de4-9205-d65cde58c777" 
containerName="proxy-httpd" containerID="cri-o://cb50e976a0c654a7216030978727b5d69177f7ab4404e98e26a34b6e55333be1" gracePeriod=30 Jan 20 18:50:15 crc kubenswrapper[4773]: I0120 18:50:15.554332 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f5814cea-a704-4de4-9205-d65cde58c777" containerName="sg-core" containerID="cri-o://2848aa1c2648bf46c2ea04e44fc541b3472a774f4d87ca163f256a1a307e1862" gracePeriod=30 Jan 20 18:50:15 crc kubenswrapper[4773]: I0120 18:50:15.554402 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="f5814cea-a704-4de4-9205-d65cde58c777" containerName="ceilometer-notification-agent" containerID="cri-o://7a3ab5105ad590a6668a70d5db1e361f420e137f18ebd3b24303e69e5a972b7e" gracePeriod=30 Jan 20 18:50:15 crc kubenswrapper[4773]: I0120 18:50:15.841095 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-vphtt" Jan 20 18:50:16 crc kubenswrapper[4773]: I0120 18:50:16.039490 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8kmxm\" (UniqueName: \"kubernetes.io/projected/86ae6a8c-2043-4e0f-a23f-43c998d3d9d7-kube-api-access-8kmxm\") pod \"86ae6a8c-2043-4e0f-a23f-43c998d3d9d7\" (UID: \"86ae6a8c-2043-4e0f-a23f-43c998d3d9d7\") " Jan 20 18:50:16 crc kubenswrapper[4773]: I0120 18:50:16.039570 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/86ae6a8c-2043-4e0f-a23f-43c998d3d9d7-operator-scripts\") pod \"86ae6a8c-2043-4e0f-a23f-43c998d3d9d7\" (UID: \"86ae6a8c-2043-4e0f-a23f-43c998d3d9d7\") " Jan 20 18:50:16 crc kubenswrapper[4773]: I0120 18:50:16.040726 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86ae6a8c-2043-4e0f-a23f-43c998d3d9d7-operator-scripts" (OuterVolumeSpecName: 
"operator-scripts") pod "86ae6a8c-2043-4e0f-a23f-43c998d3d9d7" (UID: "86ae6a8c-2043-4e0f-a23f-43c998d3d9d7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:50:16 crc kubenswrapper[4773]: I0120 18:50:16.045046 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86ae6a8c-2043-4e0f-a23f-43c998d3d9d7-kube-api-access-8kmxm" (OuterVolumeSpecName: "kube-api-access-8kmxm") pod "86ae6a8c-2043-4e0f-a23f-43c998d3d9d7" (UID: "86ae6a8c-2043-4e0f-a23f-43c998d3d9d7"). InnerVolumeSpecName "kube-api-access-8kmxm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:50:16 crc kubenswrapper[4773]: I0120 18:50:16.142645 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8kmxm\" (UniqueName: \"kubernetes.io/projected/86ae6a8c-2043-4e0f-a23f-43c998d3d9d7-kube-api-access-8kmxm\") on node \"crc\" DevicePath \"\"" Jan 20 18:50:16 crc kubenswrapper[4773]: I0120 18:50:16.142683 4773 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/86ae6a8c-2043-4e0f-a23f-43c998d3d9d7-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 18:50:16 crc kubenswrapper[4773]: I0120 18:50:16.462900 4773 generic.go:334] "Generic (PLEG): container finished" podID="f5814cea-a704-4de4-9205-d65cde58c777" containerID="cb50e976a0c654a7216030978727b5d69177f7ab4404e98e26a34b6e55333be1" exitCode=0 Jan 20 18:50:16 crc kubenswrapper[4773]: I0120 18:50:16.463178 4773 generic.go:334] "Generic (PLEG): container finished" podID="f5814cea-a704-4de4-9205-d65cde58c777" containerID="2848aa1c2648bf46c2ea04e44fc541b3472a774f4d87ca163f256a1a307e1862" exitCode=2 Jan 20 18:50:16 crc kubenswrapper[4773]: I0120 18:50:16.463186 4773 generic.go:334] "Generic (PLEG): container finished" podID="f5814cea-a704-4de4-9205-d65cde58c777" containerID="4dfec247bd1a7c1b2b638007722809007c65536e0d65164425d8cbabf5efbd77" exitCode=0 Jan 20 
18:50:16 crc kubenswrapper[4773]: I0120 18:50:16.463221 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f5814cea-a704-4de4-9205-d65cde58c777","Type":"ContainerDied","Data":"cb50e976a0c654a7216030978727b5d69177f7ab4404e98e26a34b6e55333be1"} Jan 20 18:50:16 crc kubenswrapper[4773]: I0120 18:50:16.463263 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f5814cea-a704-4de4-9205-d65cde58c777","Type":"ContainerDied","Data":"2848aa1c2648bf46c2ea04e44fc541b3472a774f4d87ca163f256a1a307e1862"} Jan 20 18:50:16 crc kubenswrapper[4773]: I0120 18:50:16.463275 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f5814cea-a704-4de4-9205-d65cde58c777","Type":"ContainerDied","Data":"4dfec247bd1a7c1b2b638007722809007c65536e0d65164425d8cbabf5efbd77"} Jan 20 18:50:16 crc kubenswrapper[4773]: I0120 18:50:16.465187 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-vphtt" Jan 20 18:50:16 crc kubenswrapper[4773]: I0120 18:50:16.466060 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-vphtt" event={"ID":"86ae6a8c-2043-4e0f-a23f-43c998d3d9d7","Type":"ContainerDied","Data":"b3a20b02dca0f94d69c52d5cde14a0b17023d664d2e0f60c310212d2d641b2e3"} Jan 20 18:50:16 crc kubenswrapper[4773]: I0120 18:50:16.466119 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b3a20b02dca0f94d69c52d5cde14a0b17023d664d2e0f60c310212d2d641b2e3" Jan 20 18:50:16 crc kubenswrapper[4773]: I0120 18:50:16.567747 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-9b66d8476-cqhrd" podUID="49df8cea-026f-497b-baae-a6a09452aa3d" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.143:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.143:8443: connect: connection refused" Jan 20 18:50:16 
crc kubenswrapper[4773]: I0120 18:50:16.567907 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-9b66d8476-cqhrd" Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.103860 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-f2d2j" Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.272609 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/833eac91-4269-4e1e-9923-8dd8ed2276dc-operator-scripts\") pod \"833eac91-4269-4e1e-9923-8dd8ed2276dc\" (UID: \"833eac91-4269-4e1e-9923-8dd8ed2276dc\") " Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.272901 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-clxwl\" (UniqueName: \"kubernetes.io/projected/833eac91-4269-4e1e-9923-8dd8ed2276dc-kube-api-access-clxwl\") pod \"833eac91-4269-4e1e-9923-8dd8ed2276dc\" (UID: \"833eac91-4269-4e1e-9923-8dd8ed2276dc\") " Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.273498 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/833eac91-4269-4e1e-9923-8dd8ed2276dc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "833eac91-4269-4e1e-9923-8dd8ed2276dc" (UID: "833eac91-4269-4e1e-9923-8dd8ed2276dc"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.274487 4773 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/833eac91-4269-4e1e-9923-8dd8ed2276dc-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.285997 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/833eac91-4269-4e1e-9923-8dd8ed2276dc-kube-api-access-clxwl" (OuterVolumeSpecName: "kube-api-access-clxwl") pod "833eac91-4269-4e1e-9923-8dd8ed2276dc" (UID: "833eac91-4269-4e1e-9923-8dd8ed2276dc"). InnerVolumeSpecName "kube-api-access-clxwl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.316895 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-33c7-account-create-update-ck7lk" Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.325036 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-79ae-account-create-update-kgx64" Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.330306 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-jgwdl" Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.376033 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-clxwl\" (UniqueName: \"kubernetes.io/projected/833eac91-4269-4e1e-9923-8dd8ed2276dc-kube-api-access-clxwl\") on node \"crc\" DevicePath \"\"" Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.377084 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-da16-account-create-update-nsb2n" Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.476842 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5m2hv\" (UniqueName: \"kubernetes.io/projected/2bd3a449-dc14-46ca-8e19-64d0a282483e-kube-api-access-5m2hv\") pod \"2bd3a449-dc14-46ca-8e19-64d0a282483e\" (UID: \"2bd3a449-dc14-46ca-8e19-64d0a282483e\") " Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.476885 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w8t78\" (UniqueName: \"kubernetes.io/projected/f4f47b18-303f-415d-8bf8-c1f7a075b747-kube-api-access-w8t78\") pod \"f4f47b18-303f-415d-8bf8-c1f7a075b747\" (UID: \"f4f47b18-303f-415d-8bf8-c1f7a075b747\") " Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.476946 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f4f47b18-303f-415d-8bf8-c1f7a075b747-operator-scripts\") pod \"f4f47b18-303f-415d-8bf8-c1f7a075b747\" (UID: \"f4f47b18-303f-415d-8bf8-c1f7a075b747\") " Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.477047 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2bd3a449-dc14-46ca-8e19-64d0a282483e-operator-scripts\") pod \"2bd3a449-dc14-46ca-8e19-64d0a282483e\" (UID: \"2bd3a449-dc14-46ca-8e19-64d0a282483e\") " Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.477072 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-42xgs\" (UniqueName: \"kubernetes.io/projected/47dcb7c9-ffa7-46bc-b695-02aea6e679a1-kube-api-access-42xgs\") pod \"47dcb7c9-ffa7-46bc-b695-02aea6e679a1\" (UID: \"47dcb7c9-ffa7-46bc-b695-02aea6e679a1\") " Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.477162 4773 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/47dcb7c9-ffa7-46bc-b695-02aea6e679a1-operator-scripts\") pod \"47dcb7c9-ffa7-46bc-b695-02aea6e679a1\" (UID: \"47dcb7c9-ffa7-46bc-b695-02aea6e679a1\") " Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.477741 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2bd3a449-dc14-46ca-8e19-64d0a282483e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2bd3a449-dc14-46ca-8e19-64d0a282483e" (UID: "2bd3a449-dc14-46ca-8e19-64d0a282483e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.478254 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4f47b18-303f-415d-8bf8-c1f7a075b747-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f4f47b18-303f-415d-8bf8-c1f7a075b747" (UID: "f4f47b18-303f-415d-8bf8-c1f7a075b747"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.478673 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/47dcb7c9-ffa7-46bc-b695-02aea6e679a1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "47dcb7c9-ffa7-46bc-b695-02aea6e679a1" (UID: "47dcb7c9-ffa7-46bc-b695-02aea6e679a1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.483732 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47dcb7c9-ffa7-46bc-b695-02aea6e679a1-kube-api-access-42xgs" (OuterVolumeSpecName: "kube-api-access-42xgs") pod "47dcb7c9-ffa7-46bc-b695-02aea6e679a1" (UID: "47dcb7c9-ffa7-46bc-b695-02aea6e679a1"). 
InnerVolumeSpecName "kube-api-access-42xgs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.490193 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4f47b18-303f-415d-8bf8-c1f7a075b747-kube-api-access-w8t78" (OuterVolumeSpecName: "kube-api-access-w8t78") pod "f4f47b18-303f-415d-8bf8-c1f7a075b747" (UID: "f4f47b18-303f-415d-8bf8-c1f7a075b747"). InnerVolumeSpecName "kube-api-access-w8t78". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.496350 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2bd3a449-dc14-46ca-8e19-64d0a282483e-kube-api-access-5m2hv" (OuterVolumeSpecName: "kube-api-access-5m2hv") pod "2bd3a449-dc14-46ca-8e19-64d0a282483e" (UID: "2bd3a449-dc14-46ca-8e19-64d0a282483e"). InnerVolumeSpecName "kube-api-access-5m2hv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.497542 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-79ae-account-create-update-kgx64" Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.500529 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-79ae-account-create-update-kgx64" event={"ID":"2bd3a449-dc14-46ca-8e19-64d0a282483e","Type":"ContainerDied","Data":"17f0c29c50eec97b536a2eb64b28c40b7df6c9fba7563133e287c40d3178ec5b"} Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.500571 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="17f0c29c50eec97b536a2eb64b28c40b7df6c9fba7563133e287c40d3178ec5b" Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.504666 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-f2d2j" event={"ID":"833eac91-4269-4e1e-9923-8dd8ed2276dc","Type":"ContainerDied","Data":"b755badae46a08f9cf38c4037c21c34427dd419272bb54ab07c468cc7dd674cf"} Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.504705 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b755badae46a08f9cf38c4037c21c34427dd419272bb54ab07c468cc7dd674cf" Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.504793 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-f2d2j" Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.508201 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-33c7-account-create-update-ck7lk" event={"ID":"f4f47b18-303f-415d-8bf8-c1f7a075b747","Type":"ContainerDied","Data":"9543425a6b98ecb5cb8e49dc0331eeeff3eaa05df6743d681a9e2302a8272c7c"} Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.508315 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9543425a6b98ecb5cb8e49dc0331eeeff3eaa05df6743d681a9e2302a8272c7c" Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.508430 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-33c7-account-create-update-ck7lk" Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.510309 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-jgwdl" Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.510487 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-jgwdl" event={"ID":"47dcb7c9-ffa7-46bc-b695-02aea6e679a1","Type":"ContainerDied","Data":"efbcae37c3969e8bca06ecc2eabe86724c6d9256d71d35fd4d834b35496de1dc"} Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.510643 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="efbcae37c3969e8bca06ecc2eabe86724c6d9256d71d35fd4d834b35496de1dc" Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.514986 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-da16-account-create-update-nsb2n" event={"ID":"7455911e-a1ad-442b-97b9-362496066bbf","Type":"ContainerDied","Data":"7ac71ed6da12d2a976b878b9fa0faeca3a2bd030069ee0d9b4520b6b973abc08"} Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.515055 4773 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="7ac71ed6da12d2a976b878b9fa0faeca3a2bd030069ee0d9b4520b6b973abc08" Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.515180 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-da16-account-create-update-nsb2n" Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.579593 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7455911e-a1ad-442b-97b9-362496066bbf-operator-scripts\") pod \"7455911e-a1ad-442b-97b9-362496066bbf\" (UID: \"7455911e-a1ad-442b-97b9-362496066bbf\") " Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.580028 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4fp7t\" (UniqueName: \"kubernetes.io/projected/7455911e-a1ad-442b-97b9-362496066bbf-kube-api-access-4fp7t\") pod \"7455911e-a1ad-442b-97b9-362496066bbf\" (UID: \"7455911e-a1ad-442b-97b9-362496066bbf\") " Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.580480 4773 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2bd3a449-dc14-46ca-8e19-64d0a282483e-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.580557 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-42xgs\" (UniqueName: \"kubernetes.io/projected/47dcb7c9-ffa7-46bc-b695-02aea6e679a1-kube-api-access-42xgs\") on node \"crc\" DevicePath \"\"" Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.580620 4773 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/47dcb7c9-ffa7-46bc-b695-02aea6e679a1-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.580698 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5m2hv\" (UniqueName: 
\"kubernetes.io/projected/2bd3a449-dc14-46ca-8e19-64d0a282483e-kube-api-access-5m2hv\") on node \"crc\" DevicePath \"\"" Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.580895 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w8t78\" (UniqueName: \"kubernetes.io/projected/f4f47b18-303f-415d-8bf8-c1f7a075b747-kube-api-access-w8t78\") on node \"crc\" DevicePath \"\"" Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.581033 4773 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f4f47b18-303f-415d-8bf8-c1f7a075b747-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.581834 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7455911e-a1ad-442b-97b9-362496066bbf-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7455911e-a1ad-442b-97b9-362496066bbf" (UID: "7455911e-a1ad-442b-97b9-362496066bbf"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.587565 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7455911e-a1ad-442b-97b9-362496066bbf-kube-api-access-4fp7t" (OuterVolumeSpecName: "kube-api-access-4fp7t") pod "7455911e-a1ad-442b-97b9-362496066bbf" (UID: "7455911e-a1ad-442b-97b9-362496066bbf"). InnerVolumeSpecName "kube-api-access-4fp7t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.682006 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4fp7t\" (UniqueName: \"kubernetes.io/projected/7455911e-a1ad-442b-97b9-362496066bbf-kube-api-access-4fp7t\") on node \"crc\" DevicePath \"\"" Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.682349 4773 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7455911e-a1ad-442b-97b9-362496066bbf-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.918722 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.986147 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ckpjr\" (UniqueName: \"kubernetes.io/projected/f5814cea-a704-4de4-9205-d65cde58c777-kube-api-access-ckpjr\") pod \"f5814cea-a704-4de4-9205-d65cde58c777\" (UID: \"f5814cea-a704-4de4-9205-d65cde58c777\") " Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.986212 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f5814cea-a704-4de4-9205-d65cde58c777-run-httpd\") pod \"f5814cea-a704-4de4-9205-d65cde58c777\" (UID: \"f5814cea-a704-4de4-9205-d65cde58c777\") " Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.986611 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f5814cea-a704-4de4-9205-d65cde58c777-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "f5814cea-a704-4de4-9205-d65cde58c777" (UID: "f5814cea-a704-4de4-9205-d65cde58c777"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.987035 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f5814cea-a704-4de4-9205-d65cde58c777-scripts\") pod \"f5814cea-a704-4de4-9205-d65cde58c777\" (UID: \"f5814cea-a704-4de4-9205-d65cde58c777\") " Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.987071 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f5814cea-a704-4de4-9205-d65cde58c777-log-httpd\") pod \"f5814cea-a704-4de4-9205-d65cde58c777\" (UID: \"f5814cea-a704-4de4-9205-d65cde58c777\") " Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.987108 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5814cea-a704-4de4-9205-d65cde58c777-combined-ca-bundle\") pod \"f5814cea-a704-4de4-9205-d65cde58c777\" (UID: \"f5814cea-a704-4de4-9205-d65cde58c777\") " Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.987147 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5814cea-a704-4de4-9205-d65cde58c777-config-data\") pod \"f5814cea-a704-4de4-9205-d65cde58c777\" (UID: \"f5814cea-a704-4de4-9205-d65cde58c777\") " Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.987187 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f5814cea-a704-4de4-9205-d65cde58c777-sg-core-conf-yaml\") pod \"f5814cea-a704-4de4-9205-d65cde58c777\" (UID: \"f5814cea-a704-4de4-9205-d65cde58c777\") " Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.987924 4773 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/f5814cea-a704-4de4-9205-d65cde58c777-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.988151 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f5814cea-a704-4de4-9205-d65cde58c777-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "f5814cea-a704-4de4-9205-d65cde58c777" (UID: "f5814cea-a704-4de4-9205-d65cde58c777"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.995892 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5814cea-a704-4de4-9205-d65cde58c777-scripts" (OuterVolumeSpecName: "scripts") pod "f5814cea-a704-4de4-9205-d65cde58c777" (UID: "f5814cea-a704-4de4-9205-d65cde58c777"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:50:17 crc kubenswrapper[4773]: I0120 18:50:17.996134 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f5814cea-a704-4de4-9205-d65cde58c777-kube-api-access-ckpjr" (OuterVolumeSpecName: "kube-api-access-ckpjr") pod "f5814cea-a704-4de4-9205-d65cde58c777" (UID: "f5814cea-a704-4de4-9205-d65cde58c777"). InnerVolumeSpecName "kube-api-access-ckpjr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.058486 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5814cea-a704-4de4-9205-d65cde58c777-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "f5814cea-a704-4de4-9205-d65cde58c777" (UID: "f5814cea-a704-4de4-9205-d65cde58c777"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.099219 4773 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f5814cea-a704-4de4-9205-d65cde58c777-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.099251 4773 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/f5814cea-a704-4de4-9205-d65cde58c777-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.099261 4773 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/f5814cea-a704-4de4-9205-d65cde58c777-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.099273 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ckpjr\" (UniqueName: \"kubernetes.io/projected/f5814cea-a704-4de4-9205-d65cde58c777-kube-api-access-ckpjr\") on node \"crc\" DevicePath \"\"" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.140068 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5814cea-a704-4de4-9205-d65cde58c777-config-data" (OuterVolumeSpecName: "config-data") pod "f5814cea-a704-4de4-9205-d65cde58c777" (UID: "f5814cea-a704-4de4-9205-d65cde58c777"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.142773 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f5814cea-a704-4de4-9205-d65cde58c777-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f5814cea-a704-4de4-9205-d65cde58c777" (UID: "f5814cea-a704-4de4-9205-d65cde58c777"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.202235 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f5814cea-a704-4de4-9205-d65cde58c777-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.202277 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f5814cea-a704-4de4-9205-d65cde58c777-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.526058 4773 generic.go:334] "Generic (PLEG): container finished" podID="f5814cea-a704-4de4-9205-d65cde58c777" containerID="7a3ab5105ad590a6668a70d5db1e361f420e137f18ebd3b24303e69e5a972b7e" exitCode=0 Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.526103 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f5814cea-a704-4de4-9205-d65cde58c777","Type":"ContainerDied","Data":"7a3ab5105ad590a6668a70d5db1e361f420e137f18ebd3b24303e69e5a972b7e"} Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.526135 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"f5814cea-a704-4de4-9205-d65cde58c777","Type":"ContainerDied","Data":"88338d24506cf39cc9754f352bf93432e909359610b5305d799ad52d2dc0901c"} Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.526139 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.526156 4773 scope.go:117] "RemoveContainer" containerID="cb50e976a0c654a7216030978727b5d69177f7ab4404e98e26a34b6e55333be1" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.557190 4773 scope.go:117] "RemoveContainer" containerID="2848aa1c2648bf46c2ea04e44fc541b3472a774f4d87ca163f256a1a307e1862" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.558429 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.570203 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.577845 4773 scope.go:117] "RemoveContainer" containerID="7a3ab5105ad590a6668a70d5db1e361f420e137f18ebd3b24303e69e5a972b7e" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.584398 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 20 18:50:18 crc kubenswrapper[4773]: E0120 18:50:18.584976 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4f47b18-303f-415d-8bf8-c1f7a075b747" containerName="mariadb-account-create-update" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.585078 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4f47b18-303f-415d-8bf8-c1f7a075b747" containerName="mariadb-account-create-update" Jan 20 18:50:18 crc kubenswrapper[4773]: E0120 18:50:18.585149 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="833eac91-4269-4e1e-9923-8dd8ed2276dc" containerName="mariadb-database-create" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.585210 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="833eac91-4269-4e1e-9923-8dd8ed2276dc" containerName="mariadb-database-create" Jan 20 18:50:18 crc kubenswrapper[4773]: E0120 18:50:18.585289 4773 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f5814cea-a704-4de4-9205-d65cde58c777" containerName="sg-core" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.585351 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5814cea-a704-4de4-9205-d65cde58c777" containerName="sg-core" Jan 20 18:50:18 crc kubenswrapper[4773]: E0120 18:50:18.585543 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7455911e-a1ad-442b-97b9-362496066bbf" containerName="mariadb-account-create-update" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.587383 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="7455911e-a1ad-442b-97b9-362496066bbf" containerName="mariadb-account-create-update" Jan 20 18:50:18 crc kubenswrapper[4773]: E0120 18:50:18.587504 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5814cea-a704-4de4-9205-d65cde58c777" containerName="ceilometer-central-agent" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.587576 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5814cea-a704-4de4-9205-d65cde58c777" containerName="ceilometer-central-agent" Jan 20 18:50:18 crc kubenswrapper[4773]: E0120 18:50:18.587632 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86ae6a8c-2043-4e0f-a23f-43c998d3d9d7" containerName="mariadb-database-create" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.587691 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="86ae6a8c-2043-4e0f-a23f-43c998d3d9d7" containerName="mariadb-database-create" Jan 20 18:50:18 crc kubenswrapper[4773]: E0120 18:50:18.587760 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5814cea-a704-4de4-9205-d65cde58c777" containerName="proxy-httpd" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.587813 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5814cea-a704-4de4-9205-d65cde58c777" containerName="proxy-httpd" Jan 20 18:50:18 crc kubenswrapper[4773]: E0120 18:50:18.587883 4773 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="47dcb7c9-ffa7-46bc-b695-02aea6e679a1" containerName="mariadb-database-create" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.587962 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="47dcb7c9-ffa7-46bc-b695-02aea6e679a1" containerName="mariadb-database-create" Jan 20 18:50:18 crc kubenswrapper[4773]: E0120 18:50:18.588017 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f5814cea-a704-4de4-9205-d65cde58c777" containerName="ceilometer-notification-agent" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.588068 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="f5814cea-a704-4de4-9205-d65cde58c777" containerName="ceilometer-notification-agent" Jan 20 18:50:18 crc kubenswrapper[4773]: E0120 18:50:18.588119 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bd3a449-dc14-46ca-8e19-64d0a282483e" containerName="mariadb-account-create-update" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.588168 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bd3a449-dc14-46ca-8e19-64d0a282483e" containerName="mariadb-account-create-update" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.588494 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="86ae6a8c-2043-4e0f-a23f-43c998d3d9d7" containerName="mariadb-database-create" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.593120 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5814cea-a704-4de4-9205-d65cde58c777" containerName="proxy-httpd" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.593213 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4f47b18-303f-415d-8bf8-c1f7a075b747" containerName="mariadb-account-create-update" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.593249 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5814cea-a704-4de4-9205-d65cde58c777" containerName="sg-core" 
Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.593265 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="7455911e-a1ad-442b-97b9-362496066bbf" containerName="mariadb-account-create-update" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.593282 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5814cea-a704-4de4-9205-d65cde58c777" containerName="ceilometer-central-agent" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.593290 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="f5814cea-a704-4de4-9205-d65cde58c777" containerName="ceilometer-notification-agent" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.593321 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bd3a449-dc14-46ca-8e19-64d0a282483e" containerName="mariadb-account-create-update" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.593337 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="833eac91-4269-4e1e-9923-8dd8ed2276dc" containerName="mariadb-database-create" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.593351 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="47dcb7c9-ffa7-46bc-b695-02aea6e679a1" containerName="mariadb-database-create" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.595895 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.597373 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.598814 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.599112 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.616700 4773 scope.go:117] "RemoveContainer" containerID="4dfec247bd1a7c1b2b638007722809007c65536e0d65164425d8cbabf5efbd77" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.645566 4773 scope.go:117] "RemoveContainer" containerID="cb50e976a0c654a7216030978727b5d69177f7ab4404e98e26a34b6e55333be1" Jan 20 18:50:18 crc kubenswrapper[4773]: E0120 18:50:18.645984 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb50e976a0c654a7216030978727b5d69177f7ab4404e98e26a34b6e55333be1\": container with ID starting with cb50e976a0c654a7216030978727b5d69177f7ab4404e98e26a34b6e55333be1 not found: ID does not exist" containerID="cb50e976a0c654a7216030978727b5d69177f7ab4404e98e26a34b6e55333be1" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.646012 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb50e976a0c654a7216030978727b5d69177f7ab4404e98e26a34b6e55333be1"} err="failed to get container status \"cb50e976a0c654a7216030978727b5d69177f7ab4404e98e26a34b6e55333be1\": rpc error: code = NotFound desc = could not find container \"cb50e976a0c654a7216030978727b5d69177f7ab4404e98e26a34b6e55333be1\": container with ID starting with cb50e976a0c654a7216030978727b5d69177f7ab4404e98e26a34b6e55333be1 not found: ID does not exist" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 
18:50:18.646034 4773 scope.go:117] "RemoveContainer" containerID="2848aa1c2648bf46c2ea04e44fc541b3472a774f4d87ca163f256a1a307e1862" Jan 20 18:50:18 crc kubenswrapper[4773]: E0120 18:50:18.646277 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2848aa1c2648bf46c2ea04e44fc541b3472a774f4d87ca163f256a1a307e1862\": container with ID starting with 2848aa1c2648bf46c2ea04e44fc541b3472a774f4d87ca163f256a1a307e1862 not found: ID does not exist" containerID="2848aa1c2648bf46c2ea04e44fc541b3472a774f4d87ca163f256a1a307e1862" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.646299 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2848aa1c2648bf46c2ea04e44fc541b3472a774f4d87ca163f256a1a307e1862"} err="failed to get container status \"2848aa1c2648bf46c2ea04e44fc541b3472a774f4d87ca163f256a1a307e1862\": rpc error: code = NotFound desc = could not find container \"2848aa1c2648bf46c2ea04e44fc541b3472a774f4d87ca163f256a1a307e1862\": container with ID starting with 2848aa1c2648bf46c2ea04e44fc541b3472a774f4d87ca163f256a1a307e1862 not found: ID does not exist" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.646314 4773 scope.go:117] "RemoveContainer" containerID="7a3ab5105ad590a6668a70d5db1e361f420e137f18ebd3b24303e69e5a972b7e" Jan 20 18:50:18 crc kubenswrapper[4773]: E0120 18:50:18.646615 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a3ab5105ad590a6668a70d5db1e361f420e137f18ebd3b24303e69e5a972b7e\": container with ID starting with 7a3ab5105ad590a6668a70d5db1e361f420e137f18ebd3b24303e69e5a972b7e not found: ID does not exist" containerID="7a3ab5105ad590a6668a70d5db1e361f420e137f18ebd3b24303e69e5a972b7e" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.646637 4773 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"7a3ab5105ad590a6668a70d5db1e361f420e137f18ebd3b24303e69e5a972b7e"} err="failed to get container status \"7a3ab5105ad590a6668a70d5db1e361f420e137f18ebd3b24303e69e5a972b7e\": rpc error: code = NotFound desc = could not find container \"7a3ab5105ad590a6668a70d5db1e361f420e137f18ebd3b24303e69e5a972b7e\": container with ID starting with 7a3ab5105ad590a6668a70d5db1e361f420e137f18ebd3b24303e69e5a972b7e not found: ID does not exist" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.646652 4773 scope.go:117] "RemoveContainer" containerID="4dfec247bd1a7c1b2b638007722809007c65536e0d65164425d8cbabf5efbd77" Jan 20 18:50:18 crc kubenswrapper[4773]: E0120 18:50:18.646886 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4dfec247bd1a7c1b2b638007722809007c65536e0d65164425d8cbabf5efbd77\": container with ID starting with 4dfec247bd1a7c1b2b638007722809007c65536e0d65164425d8cbabf5efbd77 not found: ID does not exist" containerID="4dfec247bd1a7c1b2b638007722809007c65536e0d65164425d8cbabf5efbd77" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.647769 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4dfec247bd1a7c1b2b638007722809007c65536e0d65164425d8cbabf5efbd77"} err="failed to get container status \"4dfec247bd1a7c1b2b638007722809007c65536e0d65164425d8cbabf5efbd77\": rpc error: code = NotFound desc = could not find container \"4dfec247bd1a7c1b2b638007722809007c65536e0d65164425d8cbabf5efbd77\": container with ID starting with 4dfec247bd1a7c1b2b638007722809007c65536e0d65164425d8cbabf5efbd77 not found: ID does not exist" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.714750 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/06b19db9-fa8b-46db-a1fd-204fd44c86a5-run-httpd\") pod \"ceilometer-0\" (UID: 
\"06b19db9-fa8b-46db-a1fd-204fd44c86a5\") " pod="openstack/ceilometer-0" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.715173 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/06b19db9-fa8b-46db-a1fd-204fd44c86a5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"06b19db9-fa8b-46db-a1fd-204fd44c86a5\") " pod="openstack/ceilometer-0" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.715248 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/06b19db9-fa8b-46db-a1fd-204fd44c86a5-scripts\") pod \"ceilometer-0\" (UID: \"06b19db9-fa8b-46db-a1fd-204fd44c86a5\") " pod="openstack/ceilometer-0" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.715360 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06b19db9-fa8b-46db-a1fd-204fd44c86a5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"06b19db9-fa8b-46db-a1fd-204fd44c86a5\") " pod="openstack/ceilometer-0" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.715417 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06b19db9-fa8b-46db-a1fd-204fd44c86a5-config-data\") pod \"ceilometer-0\" (UID: \"06b19db9-fa8b-46db-a1fd-204fd44c86a5\") " pod="openstack/ceilometer-0" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.715452 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lr7x\" (UniqueName: \"kubernetes.io/projected/06b19db9-fa8b-46db-a1fd-204fd44c86a5-kube-api-access-8lr7x\") pod \"ceilometer-0\" (UID: \"06b19db9-fa8b-46db-a1fd-204fd44c86a5\") " pod="openstack/ceilometer-0" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 
18:50:18.715585 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/06b19db9-fa8b-46db-a1fd-204fd44c86a5-log-httpd\") pod \"ceilometer-0\" (UID: \"06b19db9-fa8b-46db-a1fd-204fd44c86a5\") " pod="openstack/ceilometer-0" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.816804 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/06b19db9-fa8b-46db-a1fd-204fd44c86a5-run-httpd\") pod \"ceilometer-0\" (UID: \"06b19db9-fa8b-46db-a1fd-204fd44c86a5\") " pod="openstack/ceilometer-0" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.816860 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/06b19db9-fa8b-46db-a1fd-204fd44c86a5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"06b19db9-fa8b-46db-a1fd-204fd44c86a5\") " pod="openstack/ceilometer-0" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.816908 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/06b19db9-fa8b-46db-a1fd-204fd44c86a5-scripts\") pod \"ceilometer-0\" (UID: \"06b19db9-fa8b-46db-a1fd-204fd44c86a5\") " pod="openstack/ceilometer-0" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.816970 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06b19db9-fa8b-46db-a1fd-204fd44c86a5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"06b19db9-fa8b-46db-a1fd-204fd44c86a5\") " pod="openstack/ceilometer-0" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.817002 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06b19db9-fa8b-46db-a1fd-204fd44c86a5-config-data\") pod \"ceilometer-0\" 
(UID: \"06b19db9-fa8b-46db-a1fd-204fd44c86a5\") " pod="openstack/ceilometer-0" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.817024 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8lr7x\" (UniqueName: \"kubernetes.io/projected/06b19db9-fa8b-46db-a1fd-204fd44c86a5-kube-api-access-8lr7x\") pod \"ceilometer-0\" (UID: \"06b19db9-fa8b-46db-a1fd-204fd44c86a5\") " pod="openstack/ceilometer-0" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.817046 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/06b19db9-fa8b-46db-a1fd-204fd44c86a5-log-httpd\") pod \"ceilometer-0\" (UID: \"06b19db9-fa8b-46db-a1fd-204fd44c86a5\") " pod="openstack/ceilometer-0" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.817711 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/06b19db9-fa8b-46db-a1fd-204fd44c86a5-log-httpd\") pod \"ceilometer-0\" (UID: \"06b19db9-fa8b-46db-a1fd-204fd44c86a5\") " pod="openstack/ceilometer-0" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.818363 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/06b19db9-fa8b-46db-a1fd-204fd44c86a5-run-httpd\") pod \"ceilometer-0\" (UID: \"06b19db9-fa8b-46db-a1fd-204fd44c86a5\") " pod="openstack/ceilometer-0" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.821034 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/06b19db9-fa8b-46db-a1fd-204fd44c86a5-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"06b19db9-fa8b-46db-a1fd-204fd44c86a5\") " pod="openstack/ceilometer-0" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.821701 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/06b19db9-fa8b-46db-a1fd-204fd44c86a5-config-data\") pod \"ceilometer-0\" (UID: \"06b19db9-fa8b-46db-a1fd-204fd44c86a5\") " pod="openstack/ceilometer-0" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.821954 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06b19db9-fa8b-46db-a1fd-204fd44c86a5-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"06b19db9-fa8b-46db-a1fd-204fd44c86a5\") " pod="openstack/ceilometer-0" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.830338 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/06b19db9-fa8b-46db-a1fd-204fd44c86a5-scripts\") pod \"ceilometer-0\" (UID: \"06b19db9-fa8b-46db-a1fd-204fd44c86a5\") " pod="openstack/ceilometer-0" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.835771 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8lr7x\" (UniqueName: \"kubernetes.io/projected/06b19db9-fa8b-46db-a1fd-204fd44c86a5-kube-api-access-8lr7x\") pod \"ceilometer-0\" (UID: \"06b19db9-fa8b-46db-a1fd-204fd44c86a5\") " pod="openstack/ceilometer-0" Jan 20 18:50:18 crc kubenswrapper[4773]: I0120 18:50:18.942153 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 20 18:50:19 crc kubenswrapper[4773]: I0120 18:50:19.376969 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 20 18:50:19 crc kubenswrapper[4773]: I0120 18:50:19.459647 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f5814cea-a704-4de4-9205-d65cde58c777" path="/var/lib/kubelet/pods/f5814cea-a704-4de4-9205-d65cde58c777/volumes" Jan 20 18:50:19 crc kubenswrapper[4773]: I0120 18:50:19.570750 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"06b19db9-fa8b-46db-a1fd-204fd44c86a5","Type":"ContainerStarted","Data":"e5ac9b64580a1a30d80a522428972a274182638cad0934f179acc61a069da4b1"} Jan 20 18:50:20 crc kubenswrapper[4773]: I0120 18:50:20.578381 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"06b19db9-fa8b-46db-a1fd-204fd44c86a5","Type":"ContainerStarted","Data":"3a15a5d730708632f96ecf7e5c3e4e5a1f04f422737b4d2afb09d5b06dd36275"} Jan 20 18:50:21 crc kubenswrapper[4773]: W0120 18:50:21.011482 4773 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2bd3a449_dc14_46ca_8e19_64d0a282483e.slice/crio-conmon-9a0076ae19fbe6544a21612cf495b50f269fad130f0414b9b8f4c443dba234b3.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2bd3a449_dc14_46ca_8e19_64d0a282483e.slice/crio-conmon-9a0076ae19fbe6544a21612cf495b50f269fad130f0414b9b8f4c443dba234b3.scope: no such file or directory Jan 20 18:50:21 crc kubenswrapper[4773]: W0120 18:50:21.011846 4773 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod833eac91_4269_4e1e_9923_8dd8ed2276dc.slice/crio-7358a078ad55ead91e8865f0bf5d80f239dd741b49799df9b4c64f0d2143c92a.scope": 0x40000100 == 
IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod833eac91_4269_4e1e_9923_8dd8ed2276dc.slice/crio-7358a078ad55ead91e8865f0bf5d80f239dd741b49799df9b4c64f0d2143c92a.scope: no such file or directory Jan 20 18:50:21 crc kubenswrapper[4773]: W0120 18:50:21.011880 4773 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7455911e_a1ad_442b_97b9_362496066bbf.slice/crio-conmon-ef9d86fc8f98618790c525e7960f8bfe056b8e5e834ec2dbd2285ecb0f1d9ce3.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7455911e_a1ad_442b_97b9_362496066bbf.slice/crio-conmon-ef9d86fc8f98618790c525e7960f8bfe056b8e5e834ec2dbd2285ecb0f1d9ce3.scope: no such file or directory Jan 20 18:50:21 crc kubenswrapper[4773]: W0120 18:50:21.011901 4773 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2bd3a449_dc14_46ca_8e19_64d0a282483e.slice/crio-9a0076ae19fbe6544a21612cf495b50f269fad130f0414b9b8f4c443dba234b3.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2bd3a449_dc14_46ca_8e19_64d0a282483e.slice/crio-9a0076ae19fbe6544a21612cf495b50f269fad130f0414b9b8f4c443dba234b3.scope: no such file or directory Jan 20 18:50:21 crc kubenswrapper[4773]: W0120 18:50:21.011924 4773 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7455911e_a1ad_442b_97b9_362496066bbf.slice/crio-ef9d86fc8f98618790c525e7960f8bfe056b8e5e834ec2dbd2285ecb0f1d9ce3.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch 
/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7455911e_a1ad_442b_97b9_362496066bbf.slice/crio-ef9d86fc8f98618790c525e7960f8bfe056b8e5e834ec2dbd2285ecb0f1d9ce3.scope: no such file or directory Jan 20 18:50:21 crc kubenswrapper[4773]: W0120 18:50:21.011965 4773 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf4f47b18_303f_415d_8bf8_c1f7a075b747.slice/crio-9543425a6b98ecb5cb8e49dc0331eeeff3eaa05df6743d681a9e2302a8272c7c": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf4f47b18_303f_415d_8bf8_c1f7a075b747.slice/crio-9543425a6b98ecb5cb8e49dc0331eeeff3eaa05df6743d681a9e2302a8272c7c: no such file or directory Jan 20 18:50:21 crc kubenswrapper[4773]: W0120 18:50:21.012004 4773 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf4f47b18_303f_415d_8bf8_c1f7a075b747.slice/crio-conmon-6bc940e28a0f00ff3974f58aa4d5ae733c405db0f07aec2ee2eeb84b82a418e2.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf4f47b18_303f_415d_8bf8_c1f7a075b747.slice/crio-conmon-6bc940e28a0f00ff3974f58aa4d5ae733c405db0f07aec2ee2eeb84b82a418e2.scope: no such file or directory Jan 20 18:50:21 crc kubenswrapper[4773]: W0120 18:50:21.012044 4773 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf4f47b18_303f_415d_8bf8_c1f7a075b747.slice/crio-6bc940e28a0f00ff3974f58aa4d5ae733c405db0f07aec2ee2eeb84b82a418e2.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf4f47b18_303f_415d_8bf8_c1f7a075b747.slice/crio-6bc940e28a0f00ff3974f58aa4d5ae733c405db0f07aec2ee2eeb84b82a418e2.scope: no such file or 
directory Jan 20 18:50:21 crc kubenswrapper[4773]: E0120 18:50:21.227558 4773 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5814cea_a704_4de4_9205_d65cde58c777.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5814cea_a704_4de4_9205_d65cde58c777.slice/crio-4dfec247bd1a7c1b2b638007722809007c65536e0d65164425d8cbabf5efbd77.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod833eac91_4269_4e1e_9923_8dd8ed2276dc.slice/crio-b755badae46a08f9cf38c4037c21c34427dd419272bb54ab07c468cc7dd674cf\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf4f47b18_303f_415d_8bf8_c1f7a075b747.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5814cea_a704_4de4_9205_d65cde58c777.slice/crio-conmon-cb50e976a0c654a7216030978727b5d69177f7ab4404e98e26a34b6e55333be1.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod86ae6a8c_2043_4e0f_a23f_43c998d3d9d7.slice/crio-b3a20b02dca0f94d69c52d5cde14a0b17023d664d2e0f60c310212d2d641b2e3\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod49df8cea_026f_497b_baae_a6a09452aa3d.slice/crio-conmon-7ccb7f05b32cc6a8cf92a861d7cf4258f107127e8f0f1d25df715a6c2f51b909.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2bd3a449_dc14_46ca_8e19_64d0a282483e.slice\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod86ae6a8c_2043_4e0f_a23f_43c998d3d9d7.slice/crio-33a44114454454a182e314a103e4daecf90ecb7caed9c7572f0056b58d9567e3.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7455911e_a1ad_442b_97b9_362496066bbf.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod47dcb7c9_ffa7_46bc_b695_02aea6e679a1.slice/crio-efbcae37c3969e8bca06ecc2eabe86724c6d9256d71d35fd4d834b35496de1dc\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod833eac91_4269_4e1e_9923_8dd8ed2276dc.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod47dcb7c9_ffa7_46bc_b695_02aea6e679a1.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5814cea_a704_4de4_9205_d65cde58c777.slice/crio-2848aa1c2648bf46c2ea04e44fc541b3472a774f4d87ca163f256a1a307e1862.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod86ae6a8c_2043_4e0f_a23f_43c998d3d9d7.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5814cea_a704_4de4_9205_d65cde58c777.slice/crio-conmon-4dfec247bd1a7c1b2b638007722809007c65536e0d65164425d8cbabf5efbd77.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod47dcb7c9_ffa7_46bc_b695_02aea6e679a1.slice/crio-conmon-2ccbf39de56ce437fa73601c164599a09e48be8a2f2534b1c75fa3d80b294c65.scope\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod86ae6a8c_2043_4e0f_a23f_43c998d3d9d7.slice/crio-conmon-33a44114454454a182e314a103e4daecf90ecb7caed9c7572f0056b58d9567e3.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5814cea_a704_4de4_9205_d65cde58c777.slice/crio-conmon-7a3ab5105ad590a6668a70d5db1e361f420e137f18ebd3b24303e69e5a972b7e.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod47dcb7c9_ffa7_46bc_b695_02aea6e679a1.slice/crio-2ccbf39de56ce437fa73601c164599a09e48be8a2f2534b1c75fa3d80b294c65.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5814cea_a704_4de4_9205_d65cde58c777.slice/crio-cb50e976a0c654a7216030978727b5d69177f7ab4404e98e26a34b6e55333be1.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5814cea_a704_4de4_9205_d65cde58c777.slice/crio-conmon-2848aa1c2648bf46c2ea04e44fc541b3472a774f4d87ca163f256a1a307e1862.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5814cea_a704_4de4_9205_d65cde58c777.slice/crio-88338d24506cf39cc9754f352bf93432e909359610b5305d799ad52d2dc0901c\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2bd3a449_dc14_46ca_8e19_64d0a282483e.slice/crio-17f0c29c50eec97b536a2eb64b28c40b7df6c9fba7563133e287c40d3178ec5b\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5814cea_a704_4de4_9205_d65cde58c777.slice/crio-7a3ab5105ad590a6668a70d5db1e361f420e137f18ebd3b24303e69e5a972b7e.scope\": RecentStats: unable to find data in memory cache]" Jan 20 18:50:21 crc kubenswrapper[4773]: I0120 18:50:21.513447 
4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-9b66d8476-cqhrd" Jan 20 18:50:21 crc kubenswrapper[4773]: I0120 18:50:21.590349 4773 generic.go:334] "Generic (PLEG): container finished" podID="49df8cea-026f-497b-baae-a6a09452aa3d" containerID="7ccb7f05b32cc6a8cf92a861d7cf4258f107127e8f0f1d25df715a6c2f51b909" exitCode=137 Jan 20 18:50:21 crc kubenswrapper[4773]: I0120 18:50:21.590411 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-9b66d8476-cqhrd" Jan 20 18:50:21 crc kubenswrapper[4773]: I0120 18:50:21.590417 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-9b66d8476-cqhrd" event={"ID":"49df8cea-026f-497b-baae-a6a09452aa3d","Type":"ContainerDied","Data":"7ccb7f05b32cc6a8cf92a861d7cf4258f107127e8f0f1d25df715a6c2f51b909"} Jan 20 18:50:21 crc kubenswrapper[4773]: I0120 18:50:21.590536 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-9b66d8476-cqhrd" event={"ID":"49df8cea-026f-497b-baae-a6a09452aa3d","Type":"ContainerDied","Data":"45297b3ebf564b3f31091634eeea46bead8ac5c12e876ebfb7ba0eef2c596c1a"} Jan 20 18:50:21 crc kubenswrapper[4773]: I0120 18:50:21.590569 4773 scope.go:117] "RemoveContainer" containerID="aab110865e342c13c6753a15789694fc55cd9167805325bf24c74b2765a8d8e4" Jan 20 18:50:21 crc kubenswrapper[4773]: I0120 18:50:21.592472 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"06b19db9-fa8b-46db-a1fd-204fd44c86a5","Type":"ContainerStarted","Data":"89ef9b3d9531aa5fb9cada883bd6d34709aa175f97980c0cc5f77140fe2f0651"} Jan 20 18:50:21 crc kubenswrapper[4773]: I0120 18:50:21.671303 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/49df8cea-026f-497b-baae-a6a09452aa3d-scripts\") pod \"49df8cea-026f-497b-baae-a6a09452aa3d\" (UID: \"49df8cea-026f-497b-baae-a6a09452aa3d\") " 
Jan 20 18:50:21 crc kubenswrapper[4773]: I0120 18:50:21.671397 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/49df8cea-026f-497b-baae-a6a09452aa3d-horizon-tls-certs\") pod \"49df8cea-026f-497b-baae-a6a09452aa3d\" (UID: \"49df8cea-026f-497b-baae-a6a09452aa3d\") " Jan 20 18:50:21 crc kubenswrapper[4773]: I0120 18:50:21.671532 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49df8cea-026f-497b-baae-a6a09452aa3d-logs\") pod \"49df8cea-026f-497b-baae-a6a09452aa3d\" (UID: \"49df8cea-026f-497b-baae-a6a09452aa3d\") " Jan 20 18:50:21 crc kubenswrapper[4773]: I0120 18:50:21.671557 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49df8cea-026f-497b-baae-a6a09452aa3d-combined-ca-bundle\") pod \"49df8cea-026f-497b-baae-a6a09452aa3d\" (UID: \"49df8cea-026f-497b-baae-a6a09452aa3d\") " Jan 20 18:50:21 crc kubenswrapper[4773]: I0120 18:50:21.671600 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/49df8cea-026f-497b-baae-a6a09452aa3d-config-data\") pod \"49df8cea-026f-497b-baae-a6a09452aa3d\" (UID: \"49df8cea-026f-497b-baae-a6a09452aa3d\") " Jan 20 18:50:21 crc kubenswrapper[4773]: I0120 18:50:21.671648 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/49df8cea-026f-497b-baae-a6a09452aa3d-horizon-secret-key\") pod \"49df8cea-026f-497b-baae-a6a09452aa3d\" (UID: \"49df8cea-026f-497b-baae-a6a09452aa3d\") " Jan 20 18:50:21 crc kubenswrapper[4773]: I0120 18:50:21.671686 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mhdcl\" (UniqueName: 
\"kubernetes.io/projected/49df8cea-026f-497b-baae-a6a09452aa3d-kube-api-access-mhdcl\") pod \"49df8cea-026f-497b-baae-a6a09452aa3d\" (UID: \"49df8cea-026f-497b-baae-a6a09452aa3d\") " Jan 20 18:50:21 crc kubenswrapper[4773]: I0120 18:50:21.674568 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49df8cea-026f-497b-baae-a6a09452aa3d-logs" (OuterVolumeSpecName: "logs") pod "49df8cea-026f-497b-baae-a6a09452aa3d" (UID: "49df8cea-026f-497b-baae-a6a09452aa3d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:50:21 crc kubenswrapper[4773]: I0120 18:50:21.677248 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49df8cea-026f-497b-baae-a6a09452aa3d-kube-api-access-mhdcl" (OuterVolumeSpecName: "kube-api-access-mhdcl") pod "49df8cea-026f-497b-baae-a6a09452aa3d" (UID: "49df8cea-026f-497b-baae-a6a09452aa3d"). InnerVolumeSpecName "kube-api-access-mhdcl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:50:21 crc kubenswrapper[4773]: I0120 18:50:21.677664 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49df8cea-026f-497b-baae-a6a09452aa3d-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "49df8cea-026f-497b-baae-a6a09452aa3d" (UID: "49df8cea-026f-497b-baae-a6a09452aa3d"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:50:21 crc kubenswrapper[4773]: I0120 18:50:21.699764 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49df8cea-026f-497b-baae-a6a09452aa3d-scripts" (OuterVolumeSpecName: "scripts") pod "49df8cea-026f-497b-baae-a6a09452aa3d" (UID: "49df8cea-026f-497b-baae-a6a09452aa3d"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:50:21 crc kubenswrapper[4773]: I0120 18:50:21.705133 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49df8cea-026f-497b-baae-a6a09452aa3d-config-data" (OuterVolumeSpecName: "config-data") pod "49df8cea-026f-497b-baae-a6a09452aa3d" (UID: "49df8cea-026f-497b-baae-a6a09452aa3d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:50:21 crc kubenswrapper[4773]: I0120 18:50:21.723477 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49df8cea-026f-497b-baae-a6a09452aa3d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "49df8cea-026f-497b-baae-a6a09452aa3d" (UID: "49df8cea-026f-497b-baae-a6a09452aa3d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:50:21 crc kubenswrapper[4773]: I0120 18:50:21.734503 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49df8cea-026f-497b-baae-a6a09452aa3d-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "49df8cea-026f-497b-baae-a6a09452aa3d" (UID: "49df8cea-026f-497b-baae-a6a09452aa3d"). InnerVolumeSpecName "horizon-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:50:21 crc kubenswrapper[4773]: I0120 18:50:21.760153 4773 scope.go:117] "RemoveContainer" containerID="7ccb7f05b32cc6a8cf92a861d7cf4258f107127e8f0f1d25df715a6c2f51b909" Jan 20 18:50:21 crc kubenswrapper[4773]: I0120 18:50:21.773967 4773 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/49df8cea-026f-497b-baae-a6a09452aa3d-logs\") on node \"crc\" DevicePath \"\"" Jan 20 18:50:21 crc kubenswrapper[4773]: I0120 18:50:21.774008 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/49df8cea-026f-497b-baae-a6a09452aa3d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 18:50:21 crc kubenswrapper[4773]: I0120 18:50:21.774023 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/49df8cea-026f-497b-baae-a6a09452aa3d-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 18:50:21 crc kubenswrapper[4773]: I0120 18:50:21.774034 4773 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/49df8cea-026f-497b-baae-a6a09452aa3d-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Jan 20 18:50:21 crc kubenswrapper[4773]: I0120 18:50:21.774045 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mhdcl\" (UniqueName: \"kubernetes.io/projected/49df8cea-026f-497b-baae-a6a09452aa3d-kube-api-access-mhdcl\") on node \"crc\" DevicePath \"\"" Jan 20 18:50:21 crc kubenswrapper[4773]: I0120 18:50:21.774057 4773 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/49df8cea-026f-497b-baae-a6a09452aa3d-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 18:50:21 crc kubenswrapper[4773]: I0120 18:50:21.774067 4773 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/49df8cea-026f-497b-baae-a6a09452aa3d-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 18:50:21 crc kubenswrapper[4773]: I0120 18:50:21.809002 4773 scope.go:117] "RemoveContainer" containerID="aab110865e342c13c6753a15789694fc55cd9167805325bf24c74b2765a8d8e4" Jan 20 18:50:21 crc kubenswrapper[4773]: E0120 18:50:21.809467 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aab110865e342c13c6753a15789694fc55cd9167805325bf24c74b2765a8d8e4\": container with ID starting with aab110865e342c13c6753a15789694fc55cd9167805325bf24c74b2765a8d8e4 not found: ID does not exist" containerID="aab110865e342c13c6753a15789694fc55cd9167805325bf24c74b2765a8d8e4" Jan 20 18:50:21 crc kubenswrapper[4773]: I0120 18:50:21.809502 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aab110865e342c13c6753a15789694fc55cd9167805325bf24c74b2765a8d8e4"} err="failed to get container status \"aab110865e342c13c6753a15789694fc55cd9167805325bf24c74b2765a8d8e4\": rpc error: code = NotFound desc = could not find container \"aab110865e342c13c6753a15789694fc55cd9167805325bf24c74b2765a8d8e4\": container with ID starting with aab110865e342c13c6753a15789694fc55cd9167805325bf24c74b2765a8d8e4 not found: ID does not exist" Jan 20 18:50:21 crc kubenswrapper[4773]: I0120 18:50:21.809520 4773 scope.go:117] "RemoveContainer" containerID="7ccb7f05b32cc6a8cf92a861d7cf4258f107127e8f0f1d25df715a6c2f51b909" Jan 20 18:50:21 crc kubenswrapper[4773]: E0120 18:50:21.809867 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ccb7f05b32cc6a8cf92a861d7cf4258f107127e8f0f1d25df715a6c2f51b909\": container with ID starting with 7ccb7f05b32cc6a8cf92a861d7cf4258f107127e8f0f1d25df715a6c2f51b909 not found: ID does not exist" containerID="7ccb7f05b32cc6a8cf92a861d7cf4258f107127e8f0f1d25df715a6c2f51b909" Jan 
20 18:50:21 crc kubenswrapper[4773]: I0120 18:50:21.809918 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ccb7f05b32cc6a8cf92a861d7cf4258f107127e8f0f1d25df715a6c2f51b909"} err="failed to get container status \"7ccb7f05b32cc6a8cf92a861d7cf4258f107127e8f0f1d25df715a6c2f51b909\": rpc error: code = NotFound desc = could not find container \"7ccb7f05b32cc6a8cf92a861d7cf4258f107127e8f0f1d25df715a6c2f51b909\": container with ID starting with 7ccb7f05b32cc6a8cf92a861d7cf4258f107127e8f0f1d25df715a6c2f51b909 not found: ID does not exist" Jan 20 18:50:21 crc kubenswrapper[4773]: I0120 18:50:21.922123 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-9b66d8476-cqhrd"] Jan 20 18:50:21 crc kubenswrapper[4773]: I0120 18:50:21.929579 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-9b66d8476-cqhrd"] Jan 20 18:50:22 crc kubenswrapper[4773]: I0120 18:50:22.602893 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"06b19db9-fa8b-46db-a1fd-204fd44c86a5","Type":"ContainerStarted","Data":"3e022aa1b9dc9ce5e0ec757124ea29b13ef276de6c54d8c1322562dd4eb6b0e8"} Jan 20 18:50:23 crc kubenswrapper[4773]: I0120 18:50:23.458143 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49df8cea-026f-497b-baae-a6a09452aa3d" path="/var/lib/kubelet/pods/49df8cea-026f-497b-baae-a6a09452aa3d/volumes" Jan 20 18:50:23 crc kubenswrapper[4773]: I0120 18:50:23.599523 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-gvh4b"] Jan 20 18:50:23 crc kubenswrapper[4773]: E0120 18:50:23.600354 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49df8cea-026f-497b-baae-a6a09452aa3d" containerName="horizon-log" Jan 20 18:50:23 crc kubenswrapper[4773]: I0120 18:50:23.600372 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="49df8cea-026f-497b-baae-a6a09452aa3d" 
containerName="horizon-log" Jan 20 18:50:23 crc kubenswrapper[4773]: E0120 18:50:23.600402 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49df8cea-026f-497b-baae-a6a09452aa3d" containerName="horizon" Jan 20 18:50:23 crc kubenswrapper[4773]: I0120 18:50:23.600409 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="49df8cea-026f-497b-baae-a6a09452aa3d" containerName="horizon" Jan 20 18:50:23 crc kubenswrapper[4773]: I0120 18:50:23.600729 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="49df8cea-026f-497b-baae-a6a09452aa3d" containerName="horizon-log" Jan 20 18:50:23 crc kubenswrapper[4773]: I0120 18:50:23.600751 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="49df8cea-026f-497b-baae-a6a09452aa3d" containerName="horizon" Jan 20 18:50:23 crc kubenswrapper[4773]: I0120 18:50:23.601487 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-gvh4b" Jan 20 18:50:23 crc kubenswrapper[4773]: I0120 18:50:23.619545 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Jan 20 18:50:23 crc kubenswrapper[4773]: I0120 18:50:23.620233 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-bxxbt" Jan 20 18:50:23 crc kubenswrapper[4773]: I0120 18:50:23.620346 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Jan 20 18:50:23 crc kubenswrapper[4773]: I0120 18:50:23.659719 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-gvh4b"] Jan 20 18:50:23 crc kubenswrapper[4773]: I0120 18:50:23.713183 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/414429bc-e43b-42f9-8f49-8bc7c4a0ecf4-scripts\") pod \"nova-cell0-conductor-db-sync-gvh4b\" (UID: 
\"414429bc-e43b-42f9-8f49-8bc7c4a0ecf4\") " pod="openstack/nova-cell0-conductor-db-sync-gvh4b" Jan 20 18:50:23 crc kubenswrapper[4773]: I0120 18:50:23.713242 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/414429bc-e43b-42f9-8f49-8bc7c4a0ecf4-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-gvh4b\" (UID: \"414429bc-e43b-42f9-8f49-8bc7c4a0ecf4\") " pod="openstack/nova-cell0-conductor-db-sync-gvh4b" Jan 20 18:50:23 crc kubenswrapper[4773]: I0120 18:50:23.713299 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqrtd\" (UniqueName: \"kubernetes.io/projected/414429bc-e43b-42f9-8f49-8bc7c4a0ecf4-kube-api-access-mqrtd\") pod \"nova-cell0-conductor-db-sync-gvh4b\" (UID: \"414429bc-e43b-42f9-8f49-8bc7c4a0ecf4\") " pod="openstack/nova-cell0-conductor-db-sync-gvh4b" Jan 20 18:50:23 crc kubenswrapper[4773]: I0120 18:50:23.713542 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/414429bc-e43b-42f9-8f49-8bc7c4a0ecf4-config-data\") pod \"nova-cell0-conductor-db-sync-gvh4b\" (UID: \"414429bc-e43b-42f9-8f49-8bc7c4a0ecf4\") " pod="openstack/nova-cell0-conductor-db-sync-gvh4b" Jan 20 18:50:23 crc kubenswrapper[4773]: I0120 18:50:23.815567 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/414429bc-e43b-42f9-8f49-8bc7c4a0ecf4-scripts\") pod \"nova-cell0-conductor-db-sync-gvh4b\" (UID: \"414429bc-e43b-42f9-8f49-8bc7c4a0ecf4\") " pod="openstack/nova-cell0-conductor-db-sync-gvh4b" Jan 20 18:50:23 crc kubenswrapper[4773]: I0120 18:50:23.816407 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/414429bc-e43b-42f9-8f49-8bc7c4a0ecf4-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-gvh4b\" (UID: \"414429bc-e43b-42f9-8f49-8bc7c4a0ecf4\") " pod="openstack/nova-cell0-conductor-db-sync-gvh4b" Jan 20 18:50:23 crc kubenswrapper[4773]: I0120 18:50:23.816456 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqrtd\" (UniqueName: \"kubernetes.io/projected/414429bc-e43b-42f9-8f49-8bc7c4a0ecf4-kube-api-access-mqrtd\") pod \"nova-cell0-conductor-db-sync-gvh4b\" (UID: \"414429bc-e43b-42f9-8f49-8bc7c4a0ecf4\") " pod="openstack/nova-cell0-conductor-db-sync-gvh4b" Jan 20 18:50:23 crc kubenswrapper[4773]: I0120 18:50:23.816673 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/414429bc-e43b-42f9-8f49-8bc7c4a0ecf4-config-data\") pod \"nova-cell0-conductor-db-sync-gvh4b\" (UID: \"414429bc-e43b-42f9-8f49-8bc7c4a0ecf4\") " pod="openstack/nova-cell0-conductor-db-sync-gvh4b" Jan 20 18:50:23 crc kubenswrapper[4773]: I0120 18:50:23.822339 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/414429bc-e43b-42f9-8f49-8bc7c4a0ecf4-scripts\") pod \"nova-cell0-conductor-db-sync-gvh4b\" (UID: \"414429bc-e43b-42f9-8f49-8bc7c4a0ecf4\") " pod="openstack/nova-cell0-conductor-db-sync-gvh4b" Jan 20 18:50:23 crc kubenswrapper[4773]: I0120 18:50:23.822585 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/414429bc-e43b-42f9-8f49-8bc7c4a0ecf4-config-data\") pod \"nova-cell0-conductor-db-sync-gvh4b\" (UID: \"414429bc-e43b-42f9-8f49-8bc7c4a0ecf4\") " pod="openstack/nova-cell0-conductor-db-sync-gvh4b" Jan 20 18:50:23 crc kubenswrapper[4773]: I0120 18:50:23.822587 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/414429bc-e43b-42f9-8f49-8bc7c4a0ecf4-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-gvh4b\" (UID: \"414429bc-e43b-42f9-8f49-8bc7c4a0ecf4\") " pod="openstack/nova-cell0-conductor-db-sync-gvh4b" Jan 20 18:50:23 crc kubenswrapper[4773]: I0120 18:50:23.833702 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqrtd\" (UniqueName: \"kubernetes.io/projected/414429bc-e43b-42f9-8f49-8bc7c4a0ecf4-kube-api-access-mqrtd\") pod \"nova-cell0-conductor-db-sync-gvh4b\" (UID: \"414429bc-e43b-42f9-8f49-8bc7c4a0ecf4\") " pod="openstack/nova-cell0-conductor-db-sync-gvh4b" Jan 20 18:50:23 crc kubenswrapper[4773]: I0120 18:50:23.965300 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-gvh4b" Jan 20 18:50:24 crc kubenswrapper[4773]: I0120 18:50:24.472887 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-gvh4b"] Jan 20 18:50:24 crc kubenswrapper[4773]: I0120 18:50:24.649671 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"06b19db9-fa8b-46db-a1fd-204fd44c86a5","Type":"ContainerStarted","Data":"ab578131caeb84986c417f3673f2879b496e7defc6183cbb791021b2069c60c2"} Jan 20 18:50:24 crc kubenswrapper[4773]: I0120 18:50:24.650916 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 20 18:50:24 crc kubenswrapper[4773]: I0120 18:50:24.652023 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-gvh4b" event={"ID":"414429bc-e43b-42f9-8f49-8bc7c4a0ecf4","Type":"ContainerStarted","Data":"c5dfab356b97be1469df8fe93ed3098db2831ed8b05cbbfa693e39760e22278b"} Jan 20 18:50:24 crc kubenswrapper[4773]: I0120 18:50:24.676748 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.604609129 
podStartE2EDuration="6.67672198s" podCreationTimestamp="2026-01-20 18:50:18 +0000 UTC" firstStartedPulling="2026-01-20 18:50:19.388058042 +0000 UTC m=+1212.309871056" lastFinishedPulling="2026-01-20 18:50:23.460170883 +0000 UTC m=+1216.381983907" observedRunningTime="2026-01-20 18:50:24.672491709 +0000 UTC m=+1217.594304733" watchObservedRunningTime="2026-01-20 18:50:24.67672198 +0000 UTC m=+1217.598535024" Jan 20 18:50:30 crc kubenswrapper[4773]: I0120 18:50:30.710783 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 20 18:50:30 crc kubenswrapper[4773]: I0120 18:50:30.711531 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="06b19db9-fa8b-46db-a1fd-204fd44c86a5" containerName="ceilometer-central-agent" containerID="cri-o://3a15a5d730708632f96ecf7e5c3e4e5a1f04f422737b4d2afb09d5b06dd36275" gracePeriod=30 Jan 20 18:50:30 crc kubenswrapper[4773]: I0120 18:50:30.711639 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="06b19db9-fa8b-46db-a1fd-204fd44c86a5" containerName="proxy-httpd" containerID="cri-o://ab578131caeb84986c417f3673f2879b496e7defc6183cbb791021b2069c60c2" gracePeriod=30 Jan 20 18:50:30 crc kubenswrapper[4773]: I0120 18:50:30.711676 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="06b19db9-fa8b-46db-a1fd-204fd44c86a5" containerName="sg-core" containerID="cri-o://3e022aa1b9dc9ce5e0ec757124ea29b13ef276de6c54d8c1322562dd4eb6b0e8" gracePeriod=30 Jan 20 18:50:30 crc kubenswrapper[4773]: I0120 18:50:30.711709 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="06b19db9-fa8b-46db-a1fd-204fd44c86a5" containerName="ceilometer-notification-agent" containerID="cri-o://89ef9b3d9531aa5fb9cada883bd6d34709aa175f97980c0cc5f77140fe2f0651" gracePeriod=30 Jan 20 18:50:31 crc kubenswrapper[4773]: 
I0120 18:50:31.731047 4773 generic.go:334] "Generic (PLEG): container finished" podID="06b19db9-fa8b-46db-a1fd-204fd44c86a5" containerID="ab578131caeb84986c417f3673f2879b496e7defc6183cbb791021b2069c60c2" exitCode=0 Jan 20 18:50:31 crc kubenswrapper[4773]: I0120 18:50:31.731363 4773 generic.go:334] "Generic (PLEG): container finished" podID="06b19db9-fa8b-46db-a1fd-204fd44c86a5" containerID="3e022aa1b9dc9ce5e0ec757124ea29b13ef276de6c54d8c1322562dd4eb6b0e8" exitCode=2 Jan 20 18:50:31 crc kubenswrapper[4773]: I0120 18:50:31.731374 4773 generic.go:334] "Generic (PLEG): container finished" podID="06b19db9-fa8b-46db-a1fd-204fd44c86a5" containerID="3a15a5d730708632f96ecf7e5c3e4e5a1f04f422737b4d2afb09d5b06dd36275" exitCode=0 Jan 20 18:50:31 crc kubenswrapper[4773]: I0120 18:50:31.731099 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"06b19db9-fa8b-46db-a1fd-204fd44c86a5","Type":"ContainerDied","Data":"ab578131caeb84986c417f3673f2879b496e7defc6183cbb791021b2069c60c2"} Jan 20 18:50:31 crc kubenswrapper[4773]: I0120 18:50:31.731434 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"06b19db9-fa8b-46db-a1fd-204fd44c86a5","Type":"ContainerDied","Data":"3e022aa1b9dc9ce5e0ec757124ea29b13ef276de6c54d8c1322562dd4eb6b0e8"} Jan 20 18:50:31 crc kubenswrapper[4773]: I0120 18:50:31.731466 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"06b19db9-fa8b-46db-a1fd-204fd44c86a5","Type":"ContainerDied","Data":"3a15a5d730708632f96ecf7e5c3e4e5a1f04f422737b4d2afb09d5b06dd36275"} Jan 20 18:50:33 crc kubenswrapper[4773]: I0120 18:50:33.757225 4773 generic.go:334] "Generic (PLEG): container finished" podID="06b19db9-fa8b-46db-a1fd-204fd44c86a5" containerID="89ef9b3d9531aa5fb9cada883bd6d34709aa175f97980c0cc5f77140fe2f0651" exitCode=0 Jan 20 18:50:33 crc kubenswrapper[4773]: I0120 18:50:33.757312 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"06b19db9-fa8b-46db-a1fd-204fd44c86a5","Type":"ContainerDied","Data":"89ef9b3d9531aa5fb9cada883bd6d34709aa175f97980c0cc5f77140fe2f0651"} Jan 20 18:50:35 crc kubenswrapper[4773]: I0120 18:50:35.414393 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 20 18:50:35 crc kubenswrapper[4773]: I0120 18:50:35.575315 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/06b19db9-fa8b-46db-a1fd-204fd44c86a5-scripts\") pod \"06b19db9-fa8b-46db-a1fd-204fd44c86a5\" (UID: \"06b19db9-fa8b-46db-a1fd-204fd44c86a5\") " Jan 20 18:50:35 crc kubenswrapper[4773]: I0120 18:50:35.575666 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/06b19db9-fa8b-46db-a1fd-204fd44c86a5-run-httpd\") pod \"06b19db9-fa8b-46db-a1fd-204fd44c86a5\" (UID: \"06b19db9-fa8b-46db-a1fd-204fd44c86a5\") " Jan 20 18:50:35 crc kubenswrapper[4773]: I0120 18:50:35.575710 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06b19db9-fa8b-46db-a1fd-204fd44c86a5-config-data\") pod \"06b19db9-fa8b-46db-a1fd-204fd44c86a5\" (UID: \"06b19db9-fa8b-46db-a1fd-204fd44c86a5\") " Jan 20 18:50:35 crc kubenswrapper[4773]: I0120 18:50:35.575760 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8lr7x\" (UniqueName: \"kubernetes.io/projected/06b19db9-fa8b-46db-a1fd-204fd44c86a5-kube-api-access-8lr7x\") pod \"06b19db9-fa8b-46db-a1fd-204fd44c86a5\" (UID: \"06b19db9-fa8b-46db-a1fd-204fd44c86a5\") " Jan 20 18:50:35 crc kubenswrapper[4773]: I0120 18:50:35.575883 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/06b19db9-fa8b-46db-a1fd-204fd44c86a5-log-httpd\") pod 
\"06b19db9-fa8b-46db-a1fd-204fd44c86a5\" (UID: \"06b19db9-fa8b-46db-a1fd-204fd44c86a5\") " Jan 20 18:50:35 crc kubenswrapper[4773]: I0120 18:50:35.575908 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06b19db9-fa8b-46db-a1fd-204fd44c86a5-combined-ca-bundle\") pod \"06b19db9-fa8b-46db-a1fd-204fd44c86a5\" (UID: \"06b19db9-fa8b-46db-a1fd-204fd44c86a5\") " Jan 20 18:50:35 crc kubenswrapper[4773]: I0120 18:50:35.575963 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/06b19db9-fa8b-46db-a1fd-204fd44c86a5-sg-core-conf-yaml\") pod \"06b19db9-fa8b-46db-a1fd-204fd44c86a5\" (UID: \"06b19db9-fa8b-46db-a1fd-204fd44c86a5\") " Jan 20 18:50:35 crc kubenswrapper[4773]: I0120 18:50:35.576438 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/06b19db9-fa8b-46db-a1fd-204fd44c86a5-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "06b19db9-fa8b-46db-a1fd-204fd44c86a5" (UID: "06b19db9-fa8b-46db-a1fd-204fd44c86a5"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:50:35 crc kubenswrapper[4773]: I0120 18:50:35.576531 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/06b19db9-fa8b-46db-a1fd-204fd44c86a5-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "06b19db9-fa8b-46db-a1fd-204fd44c86a5" (UID: "06b19db9-fa8b-46db-a1fd-204fd44c86a5"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:50:35 crc kubenswrapper[4773]: I0120 18:50:35.577124 4773 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/06b19db9-fa8b-46db-a1fd-204fd44c86a5-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 18:50:35 crc kubenswrapper[4773]: I0120 18:50:35.577144 4773 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/06b19db9-fa8b-46db-a1fd-204fd44c86a5-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 18:50:35 crc kubenswrapper[4773]: I0120 18:50:35.581397 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06b19db9-fa8b-46db-a1fd-204fd44c86a5-kube-api-access-8lr7x" (OuterVolumeSpecName: "kube-api-access-8lr7x") pod "06b19db9-fa8b-46db-a1fd-204fd44c86a5" (UID: "06b19db9-fa8b-46db-a1fd-204fd44c86a5"). InnerVolumeSpecName "kube-api-access-8lr7x". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:50:35 crc kubenswrapper[4773]: I0120 18:50:35.581630 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06b19db9-fa8b-46db-a1fd-204fd44c86a5-scripts" (OuterVolumeSpecName: "scripts") pod "06b19db9-fa8b-46db-a1fd-204fd44c86a5" (UID: "06b19db9-fa8b-46db-a1fd-204fd44c86a5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:50:35 crc kubenswrapper[4773]: I0120 18:50:35.602340 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06b19db9-fa8b-46db-a1fd-204fd44c86a5-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "06b19db9-fa8b-46db-a1fd-204fd44c86a5" (UID: "06b19db9-fa8b-46db-a1fd-204fd44c86a5"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:50:35 crc kubenswrapper[4773]: I0120 18:50:35.640862 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06b19db9-fa8b-46db-a1fd-204fd44c86a5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "06b19db9-fa8b-46db-a1fd-204fd44c86a5" (UID: "06b19db9-fa8b-46db-a1fd-204fd44c86a5"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:50:35 crc kubenswrapper[4773]: I0120 18:50:35.672294 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06b19db9-fa8b-46db-a1fd-204fd44c86a5-config-data" (OuterVolumeSpecName: "config-data") pod "06b19db9-fa8b-46db-a1fd-204fd44c86a5" (UID: "06b19db9-fa8b-46db-a1fd-204fd44c86a5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:50:35 crc kubenswrapper[4773]: I0120 18:50:35.678560 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06b19db9-fa8b-46db-a1fd-204fd44c86a5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 18:50:35 crc kubenswrapper[4773]: I0120 18:50:35.678592 4773 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/06b19db9-fa8b-46db-a1fd-204fd44c86a5-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 20 18:50:35 crc kubenswrapper[4773]: I0120 18:50:35.678604 4773 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/06b19db9-fa8b-46db-a1fd-204fd44c86a5-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 18:50:35 crc kubenswrapper[4773]: I0120 18:50:35.678613 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06b19db9-fa8b-46db-a1fd-204fd44c86a5-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 18:50:35 
crc kubenswrapper[4773]: I0120 18:50:35.678622 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8lr7x\" (UniqueName: \"kubernetes.io/projected/06b19db9-fa8b-46db-a1fd-204fd44c86a5-kube-api-access-8lr7x\") on node \"crc\" DevicePath \"\"" Jan 20 18:50:35 crc kubenswrapper[4773]: I0120 18:50:35.778810 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-gvh4b" event={"ID":"414429bc-e43b-42f9-8f49-8bc7c4a0ecf4","Type":"ContainerStarted","Data":"ee4d9edf9c01606465809a2b2e95bca37087aff51d7e53e121d12c29841bd18d"} Jan 20 18:50:35 crc kubenswrapper[4773]: I0120 18:50:35.782631 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"06b19db9-fa8b-46db-a1fd-204fd44c86a5","Type":"ContainerDied","Data":"e5ac9b64580a1a30d80a522428972a274182638cad0934f179acc61a069da4b1"} Jan 20 18:50:35 crc kubenswrapper[4773]: I0120 18:50:35.782680 4773 scope.go:117] "RemoveContainer" containerID="ab578131caeb84986c417f3673f2879b496e7defc6183cbb791021b2069c60c2" Jan 20 18:50:35 crc kubenswrapper[4773]: I0120 18:50:35.782786 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 20 18:50:35 crc kubenswrapper[4773]: I0120 18:50:35.804780 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-gvh4b" podStartSLOduration=2.049423102 podStartE2EDuration="12.804760715s" podCreationTimestamp="2026-01-20 18:50:23 +0000 UTC" firstStartedPulling="2026-01-20 18:50:24.490117729 +0000 UTC m=+1217.411930753" lastFinishedPulling="2026-01-20 18:50:35.245455342 +0000 UTC m=+1228.167268366" observedRunningTime="2026-01-20 18:50:35.799636802 +0000 UTC m=+1228.721449826" watchObservedRunningTime="2026-01-20 18:50:35.804760715 +0000 UTC m=+1228.726573739" Jan 20 18:50:35 crc kubenswrapper[4773]: I0120 18:50:35.807064 4773 scope.go:117] "RemoveContainer" containerID="3e022aa1b9dc9ce5e0ec757124ea29b13ef276de6c54d8c1322562dd4eb6b0e8" Jan 20 18:50:35 crc kubenswrapper[4773]: I0120 18:50:35.835134 4773 scope.go:117] "RemoveContainer" containerID="89ef9b3d9531aa5fb9cada883bd6d34709aa175f97980c0cc5f77140fe2f0651" Jan 20 18:50:35 crc kubenswrapper[4773]: I0120 18:50:35.845539 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 20 18:50:35 crc kubenswrapper[4773]: I0120 18:50:35.857823 4773 scope.go:117] "RemoveContainer" containerID="3a15a5d730708632f96ecf7e5c3e4e5a1f04f422737b4d2afb09d5b06dd36275" Jan 20 18:50:35 crc kubenswrapper[4773]: I0120 18:50:35.858004 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 20 18:50:35 crc kubenswrapper[4773]: I0120 18:50:35.872041 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 20 18:50:35 crc kubenswrapper[4773]: E0120 18:50:35.872471 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06b19db9-fa8b-46db-a1fd-204fd44c86a5" containerName="sg-core" Jan 20 18:50:35 crc kubenswrapper[4773]: I0120 18:50:35.872491 4773 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="06b19db9-fa8b-46db-a1fd-204fd44c86a5" containerName="sg-core" Jan 20 18:50:35 crc kubenswrapper[4773]: E0120 18:50:35.872513 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06b19db9-fa8b-46db-a1fd-204fd44c86a5" containerName="ceilometer-central-agent" Jan 20 18:50:35 crc kubenswrapper[4773]: I0120 18:50:35.872521 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="06b19db9-fa8b-46db-a1fd-204fd44c86a5" containerName="ceilometer-central-agent" Jan 20 18:50:35 crc kubenswrapper[4773]: E0120 18:50:35.872539 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06b19db9-fa8b-46db-a1fd-204fd44c86a5" containerName="proxy-httpd" Jan 20 18:50:35 crc kubenswrapper[4773]: I0120 18:50:35.872546 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="06b19db9-fa8b-46db-a1fd-204fd44c86a5" containerName="proxy-httpd" Jan 20 18:50:35 crc kubenswrapper[4773]: E0120 18:50:35.872562 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06b19db9-fa8b-46db-a1fd-204fd44c86a5" containerName="ceilometer-notification-agent" Jan 20 18:50:35 crc kubenswrapper[4773]: I0120 18:50:35.872570 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="06b19db9-fa8b-46db-a1fd-204fd44c86a5" containerName="ceilometer-notification-agent" Jan 20 18:50:35 crc kubenswrapper[4773]: I0120 18:50:35.872817 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="06b19db9-fa8b-46db-a1fd-204fd44c86a5" containerName="ceilometer-notification-agent" Jan 20 18:50:35 crc kubenswrapper[4773]: I0120 18:50:35.872842 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="06b19db9-fa8b-46db-a1fd-204fd44c86a5" containerName="sg-core" Jan 20 18:50:35 crc kubenswrapper[4773]: I0120 18:50:35.872851 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="06b19db9-fa8b-46db-a1fd-204fd44c86a5" containerName="proxy-httpd" Jan 20 18:50:35 crc kubenswrapper[4773]: I0120 18:50:35.872861 4773 
memory_manager.go:354] "RemoveStaleState removing state" podUID="06b19db9-fa8b-46db-a1fd-204fd44c86a5" containerName="ceilometer-central-agent" Jan 20 18:50:35 crc kubenswrapper[4773]: I0120 18:50:35.874533 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 20 18:50:35 crc kubenswrapper[4773]: I0120 18:50:35.877077 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 20 18:50:35 crc kubenswrapper[4773]: I0120 18:50:35.877302 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 20 18:50:35 crc kubenswrapper[4773]: I0120 18:50:35.881772 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 20 18:50:35 crc kubenswrapper[4773]: I0120 18:50:35.983033 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72607859-440f-410c-baaf-0bef4e81dc3c-scripts\") pod \"ceilometer-0\" (UID: \"72607859-440f-410c-baaf-0bef4e81dc3c\") " pod="openstack/ceilometer-0" Jan 20 18:50:35 crc kubenswrapper[4773]: I0120 18:50:35.983154 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72607859-440f-410c-baaf-0bef4e81dc3c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"72607859-440f-410c-baaf-0bef4e81dc3c\") " pod="openstack/ceilometer-0" Jan 20 18:50:35 crc kubenswrapper[4773]: I0120 18:50:35.983229 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/72607859-440f-410c-baaf-0bef4e81dc3c-log-httpd\") pod \"ceilometer-0\" (UID: \"72607859-440f-410c-baaf-0bef4e81dc3c\") " pod="openstack/ceilometer-0" Jan 20 18:50:35 crc kubenswrapper[4773]: I0120 18:50:35.983257 4773 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85lhw\" (UniqueName: \"kubernetes.io/projected/72607859-440f-410c-baaf-0bef4e81dc3c-kube-api-access-85lhw\") pod \"ceilometer-0\" (UID: \"72607859-440f-410c-baaf-0bef4e81dc3c\") " pod="openstack/ceilometer-0" Jan 20 18:50:35 crc kubenswrapper[4773]: I0120 18:50:35.983324 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/72607859-440f-410c-baaf-0bef4e81dc3c-run-httpd\") pod \"ceilometer-0\" (UID: \"72607859-440f-410c-baaf-0bef4e81dc3c\") " pod="openstack/ceilometer-0" Jan 20 18:50:35 crc kubenswrapper[4773]: I0120 18:50:35.983367 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/72607859-440f-410c-baaf-0bef4e81dc3c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"72607859-440f-410c-baaf-0bef4e81dc3c\") " pod="openstack/ceilometer-0" Jan 20 18:50:35 crc kubenswrapper[4773]: I0120 18:50:35.983429 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72607859-440f-410c-baaf-0bef4e81dc3c-config-data\") pod \"ceilometer-0\" (UID: \"72607859-440f-410c-baaf-0bef4e81dc3c\") " pod="openstack/ceilometer-0" Jan 20 18:50:36 crc kubenswrapper[4773]: I0120 18:50:36.085088 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72607859-440f-410c-baaf-0bef4e81dc3c-scripts\") pod \"ceilometer-0\" (UID: \"72607859-440f-410c-baaf-0bef4e81dc3c\") " pod="openstack/ceilometer-0" Jan 20 18:50:36 crc kubenswrapper[4773]: I0120 18:50:36.085141 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/72607859-440f-410c-baaf-0bef4e81dc3c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"72607859-440f-410c-baaf-0bef4e81dc3c\") " pod="openstack/ceilometer-0" Jan 20 18:50:36 crc kubenswrapper[4773]: I0120 18:50:36.085178 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/72607859-440f-410c-baaf-0bef4e81dc3c-log-httpd\") pod \"ceilometer-0\" (UID: \"72607859-440f-410c-baaf-0bef4e81dc3c\") " pod="openstack/ceilometer-0" Jan 20 18:50:36 crc kubenswrapper[4773]: I0120 18:50:36.085196 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85lhw\" (UniqueName: \"kubernetes.io/projected/72607859-440f-410c-baaf-0bef4e81dc3c-kube-api-access-85lhw\") pod \"ceilometer-0\" (UID: \"72607859-440f-410c-baaf-0bef4e81dc3c\") " pod="openstack/ceilometer-0" Jan 20 18:50:36 crc kubenswrapper[4773]: I0120 18:50:36.085240 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/72607859-440f-410c-baaf-0bef4e81dc3c-run-httpd\") pod \"ceilometer-0\" (UID: \"72607859-440f-410c-baaf-0bef4e81dc3c\") " pod="openstack/ceilometer-0" Jan 20 18:50:36 crc kubenswrapper[4773]: I0120 18:50:36.085271 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/72607859-440f-410c-baaf-0bef4e81dc3c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"72607859-440f-410c-baaf-0bef4e81dc3c\") " pod="openstack/ceilometer-0" Jan 20 18:50:36 crc kubenswrapper[4773]: I0120 18:50:36.085299 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72607859-440f-410c-baaf-0bef4e81dc3c-config-data\") pod \"ceilometer-0\" (UID: \"72607859-440f-410c-baaf-0bef4e81dc3c\") " pod="openstack/ceilometer-0" Jan 20 18:50:36 crc kubenswrapper[4773]: 
I0120 18:50:36.085784 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/72607859-440f-410c-baaf-0bef4e81dc3c-run-httpd\") pod \"ceilometer-0\" (UID: \"72607859-440f-410c-baaf-0bef4e81dc3c\") " pod="openstack/ceilometer-0" Jan 20 18:50:36 crc kubenswrapper[4773]: I0120 18:50:36.086107 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/72607859-440f-410c-baaf-0bef4e81dc3c-log-httpd\") pod \"ceilometer-0\" (UID: \"72607859-440f-410c-baaf-0bef4e81dc3c\") " pod="openstack/ceilometer-0" Jan 20 18:50:36 crc kubenswrapper[4773]: I0120 18:50:36.089304 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/72607859-440f-410c-baaf-0bef4e81dc3c-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"72607859-440f-410c-baaf-0bef4e81dc3c\") " pod="openstack/ceilometer-0" Jan 20 18:50:36 crc kubenswrapper[4773]: I0120 18:50:36.089503 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72607859-440f-410c-baaf-0bef4e81dc3c-scripts\") pod \"ceilometer-0\" (UID: \"72607859-440f-410c-baaf-0bef4e81dc3c\") " pod="openstack/ceilometer-0" Jan 20 18:50:36 crc kubenswrapper[4773]: I0120 18:50:36.089596 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72607859-440f-410c-baaf-0bef4e81dc3c-config-data\") pod \"ceilometer-0\" (UID: \"72607859-440f-410c-baaf-0bef4e81dc3c\") " pod="openstack/ceilometer-0" Jan 20 18:50:36 crc kubenswrapper[4773]: I0120 18:50:36.090076 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72607859-440f-410c-baaf-0bef4e81dc3c-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"72607859-440f-410c-baaf-0bef4e81dc3c\") " 
pod="openstack/ceilometer-0" Jan 20 18:50:36 crc kubenswrapper[4773]: I0120 18:50:36.102201 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85lhw\" (UniqueName: \"kubernetes.io/projected/72607859-440f-410c-baaf-0bef4e81dc3c-kube-api-access-85lhw\") pod \"ceilometer-0\" (UID: \"72607859-440f-410c-baaf-0bef4e81dc3c\") " pod="openstack/ceilometer-0" Jan 20 18:50:36 crc kubenswrapper[4773]: I0120 18:50:36.192939 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 20 18:50:36 crc kubenswrapper[4773]: W0120 18:50:36.621214 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod72607859_440f_410c_baaf_0bef4e81dc3c.slice/crio-9859fcda55fda0eb635ac19d4fc026613ba2fbf4f8c6e8bec1aded60095fd2e7 WatchSource:0}: Error finding container 9859fcda55fda0eb635ac19d4fc026613ba2fbf4f8c6e8bec1aded60095fd2e7: Status 404 returned error can't find the container with id 9859fcda55fda0eb635ac19d4fc026613ba2fbf4f8c6e8bec1aded60095fd2e7 Jan 20 18:50:36 crc kubenswrapper[4773]: I0120 18:50:36.622088 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 20 18:50:36 crc kubenswrapper[4773]: I0120 18:50:36.792132 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"72607859-440f-410c-baaf-0bef4e81dc3c","Type":"ContainerStarted","Data":"9859fcda55fda0eb635ac19d4fc026613ba2fbf4f8c6e8bec1aded60095fd2e7"} Jan 20 18:50:37 crc kubenswrapper[4773]: I0120 18:50:37.458811 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06b19db9-fa8b-46db-a1fd-204fd44c86a5" path="/var/lib/kubelet/pods/06b19db9-fa8b-46db-a1fd-204fd44c86a5/volumes" Jan 20 18:50:37 crc kubenswrapper[4773]: I0120 18:50:37.801360 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"72607859-440f-410c-baaf-0bef4e81dc3c","Type":"ContainerStarted","Data":"b5e5e815b1e0cba44d73aa09525aba15062e4015bd8fe0d34407ef04614bbeb0"} Jan 20 18:50:38 crc kubenswrapper[4773]: I0120 18:50:38.811891 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"72607859-440f-410c-baaf-0bef4e81dc3c","Type":"ContainerStarted","Data":"82cc775c2fda5c90112e842cd2111097676d3e634ddaeee42caeb086c177b242"} Jan 20 18:50:39 crc kubenswrapper[4773]: I0120 18:50:39.827019 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"72607859-440f-410c-baaf-0bef4e81dc3c","Type":"ContainerStarted","Data":"425cd111ee6faee545673b5b331fc18b295c3cd572e5ae3fb91074a3557e8c59"} Jan 20 18:50:41 crc kubenswrapper[4773]: I0120 18:50:41.844836 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"72607859-440f-410c-baaf-0bef4e81dc3c","Type":"ContainerStarted","Data":"ad64ca91f347f396852af2de556d336a95725fa6462dbec8f4fb1c6414c3e58c"} Jan 20 18:50:41 crc kubenswrapper[4773]: I0120 18:50:41.846530 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 20 18:50:41 crc kubenswrapper[4773]: I0120 18:50:41.865538 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.516273181 podStartE2EDuration="6.865519404s" podCreationTimestamp="2026-01-20 18:50:35 +0000 UTC" firstStartedPulling="2026-01-20 18:50:36.623305286 +0000 UTC m=+1229.545118310" lastFinishedPulling="2026-01-20 18:50:40.972551509 +0000 UTC m=+1233.894364533" observedRunningTime="2026-01-20 18:50:41.863347542 +0000 UTC m=+1234.785160586" watchObservedRunningTime="2026-01-20 18:50:41.865519404 +0000 UTC m=+1234.787332428" Jan 20 18:50:42 crc kubenswrapper[4773]: I0120 18:50:42.318749 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 20 18:50:43 crc 
kubenswrapper[4773]: I0120 18:50:43.858416 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="72607859-440f-410c-baaf-0bef4e81dc3c" containerName="ceilometer-notification-agent" containerID="cri-o://82cc775c2fda5c90112e842cd2111097676d3e634ddaeee42caeb086c177b242" gracePeriod=30 Jan 20 18:50:43 crc kubenswrapper[4773]: I0120 18:50:43.858428 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="72607859-440f-410c-baaf-0bef4e81dc3c" containerName="sg-core" containerID="cri-o://425cd111ee6faee545673b5b331fc18b295c3cd572e5ae3fb91074a3557e8c59" gracePeriod=30 Jan 20 18:50:43 crc kubenswrapper[4773]: I0120 18:50:43.858428 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="72607859-440f-410c-baaf-0bef4e81dc3c" containerName="proxy-httpd" containerID="cri-o://ad64ca91f347f396852af2de556d336a95725fa6462dbec8f4fb1c6414c3e58c" gracePeriod=30 Jan 20 18:50:43 crc kubenswrapper[4773]: I0120 18:50:43.859787 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="72607859-440f-410c-baaf-0bef4e81dc3c" containerName="ceilometer-central-agent" containerID="cri-o://b5e5e815b1e0cba44d73aa09525aba15062e4015bd8fe0d34407ef04614bbeb0" gracePeriod=30 Jan 20 18:50:45 crc kubenswrapper[4773]: I0120 18:50:45.770720 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 20 18:50:45 crc kubenswrapper[4773]: I0120 18:50:45.876204 4773 generic.go:334] "Generic (PLEG): container finished" podID="72607859-440f-410c-baaf-0bef4e81dc3c" containerID="ad64ca91f347f396852af2de556d336a95725fa6462dbec8f4fb1c6414c3e58c" exitCode=0 Jan 20 18:50:45 crc kubenswrapper[4773]: I0120 18:50:45.876236 4773 generic.go:334] "Generic (PLEG): container finished" podID="72607859-440f-410c-baaf-0bef4e81dc3c" containerID="425cd111ee6faee545673b5b331fc18b295c3cd572e5ae3fb91074a3557e8c59" exitCode=2 Jan 20 18:50:45 crc kubenswrapper[4773]: I0120 18:50:45.876246 4773 generic.go:334] "Generic (PLEG): container finished" podID="72607859-440f-410c-baaf-0bef4e81dc3c" containerID="82cc775c2fda5c90112e842cd2111097676d3e634ddaeee42caeb086c177b242" exitCode=0 Jan 20 18:50:45 crc kubenswrapper[4773]: I0120 18:50:45.876257 4773 generic.go:334] "Generic (PLEG): container finished" podID="72607859-440f-410c-baaf-0bef4e81dc3c" containerID="b5e5e815b1e0cba44d73aa09525aba15062e4015bd8fe0d34407ef04614bbeb0" exitCode=0 Jan 20 18:50:45 crc kubenswrapper[4773]: I0120 18:50:45.876276 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"72607859-440f-410c-baaf-0bef4e81dc3c","Type":"ContainerDied","Data":"ad64ca91f347f396852af2de556d336a95725fa6462dbec8f4fb1c6414c3e58c"} Jan 20 18:50:45 crc kubenswrapper[4773]: I0120 18:50:45.876293 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 20 18:50:45 crc kubenswrapper[4773]: I0120 18:50:45.876314 4773 scope.go:117] "RemoveContainer" containerID="ad64ca91f347f396852af2de556d336a95725fa6462dbec8f4fb1c6414c3e58c" Jan 20 18:50:45 crc kubenswrapper[4773]: I0120 18:50:45.876302 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"72607859-440f-410c-baaf-0bef4e81dc3c","Type":"ContainerDied","Data":"425cd111ee6faee545673b5b331fc18b295c3cd572e5ae3fb91074a3557e8c59"} Jan 20 18:50:45 crc kubenswrapper[4773]: I0120 18:50:45.876415 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"72607859-440f-410c-baaf-0bef4e81dc3c","Type":"ContainerDied","Data":"82cc775c2fda5c90112e842cd2111097676d3e634ddaeee42caeb086c177b242"} Jan 20 18:50:45 crc kubenswrapper[4773]: I0120 18:50:45.876433 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"72607859-440f-410c-baaf-0bef4e81dc3c","Type":"ContainerDied","Data":"b5e5e815b1e0cba44d73aa09525aba15062e4015bd8fe0d34407ef04614bbeb0"} Jan 20 18:50:45 crc kubenswrapper[4773]: I0120 18:50:45.876446 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"72607859-440f-410c-baaf-0bef4e81dc3c","Type":"ContainerDied","Data":"9859fcda55fda0eb635ac19d4fc026613ba2fbf4f8c6e8bec1aded60095fd2e7"} Jan 20 18:50:45 crc kubenswrapper[4773]: I0120 18:50:45.880762 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72607859-440f-410c-baaf-0bef4e81dc3c-combined-ca-bundle\") pod \"72607859-440f-410c-baaf-0bef4e81dc3c\" (UID: \"72607859-440f-410c-baaf-0bef4e81dc3c\") " Jan 20 18:50:45 crc kubenswrapper[4773]: I0120 18:50:45.880959 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/72607859-440f-410c-baaf-0bef4e81dc3c-config-data\") pod \"72607859-440f-410c-baaf-0bef4e81dc3c\" (UID: \"72607859-440f-410c-baaf-0bef4e81dc3c\") " Jan 20 18:50:45 crc kubenswrapper[4773]: I0120 18:50:45.881019 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72607859-440f-410c-baaf-0bef4e81dc3c-scripts\") pod \"72607859-440f-410c-baaf-0bef4e81dc3c\" (UID: \"72607859-440f-410c-baaf-0bef4e81dc3c\") " Jan 20 18:50:45 crc kubenswrapper[4773]: I0120 18:50:45.881060 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/72607859-440f-410c-baaf-0bef4e81dc3c-sg-core-conf-yaml\") pod \"72607859-440f-410c-baaf-0bef4e81dc3c\" (UID: \"72607859-440f-410c-baaf-0bef4e81dc3c\") " Jan 20 18:50:45 crc kubenswrapper[4773]: I0120 18:50:45.881106 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/72607859-440f-410c-baaf-0bef4e81dc3c-run-httpd\") pod \"72607859-440f-410c-baaf-0bef4e81dc3c\" (UID: \"72607859-440f-410c-baaf-0bef4e81dc3c\") " Jan 20 18:50:45 crc kubenswrapper[4773]: I0120 18:50:45.881137 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-85lhw\" (UniqueName: \"kubernetes.io/projected/72607859-440f-410c-baaf-0bef4e81dc3c-kube-api-access-85lhw\") pod \"72607859-440f-410c-baaf-0bef4e81dc3c\" (UID: \"72607859-440f-410c-baaf-0bef4e81dc3c\") " Jan 20 18:50:45 crc kubenswrapper[4773]: I0120 18:50:45.881214 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/72607859-440f-410c-baaf-0bef4e81dc3c-log-httpd\") pod \"72607859-440f-410c-baaf-0bef4e81dc3c\" (UID: \"72607859-440f-410c-baaf-0bef4e81dc3c\") " Jan 20 18:50:45 crc kubenswrapper[4773]: I0120 18:50:45.881762 4773 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72607859-440f-410c-baaf-0bef4e81dc3c-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "72607859-440f-410c-baaf-0bef4e81dc3c" (UID: "72607859-440f-410c-baaf-0bef4e81dc3c"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:50:45 crc kubenswrapper[4773]: I0120 18:50:45.882536 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72607859-440f-410c-baaf-0bef4e81dc3c-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "72607859-440f-410c-baaf-0bef4e81dc3c" (UID: "72607859-440f-410c-baaf-0bef4e81dc3c"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:50:45 crc kubenswrapper[4773]: I0120 18:50:45.886415 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72607859-440f-410c-baaf-0bef4e81dc3c-scripts" (OuterVolumeSpecName: "scripts") pod "72607859-440f-410c-baaf-0bef4e81dc3c" (UID: "72607859-440f-410c-baaf-0bef4e81dc3c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:50:45 crc kubenswrapper[4773]: I0120 18:50:45.890111 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72607859-440f-410c-baaf-0bef4e81dc3c-kube-api-access-85lhw" (OuterVolumeSpecName: "kube-api-access-85lhw") pod "72607859-440f-410c-baaf-0bef4e81dc3c" (UID: "72607859-440f-410c-baaf-0bef4e81dc3c"). InnerVolumeSpecName "kube-api-access-85lhw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:50:45 crc kubenswrapper[4773]: I0120 18:50:45.898066 4773 scope.go:117] "RemoveContainer" containerID="425cd111ee6faee545673b5b331fc18b295c3cd572e5ae3fb91074a3557e8c59" Jan 20 18:50:45 crc kubenswrapper[4773]: I0120 18:50:45.910372 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72607859-440f-410c-baaf-0bef4e81dc3c-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "72607859-440f-410c-baaf-0bef4e81dc3c" (UID: "72607859-440f-410c-baaf-0bef4e81dc3c"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:50:45 crc kubenswrapper[4773]: I0120 18:50:45.944142 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72607859-440f-410c-baaf-0bef4e81dc3c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "72607859-440f-410c-baaf-0bef4e81dc3c" (UID: "72607859-440f-410c-baaf-0bef4e81dc3c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:50:45 crc kubenswrapper[4773]: I0120 18:50:45.976133 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72607859-440f-410c-baaf-0bef4e81dc3c-config-data" (OuterVolumeSpecName: "config-data") pod "72607859-440f-410c-baaf-0bef4e81dc3c" (UID: "72607859-440f-410c-baaf-0bef4e81dc3c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:50:45 crc kubenswrapper[4773]: I0120 18:50:45.983584 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/72607859-440f-410c-baaf-0bef4e81dc3c-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 18:50:45 crc kubenswrapper[4773]: I0120 18:50:45.983609 4773 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/72607859-440f-410c-baaf-0bef4e81dc3c-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 18:50:45 crc kubenswrapper[4773]: I0120 18:50:45.983618 4773 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/72607859-440f-410c-baaf-0bef4e81dc3c-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 20 18:50:45 crc kubenswrapper[4773]: I0120 18:50:45.983628 4773 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/72607859-440f-410c-baaf-0bef4e81dc3c-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 18:50:45 crc kubenswrapper[4773]: I0120 18:50:45.983638 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-85lhw\" (UniqueName: \"kubernetes.io/projected/72607859-440f-410c-baaf-0bef4e81dc3c-kube-api-access-85lhw\") on node \"crc\" DevicePath \"\"" Jan 20 18:50:45 crc kubenswrapper[4773]: I0120 18:50:45.983646 4773 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/72607859-440f-410c-baaf-0bef4e81dc3c-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 18:50:45 crc kubenswrapper[4773]: I0120 18:50:45.983657 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/72607859-440f-410c-baaf-0bef4e81dc3c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 18:50:45 crc kubenswrapper[4773]: I0120 18:50:45.984841 4773 scope.go:117] 
"RemoveContainer" containerID="82cc775c2fda5c90112e842cd2111097676d3e634ddaeee42caeb086c177b242" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.004267 4773 scope.go:117] "RemoveContainer" containerID="b5e5e815b1e0cba44d73aa09525aba15062e4015bd8fe0d34407ef04614bbeb0" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.021873 4773 scope.go:117] "RemoveContainer" containerID="ad64ca91f347f396852af2de556d336a95725fa6462dbec8f4fb1c6414c3e58c" Jan 20 18:50:46 crc kubenswrapper[4773]: E0120 18:50:46.022376 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad64ca91f347f396852af2de556d336a95725fa6462dbec8f4fb1c6414c3e58c\": container with ID starting with ad64ca91f347f396852af2de556d336a95725fa6462dbec8f4fb1c6414c3e58c not found: ID does not exist" containerID="ad64ca91f347f396852af2de556d336a95725fa6462dbec8f4fb1c6414c3e58c" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.022431 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad64ca91f347f396852af2de556d336a95725fa6462dbec8f4fb1c6414c3e58c"} err="failed to get container status \"ad64ca91f347f396852af2de556d336a95725fa6462dbec8f4fb1c6414c3e58c\": rpc error: code = NotFound desc = could not find container \"ad64ca91f347f396852af2de556d336a95725fa6462dbec8f4fb1c6414c3e58c\": container with ID starting with ad64ca91f347f396852af2de556d336a95725fa6462dbec8f4fb1c6414c3e58c not found: ID does not exist" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.022463 4773 scope.go:117] "RemoveContainer" containerID="425cd111ee6faee545673b5b331fc18b295c3cd572e5ae3fb91074a3557e8c59" Jan 20 18:50:46 crc kubenswrapper[4773]: E0120 18:50:46.022946 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"425cd111ee6faee545673b5b331fc18b295c3cd572e5ae3fb91074a3557e8c59\": container with ID starting with 
425cd111ee6faee545673b5b331fc18b295c3cd572e5ae3fb91074a3557e8c59 not found: ID does not exist" containerID="425cd111ee6faee545673b5b331fc18b295c3cd572e5ae3fb91074a3557e8c59" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.022980 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"425cd111ee6faee545673b5b331fc18b295c3cd572e5ae3fb91074a3557e8c59"} err="failed to get container status \"425cd111ee6faee545673b5b331fc18b295c3cd572e5ae3fb91074a3557e8c59\": rpc error: code = NotFound desc = could not find container \"425cd111ee6faee545673b5b331fc18b295c3cd572e5ae3fb91074a3557e8c59\": container with ID starting with 425cd111ee6faee545673b5b331fc18b295c3cd572e5ae3fb91074a3557e8c59 not found: ID does not exist" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.022995 4773 scope.go:117] "RemoveContainer" containerID="82cc775c2fda5c90112e842cd2111097676d3e634ddaeee42caeb086c177b242" Jan 20 18:50:46 crc kubenswrapper[4773]: E0120 18:50:46.023331 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82cc775c2fda5c90112e842cd2111097676d3e634ddaeee42caeb086c177b242\": container with ID starting with 82cc775c2fda5c90112e842cd2111097676d3e634ddaeee42caeb086c177b242 not found: ID does not exist" containerID="82cc775c2fda5c90112e842cd2111097676d3e634ddaeee42caeb086c177b242" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.023358 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82cc775c2fda5c90112e842cd2111097676d3e634ddaeee42caeb086c177b242"} err="failed to get container status \"82cc775c2fda5c90112e842cd2111097676d3e634ddaeee42caeb086c177b242\": rpc error: code = NotFound desc = could not find container \"82cc775c2fda5c90112e842cd2111097676d3e634ddaeee42caeb086c177b242\": container with ID starting with 82cc775c2fda5c90112e842cd2111097676d3e634ddaeee42caeb086c177b242 not found: ID does not 
exist" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.023412 4773 scope.go:117] "RemoveContainer" containerID="b5e5e815b1e0cba44d73aa09525aba15062e4015bd8fe0d34407ef04614bbeb0" Jan 20 18:50:46 crc kubenswrapper[4773]: E0120 18:50:46.023672 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5e5e815b1e0cba44d73aa09525aba15062e4015bd8fe0d34407ef04614bbeb0\": container with ID starting with b5e5e815b1e0cba44d73aa09525aba15062e4015bd8fe0d34407ef04614bbeb0 not found: ID does not exist" containerID="b5e5e815b1e0cba44d73aa09525aba15062e4015bd8fe0d34407ef04614bbeb0" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.023698 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5e5e815b1e0cba44d73aa09525aba15062e4015bd8fe0d34407ef04614bbeb0"} err="failed to get container status \"b5e5e815b1e0cba44d73aa09525aba15062e4015bd8fe0d34407ef04614bbeb0\": rpc error: code = NotFound desc = could not find container \"b5e5e815b1e0cba44d73aa09525aba15062e4015bd8fe0d34407ef04614bbeb0\": container with ID starting with b5e5e815b1e0cba44d73aa09525aba15062e4015bd8fe0d34407ef04614bbeb0 not found: ID does not exist" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.023717 4773 scope.go:117] "RemoveContainer" containerID="ad64ca91f347f396852af2de556d336a95725fa6462dbec8f4fb1c6414c3e58c" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.024012 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad64ca91f347f396852af2de556d336a95725fa6462dbec8f4fb1c6414c3e58c"} err="failed to get container status \"ad64ca91f347f396852af2de556d336a95725fa6462dbec8f4fb1c6414c3e58c\": rpc error: code = NotFound desc = could not find container \"ad64ca91f347f396852af2de556d336a95725fa6462dbec8f4fb1c6414c3e58c\": container with ID starting with ad64ca91f347f396852af2de556d336a95725fa6462dbec8f4fb1c6414c3e58c not found: ID 
does not exist" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.024035 4773 scope.go:117] "RemoveContainer" containerID="425cd111ee6faee545673b5b331fc18b295c3cd572e5ae3fb91074a3557e8c59" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.024535 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"425cd111ee6faee545673b5b331fc18b295c3cd572e5ae3fb91074a3557e8c59"} err="failed to get container status \"425cd111ee6faee545673b5b331fc18b295c3cd572e5ae3fb91074a3557e8c59\": rpc error: code = NotFound desc = could not find container \"425cd111ee6faee545673b5b331fc18b295c3cd572e5ae3fb91074a3557e8c59\": container with ID starting with 425cd111ee6faee545673b5b331fc18b295c3cd572e5ae3fb91074a3557e8c59 not found: ID does not exist" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.024580 4773 scope.go:117] "RemoveContainer" containerID="82cc775c2fda5c90112e842cd2111097676d3e634ddaeee42caeb086c177b242" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.024925 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82cc775c2fda5c90112e842cd2111097676d3e634ddaeee42caeb086c177b242"} err="failed to get container status \"82cc775c2fda5c90112e842cd2111097676d3e634ddaeee42caeb086c177b242\": rpc error: code = NotFound desc = could not find container \"82cc775c2fda5c90112e842cd2111097676d3e634ddaeee42caeb086c177b242\": container with ID starting with 82cc775c2fda5c90112e842cd2111097676d3e634ddaeee42caeb086c177b242 not found: ID does not exist" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.025043 4773 scope.go:117] "RemoveContainer" containerID="b5e5e815b1e0cba44d73aa09525aba15062e4015bd8fe0d34407ef04614bbeb0" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.026365 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5e5e815b1e0cba44d73aa09525aba15062e4015bd8fe0d34407ef04614bbeb0"} err="failed to get container 
status \"b5e5e815b1e0cba44d73aa09525aba15062e4015bd8fe0d34407ef04614bbeb0\": rpc error: code = NotFound desc = could not find container \"b5e5e815b1e0cba44d73aa09525aba15062e4015bd8fe0d34407ef04614bbeb0\": container with ID starting with b5e5e815b1e0cba44d73aa09525aba15062e4015bd8fe0d34407ef04614bbeb0 not found: ID does not exist" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.026396 4773 scope.go:117] "RemoveContainer" containerID="ad64ca91f347f396852af2de556d336a95725fa6462dbec8f4fb1c6414c3e58c" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.026633 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad64ca91f347f396852af2de556d336a95725fa6462dbec8f4fb1c6414c3e58c"} err="failed to get container status \"ad64ca91f347f396852af2de556d336a95725fa6462dbec8f4fb1c6414c3e58c\": rpc error: code = NotFound desc = could not find container \"ad64ca91f347f396852af2de556d336a95725fa6462dbec8f4fb1c6414c3e58c\": container with ID starting with ad64ca91f347f396852af2de556d336a95725fa6462dbec8f4fb1c6414c3e58c not found: ID does not exist" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.026660 4773 scope.go:117] "RemoveContainer" containerID="425cd111ee6faee545673b5b331fc18b295c3cd572e5ae3fb91074a3557e8c59" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.026897 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"425cd111ee6faee545673b5b331fc18b295c3cd572e5ae3fb91074a3557e8c59"} err="failed to get container status \"425cd111ee6faee545673b5b331fc18b295c3cd572e5ae3fb91074a3557e8c59\": rpc error: code = NotFound desc = could not find container \"425cd111ee6faee545673b5b331fc18b295c3cd572e5ae3fb91074a3557e8c59\": container with ID starting with 425cd111ee6faee545673b5b331fc18b295c3cd572e5ae3fb91074a3557e8c59 not found: ID does not exist" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.026952 4773 scope.go:117] "RemoveContainer" 
containerID="82cc775c2fda5c90112e842cd2111097676d3e634ddaeee42caeb086c177b242" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.027244 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82cc775c2fda5c90112e842cd2111097676d3e634ddaeee42caeb086c177b242"} err="failed to get container status \"82cc775c2fda5c90112e842cd2111097676d3e634ddaeee42caeb086c177b242\": rpc error: code = NotFound desc = could not find container \"82cc775c2fda5c90112e842cd2111097676d3e634ddaeee42caeb086c177b242\": container with ID starting with 82cc775c2fda5c90112e842cd2111097676d3e634ddaeee42caeb086c177b242 not found: ID does not exist" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.027267 4773 scope.go:117] "RemoveContainer" containerID="b5e5e815b1e0cba44d73aa09525aba15062e4015bd8fe0d34407ef04614bbeb0" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.027473 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5e5e815b1e0cba44d73aa09525aba15062e4015bd8fe0d34407ef04614bbeb0"} err="failed to get container status \"b5e5e815b1e0cba44d73aa09525aba15062e4015bd8fe0d34407ef04614bbeb0\": rpc error: code = NotFound desc = could not find container \"b5e5e815b1e0cba44d73aa09525aba15062e4015bd8fe0d34407ef04614bbeb0\": container with ID starting with b5e5e815b1e0cba44d73aa09525aba15062e4015bd8fe0d34407ef04614bbeb0 not found: ID does not exist" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.027494 4773 scope.go:117] "RemoveContainer" containerID="ad64ca91f347f396852af2de556d336a95725fa6462dbec8f4fb1c6414c3e58c" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.027800 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad64ca91f347f396852af2de556d336a95725fa6462dbec8f4fb1c6414c3e58c"} err="failed to get container status \"ad64ca91f347f396852af2de556d336a95725fa6462dbec8f4fb1c6414c3e58c\": rpc error: code = NotFound desc = could 
not find container \"ad64ca91f347f396852af2de556d336a95725fa6462dbec8f4fb1c6414c3e58c\": container with ID starting with ad64ca91f347f396852af2de556d336a95725fa6462dbec8f4fb1c6414c3e58c not found: ID does not exist" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.027826 4773 scope.go:117] "RemoveContainer" containerID="425cd111ee6faee545673b5b331fc18b295c3cd572e5ae3fb91074a3557e8c59" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.028062 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"425cd111ee6faee545673b5b331fc18b295c3cd572e5ae3fb91074a3557e8c59"} err="failed to get container status \"425cd111ee6faee545673b5b331fc18b295c3cd572e5ae3fb91074a3557e8c59\": rpc error: code = NotFound desc = could not find container \"425cd111ee6faee545673b5b331fc18b295c3cd572e5ae3fb91074a3557e8c59\": container with ID starting with 425cd111ee6faee545673b5b331fc18b295c3cd572e5ae3fb91074a3557e8c59 not found: ID does not exist" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.028086 4773 scope.go:117] "RemoveContainer" containerID="82cc775c2fda5c90112e842cd2111097676d3e634ddaeee42caeb086c177b242" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.028293 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82cc775c2fda5c90112e842cd2111097676d3e634ddaeee42caeb086c177b242"} err="failed to get container status \"82cc775c2fda5c90112e842cd2111097676d3e634ddaeee42caeb086c177b242\": rpc error: code = NotFound desc = could not find container \"82cc775c2fda5c90112e842cd2111097676d3e634ddaeee42caeb086c177b242\": container with ID starting with 82cc775c2fda5c90112e842cd2111097676d3e634ddaeee42caeb086c177b242 not found: ID does not exist" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.028347 4773 scope.go:117] "RemoveContainer" containerID="b5e5e815b1e0cba44d73aa09525aba15062e4015bd8fe0d34407ef04614bbeb0" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 
18:50:46.028637 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5e5e815b1e0cba44d73aa09525aba15062e4015bd8fe0d34407ef04614bbeb0"} err="failed to get container status \"b5e5e815b1e0cba44d73aa09525aba15062e4015bd8fe0d34407ef04614bbeb0\": rpc error: code = NotFound desc = could not find container \"b5e5e815b1e0cba44d73aa09525aba15062e4015bd8fe0d34407ef04614bbeb0\": container with ID starting with b5e5e815b1e0cba44d73aa09525aba15062e4015bd8fe0d34407ef04614bbeb0 not found: ID does not exist" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.228800 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.245534 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.271650 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 20 18:50:46 crc kubenswrapper[4773]: E0120 18:50:46.272523 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72607859-440f-410c-baaf-0bef4e81dc3c" containerName="ceilometer-central-agent" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.273041 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="72607859-440f-410c-baaf-0bef4e81dc3c" containerName="ceilometer-central-agent" Jan 20 18:50:46 crc kubenswrapper[4773]: E0120 18:50:46.274033 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72607859-440f-410c-baaf-0bef4e81dc3c" containerName="proxy-httpd" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.274044 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="72607859-440f-410c-baaf-0bef4e81dc3c" containerName="proxy-httpd" Jan 20 18:50:46 crc kubenswrapper[4773]: E0120 18:50:46.274063 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72607859-440f-410c-baaf-0bef4e81dc3c" 
containerName="ceilometer-notification-agent" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.274138 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="72607859-440f-410c-baaf-0bef4e81dc3c" containerName="ceilometer-notification-agent" Jan 20 18:50:46 crc kubenswrapper[4773]: E0120 18:50:46.274176 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72607859-440f-410c-baaf-0bef4e81dc3c" containerName="sg-core" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.274182 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="72607859-440f-410c-baaf-0bef4e81dc3c" containerName="sg-core" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.274575 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="72607859-440f-410c-baaf-0bef4e81dc3c" containerName="sg-core" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.274594 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="72607859-440f-410c-baaf-0bef4e81dc3c" containerName="proxy-httpd" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.274609 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="72607859-440f-410c-baaf-0bef4e81dc3c" containerName="ceilometer-notification-agent" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.274618 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="72607859-440f-410c-baaf-0bef4e81dc3c" containerName="ceilometer-central-agent" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.284172 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.287599 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.288025 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.295168 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.391280 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/efb19124-8662-46a5-8fb4-7fbaeba8885a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"efb19124-8662-46a5-8fb4-7fbaeba8885a\") " pod="openstack/ceilometer-0" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.391767 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/efb19124-8662-46a5-8fb4-7fbaeba8885a-scripts\") pod \"ceilometer-0\" (UID: \"efb19124-8662-46a5-8fb4-7fbaeba8885a\") " pod="openstack/ceilometer-0" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.391837 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efb19124-8662-46a5-8fb4-7fbaeba8885a-config-data\") pod \"ceilometer-0\" (UID: \"efb19124-8662-46a5-8fb4-7fbaeba8885a\") " pod="openstack/ceilometer-0" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.391872 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efb19124-8662-46a5-8fb4-7fbaeba8885a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"efb19124-8662-46a5-8fb4-7fbaeba8885a\") " 
pod="openstack/ceilometer-0" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.391993 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/efb19124-8662-46a5-8fb4-7fbaeba8885a-log-httpd\") pod \"ceilometer-0\" (UID: \"efb19124-8662-46a5-8fb4-7fbaeba8885a\") " pod="openstack/ceilometer-0" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.392078 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rj2gd\" (UniqueName: \"kubernetes.io/projected/efb19124-8662-46a5-8fb4-7fbaeba8885a-kube-api-access-rj2gd\") pod \"ceilometer-0\" (UID: \"efb19124-8662-46a5-8fb4-7fbaeba8885a\") " pod="openstack/ceilometer-0" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.392131 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/efb19124-8662-46a5-8fb4-7fbaeba8885a-run-httpd\") pod \"ceilometer-0\" (UID: \"efb19124-8662-46a5-8fb4-7fbaeba8885a\") " pod="openstack/ceilometer-0" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.493720 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/efb19124-8662-46a5-8fb4-7fbaeba8885a-scripts\") pod \"ceilometer-0\" (UID: \"efb19124-8662-46a5-8fb4-7fbaeba8885a\") " pod="openstack/ceilometer-0" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.493779 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efb19124-8662-46a5-8fb4-7fbaeba8885a-config-data\") pod \"ceilometer-0\" (UID: \"efb19124-8662-46a5-8fb4-7fbaeba8885a\") " pod="openstack/ceilometer-0" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.493808 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efb19124-8662-46a5-8fb4-7fbaeba8885a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"efb19124-8662-46a5-8fb4-7fbaeba8885a\") " pod="openstack/ceilometer-0" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.493868 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/efb19124-8662-46a5-8fb4-7fbaeba8885a-log-httpd\") pod \"ceilometer-0\" (UID: \"efb19124-8662-46a5-8fb4-7fbaeba8885a\") " pod="openstack/ceilometer-0" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.493956 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rj2gd\" (UniqueName: \"kubernetes.io/projected/efb19124-8662-46a5-8fb4-7fbaeba8885a-kube-api-access-rj2gd\") pod \"ceilometer-0\" (UID: \"efb19124-8662-46a5-8fb4-7fbaeba8885a\") " pod="openstack/ceilometer-0" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.494014 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/efb19124-8662-46a5-8fb4-7fbaeba8885a-run-httpd\") pod \"ceilometer-0\" (UID: \"efb19124-8662-46a5-8fb4-7fbaeba8885a\") " pod="openstack/ceilometer-0" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.494060 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/efb19124-8662-46a5-8fb4-7fbaeba8885a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"efb19124-8662-46a5-8fb4-7fbaeba8885a\") " pod="openstack/ceilometer-0" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.494630 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/efb19124-8662-46a5-8fb4-7fbaeba8885a-run-httpd\") pod \"ceilometer-0\" (UID: \"efb19124-8662-46a5-8fb4-7fbaeba8885a\") " pod="openstack/ceilometer-0" Jan 20 18:50:46 
crc kubenswrapper[4773]: I0120 18:50:46.496065 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/efb19124-8662-46a5-8fb4-7fbaeba8885a-log-httpd\") pod \"ceilometer-0\" (UID: \"efb19124-8662-46a5-8fb4-7fbaeba8885a\") " pod="openstack/ceilometer-0" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.498562 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/efb19124-8662-46a5-8fb4-7fbaeba8885a-scripts\") pod \"ceilometer-0\" (UID: \"efb19124-8662-46a5-8fb4-7fbaeba8885a\") " pod="openstack/ceilometer-0" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.498618 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/efb19124-8662-46a5-8fb4-7fbaeba8885a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"efb19124-8662-46a5-8fb4-7fbaeba8885a\") " pod="openstack/ceilometer-0" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.498866 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efb19124-8662-46a5-8fb4-7fbaeba8885a-config-data\") pod \"ceilometer-0\" (UID: \"efb19124-8662-46a5-8fb4-7fbaeba8885a\") " pod="openstack/ceilometer-0" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.509816 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efb19124-8662-46a5-8fb4-7fbaeba8885a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"efb19124-8662-46a5-8fb4-7fbaeba8885a\") " pod="openstack/ceilometer-0" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.515747 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rj2gd\" (UniqueName: \"kubernetes.io/projected/efb19124-8662-46a5-8fb4-7fbaeba8885a-kube-api-access-rj2gd\") pod \"ceilometer-0\" (UID: 
\"efb19124-8662-46a5-8fb4-7fbaeba8885a\") " pod="openstack/ceilometer-0" Jan 20 18:50:46 crc kubenswrapper[4773]: I0120 18:50:46.605654 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 20 18:50:47 crc kubenswrapper[4773]: I0120 18:50:47.046386 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 20 18:50:47 crc kubenswrapper[4773]: I0120 18:50:47.464794 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72607859-440f-410c-baaf-0bef4e81dc3c" path="/var/lib/kubelet/pods/72607859-440f-410c-baaf-0bef4e81dc3c/volumes" Jan 20 18:50:47 crc kubenswrapper[4773]: I0120 18:50:47.896384 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"efb19124-8662-46a5-8fb4-7fbaeba8885a","Type":"ContainerStarted","Data":"9d95ab46285f54702bbd86cfcc2a82ccf6dffd16c89bceb57cb3b449d92ca086"} Jan 20 18:50:47 crc kubenswrapper[4773]: I0120 18:50:47.896983 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"efb19124-8662-46a5-8fb4-7fbaeba8885a","Type":"ContainerStarted","Data":"54c5a63008f7d55fe728b88afe044f2f142c0ed42ff2bb1fe5fb615542c76281"} Jan 20 18:50:48 crc kubenswrapper[4773]: I0120 18:50:48.905433 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"efb19124-8662-46a5-8fb4-7fbaeba8885a","Type":"ContainerStarted","Data":"331cc9c4bf5916907acc17301a73252c053b2c503989e99a3181ee9b8431d643"} Jan 20 18:50:49 crc kubenswrapper[4773]: I0120 18:50:49.915627 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"efb19124-8662-46a5-8fb4-7fbaeba8885a","Type":"ContainerStarted","Data":"1bf3529b66753bd46397574712a543ebfe5f08719dd9776591973ff48295495c"} Jan 20 18:50:49 crc kubenswrapper[4773]: I0120 18:50:49.918222 4773 generic.go:334] "Generic (PLEG): container finished" 
podID="414429bc-e43b-42f9-8f49-8bc7c4a0ecf4" containerID="ee4d9edf9c01606465809a2b2e95bca37087aff51d7e53e121d12c29841bd18d" exitCode=0 Jan 20 18:50:49 crc kubenswrapper[4773]: I0120 18:50:49.918363 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-gvh4b" event={"ID":"414429bc-e43b-42f9-8f49-8bc7c4a0ecf4","Type":"ContainerDied","Data":"ee4d9edf9c01606465809a2b2e95bca37087aff51d7e53e121d12c29841bd18d"} Jan 20 18:50:50 crc kubenswrapper[4773]: I0120 18:50:50.929575 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"efb19124-8662-46a5-8fb4-7fbaeba8885a","Type":"ContainerStarted","Data":"4f0d82a5c4812122ac869b3fce3bb56530dce291121a86af809addd34c660b7b"} Jan 20 18:50:50 crc kubenswrapper[4773]: I0120 18:50:50.969354 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.494172565 podStartE2EDuration="4.969333212s" podCreationTimestamp="2026-01-20 18:50:46 +0000 UTC" firstStartedPulling="2026-01-20 18:50:47.063215114 +0000 UTC m=+1239.985028138" lastFinishedPulling="2026-01-20 18:50:50.538375761 +0000 UTC m=+1243.460188785" observedRunningTime="2026-01-20 18:50:50.954881796 +0000 UTC m=+1243.876694900" watchObservedRunningTime="2026-01-20 18:50:50.969333212 +0000 UTC m=+1243.891146226" Jan 20 18:50:51 crc kubenswrapper[4773]: I0120 18:50:51.304783 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-gvh4b" Jan 20 18:50:51 crc kubenswrapper[4773]: I0120 18:50:51.475435 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/414429bc-e43b-42f9-8f49-8bc7c4a0ecf4-scripts\") pod \"414429bc-e43b-42f9-8f49-8bc7c4a0ecf4\" (UID: \"414429bc-e43b-42f9-8f49-8bc7c4a0ecf4\") " Jan 20 18:50:51 crc kubenswrapper[4773]: I0120 18:50:51.475538 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/414429bc-e43b-42f9-8f49-8bc7c4a0ecf4-config-data\") pod \"414429bc-e43b-42f9-8f49-8bc7c4a0ecf4\" (UID: \"414429bc-e43b-42f9-8f49-8bc7c4a0ecf4\") " Jan 20 18:50:51 crc kubenswrapper[4773]: I0120 18:50:51.475665 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/414429bc-e43b-42f9-8f49-8bc7c4a0ecf4-combined-ca-bundle\") pod \"414429bc-e43b-42f9-8f49-8bc7c4a0ecf4\" (UID: \"414429bc-e43b-42f9-8f49-8bc7c4a0ecf4\") " Jan 20 18:50:51 crc kubenswrapper[4773]: I0120 18:50:51.475701 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mqrtd\" (UniqueName: \"kubernetes.io/projected/414429bc-e43b-42f9-8f49-8bc7c4a0ecf4-kube-api-access-mqrtd\") pod \"414429bc-e43b-42f9-8f49-8bc7c4a0ecf4\" (UID: \"414429bc-e43b-42f9-8f49-8bc7c4a0ecf4\") " Jan 20 18:50:51 crc kubenswrapper[4773]: I0120 18:50:51.481303 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/414429bc-e43b-42f9-8f49-8bc7c4a0ecf4-scripts" (OuterVolumeSpecName: "scripts") pod "414429bc-e43b-42f9-8f49-8bc7c4a0ecf4" (UID: "414429bc-e43b-42f9-8f49-8bc7c4a0ecf4"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:50:51 crc kubenswrapper[4773]: I0120 18:50:51.481830 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/414429bc-e43b-42f9-8f49-8bc7c4a0ecf4-kube-api-access-mqrtd" (OuterVolumeSpecName: "kube-api-access-mqrtd") pod "414429bc-e43b-42f9-8f49-8bc7c4a0ecf4" (UID: "414429bc-e43b-42f9-8f49-8bc7c4a0ecf4"). InnerVolumeSpecName "kube-api-access-mqrtd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:50:51 crc kubenswrapper[4773]: I0120 18:50:51.502723 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/414429bc-e43b-42f9-8f49-8bc7c4a0ecf4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "414429bc-e43b-42f9-8f49-8bc7c4a0ecf4" (UID: "414429bc-e43b-42f9-8f49-8bc7c4a0ecf4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:50:51 crc kubenswrapper[4773]: I0120 18:50:51.503725 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/414429bc-e43b-42f9-8f49-8bc7c4a0ecf4-config-data" (OuterVolumeSpecName: "config-data") pod "414429bc-e43b-42f9-8f49-8bc7c4a0ecf4" (UID: "414429bc-e43b-42f9-8f49-8bc7c4a0ecf4"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:50:51 crc kubenswrapper[4773]: I0120 18:50:51.578175 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/414429bc-e43b-42f9-8f49-8bc7c4a0ecf4-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 18:50:51 crc kubenswrapper[4773]: I0120 18:50:51.578213 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/414429bc-e43b-42f9-8f49-8bc7c4a0ecf4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 18:50:51 crc kubenswrapper[4773]: I0120 18:50:51.578230 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mqrtd\" (UniqueName: \"kubernetes.io/projected/414429bc-e43b-42f9-8f49-8bc7c4a0ecf4-kube-api-access-mqrtd\") on node \"crc\" DevicePath \"\"" Jan 20 18:50:51 crc kubenswrapper[4773]: I0120 18:50:51.578240 4773 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/414429bc-e43b-42f9-8f49-8bc7c4a0ecf4-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 18:50:51 crc kubenswrapper[4773]: I0120 18:50:51.944182 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-gvh4b" Jan 20 18:50:51 crc kubenswrapper[4773]: I0120 18:50:51.944198 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-gvh4b" event={"ID":"414429bc-e43b-42f9-8f49-8bc7c4a0ecf4","Type":"ContainerDied","Data":"c5dfab356b97be1469df8fe93ed3098db2831ed8b05cbbfa693e39760e22278b"} Jan 20 18:50:51 crc kubenswrapper[4773]: I0120 18:50:51.944570 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c5dfab356b97be1469df8fe93ed3098db2831ed8b05cbbfa693e39760e22278b" Jan 20 18:50:51 crc kubenswrapper[4773]: I0120 18:50:51.944594 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 20 18:50:52 crc kubenswrapper[4773]: I0120 18:50:52.039175 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 20 18:50:52 crc kubenswrapper[4773]: E0120 18:50:52.039541 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="414429bc-e43b-42f9-8f49-8bc7c4a0ecf4" containerName="nova-cell0-conductor-db-sync" Jan 20 18:50:52 crc kubenswrapper[4773]: I0120 18:50:52.039559 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="414429bc-e43b-42f9-8f49-8bc7c4a0ecf4" containerName="nova-cell0-conductor-db-sync" Jan 20 18:50:52 crc kubenswrapper[4773]: I0120 18:50:52.039742 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="414429bc-e43b-42f9-8f49-8bc7c4a0ecf4" containerName="nova-cell0-conductor-db-sync" Jan 20 18:50:52 crc kubenswrapper[4773]: I0120 18:50:52.040349 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Jan 20 18:50:52 crc kubenswrapper[4773]: I0120 18:50:52.042835 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-bxxbt" Jan 20 18:50:52 crc kubenswrapper[4773]: I0120 18:50:52.043049 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Jan 20 18:50:52 crc kubenswrapper[4773]: I0120 18:50:52.049634 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 20 18:50:52 crc kubenswrapper[4773]: I0120 18:50:52.189614 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b123d99d-6cf6-4516-a5ae-7dcdf8262269-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"b123d99d-6cf6-4516-a5ae-7dcdf8262269\") " pod="openstack/nova-cell0-conductor-0" Jan 20 18:50:52 crc kubenswrapper[4773]: I0120 18:50:52.189658 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66gb9\" (UniqueName: \"kubernetes.io/projected/b123d99d-6cf6-4516-a5ae-7dcdf8262269-kube-api-access-66gb9\") pod \"nova-cell0-conductor-0\" (UID: \"b123d99d-6cf6-4516-a5ae-7dcdf8262269\") " pod="openstack/nova-cell0-conductor-0" Jan 20 18:50:52 crc kubenswrapper[4773]: I0120 18:50:52.189706 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b123d99d-6cf6-4516-a5ae-7dcdf8262269-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"b123d99d-6cf6-4516-a5ae-7dcdf8262269\") " pod="openstack/nova-cell0-conductor-0" Jan 20 18:50:52 crc kubenswrapper[4773]: I0120 18:50:52.290877 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/b123d99d-6cf6-4516-a5ae-7dcdf8262269-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"b123d99d-6cf6-4516-a5ae-7dcdf8262269\") " pod="openstack/nova-cell0-conductor-0" Jan 20 18:50:52 crc kubenswrapper[4773]: I0120 18:50:52.290963 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66gb9\" (UniqueName: \"kubernetes.io/projected/b123d99d-6cf6-4516-a5ae-7dcdf8262269-kube-api-access-66gb9\") pod \"nova-cell0-conductor-0\" (UID: \"b123d99d-6cf6-4516-a5ae-7dcdf8262269\") " pod="openstack/nova-cell0-conductor-0" Jan 20 18:50:52 crc kubenswrapper[4773]: I0120 18:50:52.292117 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b123d99d-6cf6-4516-a5ae-7dcdf8262269-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"b123d99d-6cf6-4516-a5ae-7dcdf8262269\") " pod="openstack/nova-cell0-conductor-0" Jan 20 18:50:52 crc kubenswrapper[4773]: I0120 18:50:52.298206 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b123d99d-6cf6-4516-a5ae-7dcdf8262269-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"b123d99d-6cf6-4516-a5ae-7dcdf8262269\") " pod="openstack/nova-cell0-conductor-0" Jan 20 18:50:52 crc kubenswrapper[4773]: I0120 18:50:52.303539 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b123d99d-6cf6-4516-a5ae-7dcdf8262269-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"b123d99d-6cf6-4516-a5ae-7dcdf8262269\") " pod="openstack/nova-cell0-conductor-0" Jan 20 18:50:52 crc kubenswrapper[4773]: I0120 18:50:52.310035 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66gb9\" (UniqueName: \"kubernetes.io/projected/b123d99d-6cf6-4516-a5ae-7dcdf8262269-kube-api-access-66gb9\") pod \"nova-cell0-conductor-0\" 
(UID: \"b123d99d-6cf6-4516-a5ae-7dcdf8262269\") " pod="openstack/nova-cell0-conductor-0" Jan 20 18:50:52 crc kubenswrapper[4773]: I0120 18:50:52.357500 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Jan 20 18:50:52 crc kubenswrapper[4773]: I0120 18:50:52.779595 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Jan 20 18:50:52 crc kubenswrapper[4773]: I0120 18:50:52.949528 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"b123d99d-6cf6-4516-a5ae-7dcdf8262269","Type":"ContainerStarted","Data":"445c8401390dc1adbdebe6fc61f046d0afd7a7608b18bab7d7255ed2043b193c"} Jan 20 18:50:53 crc kubenswrapper[4773]: I0120 18:50:53.970020 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"b123d99d-6cf6-4516-a5ae-7dcdf8262269","Type":"ContainerStarted","Data":"6e77b1c0d9b3b5447f7c98b4543a70f7a070b707e881dc87a843481b5e1cc0a4"} Jan 20 18:50:53 crc kubenswrapper[4773]: I0120 18:50:53.970384 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Jan 20 18:50:53 crc kubenswrapper[4773]: I0120 18:50:53.989921 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=1.9899028520000002 podStartE2EDuration="1.989902852s" podCreationTimestamp="2026-01-20 18:50:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:50:53.986108031 +0000 UTC m=+1246.907921075" watchObservedRunningTime="2026-01-20 18:50:53.989902852 +0000 UTC m=+1246.911715896" Jan 20 18:50:57 crc kubenswrapper[4773]: I0120 18:50:57.382064 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Jan 20 18:50:57 crc 
kubenswrapper[4773]: I0120 18:50:57.863578 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-p6rjg"] Jan 20 18:50:57 crc kubenswrapper[4773]: I0120 18:50:57.864850 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-p6rjg" Jan 20 18:50:57 crc kubenswrapper[4773]: I0120 18:50:57.867703 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Jan 20 18:50:57 crc kubenswrapper[4773]: I0120 18:50:57.874861 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-p6rjg"] Jan 20 18:50:57 crc kubenswrapper[4773]: I0120 18:50:57.875647 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Jan 20 18:50:57 crc kubenswrapper[4773]: I0120 18:50:57.998153 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxx8p\" (UniqueName: \"kubernetes.io/projected/61de3b4b-bcb7-4521-92e6-af87d03407ee-kube-api-access-zxx8p\") pod \"nova-cell0-cell-mapping-p6rjg\" (UID: \"61de3b4b-bcb7-4521-92e6-af87d03407ee\") " pod="openstack/nova-cell0-cell-mapping-p6rjg" Jan 20 18:50:57 crc kubenswrapper[4773]: I0120 18:50:57.998232 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61de3b4b-bcb7-4521-92e6-af87d03407ee-config-data\") pod \"nova-cell0-cell-mapping-p6rjg\" (UID: \"61de3b4b-bcb7-4521-92e6-af87d03407ee\") " pod="openstack/nova-cell0-cell-mapping-p6rjg" Jan 20 18:50:57 crc kubenswrapper[4773]: I0120 18:50:57.998249 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61de3b4b-bcb7-4521-92e6-af87d03407ee-scripts\") pod \"nova-cell0-cell-mapping-p6rjg\" (UID: 
\"61de3b4b-bcb7-4521-92e6-af87d03407ee\") " pod="openstack/nova-cell0-cell-mapping-p6rjg" Jan 20 18:50:57 crc kubenswrapper[4773]: I0120 18:50:57.998326 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61de3b4b-bcb7-4521-92e6-af87d03407ee-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-p6rjg\" (UID: \"61de3b4b-bcb7-4521-92e6-af87d03407ee\") " pod="openstack/nova-cell0-cell-mapping-p6rjg" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.038957 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.040025 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.044468 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.051027 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.097842 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.099537 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.100251 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zxx8p\" (UniqueName: \"kubernetes.io/projected/61de3b4b-bcb7-4521-92e6-af87d03407ee-kube-api-access-zxx8p\") pod \"nova-cell0-cell-mapping-p6rjg\" (UID: \"61de3b4b-bcb7-4521-92e6-af87d03407ee\") " pod="openstack/nova-cell0-cell-mapping-p6rjg" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.100335 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61de3b4b-bcb7-4521-92e6-af87d03407ee-config-data\") pod \"nova-cell0-cell-mapping-p6rjg\" (UID: \"61de3b4b-bcb7-4521-92e6-af87d03407ee\") " pod="openstack/nova-cell0-cell-mapping-p6rjg" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.100364 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61de3b4b-bcb7-4521-92e6-af87d03407ee-scripts\") pod \"nova-cell0-cell-mapping-p6rjg\" (UID: \"61de3b4b-bcb7-4521-92e6-af87d03407ee\") " pod="openstack/nova-cell0-cell-mapping-p6rjg" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.100486 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61de3b4b-bcb7-4521-92e6-af87d03407ee-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-p6rjg\" (UID: \"61de3b4b-bcb7-4521-92e6-af87d03407ee\") " pod="openstack/nova-cell0-cell-mapping-p6rjg" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.108173 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.108585 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/61de3b4b-bcb7-4521-92e6-af87d03407ee-scripts\") pod \"nova-cell0-cell-mapping-p6rjg\" (UID: \"61de3b4b-bcb7-4521-92e6-af87d03407ee\") " pod="openstack/nova-cell0-cell-mapping-p6rjg" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.112414 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.113671 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61de3b4b-bcb7-4521-92e6-af87d03407ee-config-data\") pod \"nova-cell0-cell-mapping-p6rjg\" (UID: \"61de3b4b-bcb7-4521-92e6-af87d03407ee\") " pod="openstack/nova-cell0-cell-mapping-p6rjg" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.114310 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61de3b4b-bcb7-4521-92e6-af87d03407ee-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-p6rjg\" (UID: \"61de3b4b-bcb7-4521-92e6-af87d03407ee\") " pod="openstack/nova-cell0-cell-mapping-p6rjg" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.136561 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxx8p\" (UniqueName: \"kubernetes.io/projected/61de3b4b-bcb7-4521-92e6-af87d03407ee-kube-api-access-zxx8p\") pod \"nova-cell0-cell-mapping-p6rjg\" (UID: \"61de3b4b-bcb7-4521-92e6-af87d03407ee\") " pod="openstack/nova-cell0-cell-mapping-p6rjg" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.204661 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-p6rjg" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.206109 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkqxk\" (UniqueName: \"kubernetes.io/projected/b8ddb59d-f815-43b0-8d46-31575ad7703f-kube-api-access-xkqxk\") pod \"nova-api-0\" (UID: \"b8ddb59d-f815-43b0-8d46-31575ad7703f\") " pod="openstack/nova-api-0" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.206178 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc9522a8-e87a-485b-85e6-9548b4f7c835-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"cc9522a8-e87a-485b-85e6-9548b4f7c835\") " pod="openstack/nova-cell1-novncproxy-0" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.206217 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b8ddb59d-f815-43b0-8d46-31575ad7703f-logs\") pod \"nova-api-0\" (UID: \"b8ddb59d-f815-43b0-8d46-31575ad7703f\") " pod="openstack/nova-api-0" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.206270 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-npbs9\" (UniqueName: \"kubernetes.io/projected/cc9522a8-e87a-485b-85e6-9548b4f7c835-kube-api-access-npbs9\") pod \"nova-cell1-novncproxy-0\" (UID: \"cc9522a8-e87a-485b-85e6-9548b4f7c835\") " pod="openstack/nova-cell1-novncproxy-0" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.206367 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc9522a8-e87a-485b-85e6-9548b4f7c835-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"cc9522a8-e87a-485b-85e6-9548b4f7c835\") " 
pod="openstack/nova-cell1-novncproxy-0" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.206419 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8ddb59d-f815-43b0-8d46-31575ad7703f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b8ddb59d-f815-43b0-8d46-31575ad7703f\") " pod="openstack/nova-api-0" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.206493 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8ddb59d-f815-43b0-8d46-31575ad7703f-config-data\") pod \"nova-api-0\" (UID: \"b8ddb59d-f815-43b0-8d46-31575ad7703f\") " pod="openstack/nova-api-0" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.247494 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.252062 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.254404 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.289306 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.308727 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkqxk\" (UniqueName: \"kubernetes.io/projected/b8ddb59d-f815-43b0-8d46-31575ad7703f-kube-api-access-xkqxk\") pod \"nova-api-0\" (UID: \"b8ddb59d-f815-43b0-8d46-31575ad7703f\") " pod="openstack/nova-api-0" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.308792 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc9522a8-e87a-485b-85e6-9548b4f7c835-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"cc9522a8-e87a-485b-85e6-9548b4f7c835\") " pod="openstack/nova-cell1-novncproxy-0" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.308817 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b8ddb59d-f815-43b0-8d46-31575ad7703f-logs\") pod \"nova-api-0\" (UID: \"b8ddb59d-f815-43b0-8d46-31575ad7703f\") " pod="openstack/nova-api-0" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.308860 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-npbs9\" (UniqueName: \"kubernetes.io/projected/cc9522a8-e87a-485b-85e6-9548b4f7c835-kube-api-access-npbs9\") pod \"nova-cell1-novncproxy-0\" (UID: \"cc9522a8-e87a-485b-85e6-9548b4f7c835\") " pod="openstack/nova-cell1-novncproxy-0" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.308892 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc9522a8-e87a-485b-85e6-9548b4f7c835-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"cc9522a8-e87a-485b-85e6-9548b4f7c835\") " pod="openstack/nova-cell1-novncproxy-0" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.308922 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8ddb59d-f815-43b0-8d46-31575ad7703f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b8ddb59d-f815-43b0-8d46-31575ad7703f\") " pod="openstack/nova-api-0" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.313911 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b8ddb59d-f815-43b0-8d46-31575ad7703f-logs\") pod \"nova-api-0\" (UID: \"b8ddb59d-f815-43b0-8d46-31575ad7703f\") " pod="openstack/nova-api-0" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.316421 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8ddb59d-f815-43b0-8d46-31575ad7703f-config-data\") pod \"nova-api-0\" (UID: \"b8ddb59d-f815-43b0-8d46-31575ad7703f\") " pod="openstack/nova-api-0" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.322317 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8ddb59d-f815-43b0-8d46-31575ad7703f-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"b8ddb59d-f815-43b0-8d46-31575ad7703f\") " pod="openstack/nova-api-0" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.323542 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc9522a8-e87a-485b-85e6-9548b4f7c835-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"cc9522a8-e87a-485b-85e6-9548b4f7c835\") " pod="openstack/nova-cell1-novncproxy-0" Jan 20 
18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.326804 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc9522a8-e87a-485b-85e6-9548b4f7c835-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"cc9522a8-e87a-485b-85e6-9548b4f7c835\") " pod="openstack/nova-cell1-novncproxy-0" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.327984 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.339618 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8ddb59d-f815-43b0-8d46-31575ad7703f-config-data\") pod \"nova-api-0\" (UID: \"b8ddb59d-f815-43b0-8d46-31575ad7703f\") " pod="openstack/nova-api-0" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.344704 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.348628 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-npbs9\" (UniqueName: \"kubernetes.io/projected/cc9522a8-e87a-485b-85e6-9548b4f7c835-kube-api-access-npbs9\") pod \"nova-cell1-novncproxy-0\" (UID: \"cc9522a8-e87a-485b-85e6-9548b4f7c835\") " pod="openstack/nova-cell1-novncproxy-0" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.349403 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkqxk\" (UniqueName: \"kubernetes.io/projected/b8ddb59d-f815-43b0-8d46-31575ad7703f-kube-api-access-xkqxk\") pod \"nova-api-0\" (UID: \"b8ddb59d-f815-43b0-8d46-31575ad7703f\") " pod="openstack/nova-api-0" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.350796 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.358190 4773 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.361257 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.418383 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/770606ab-65d2-4537-a335-6953af47241a-config-data\") pod \"nova-scheduler-0\" (UID: \"770606ab-65d2-4537-a335-6953af47241a\") " pod="openstack/nova-scheduler-0" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.418433 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lk4pz\" (UniqueName: \"kubernetes.io/projected/770606ab-65d2-4537-a335-6953af47241a-kube-api-access-lk4pz\") pod \"nova-scheduler-0\" (UID: \"770606ab-65d2-4537-a335-6953af47241a\") " pod="openstack/nova-scheduler-0" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.418489 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/770606ab-65d2-4537-a335-6953af47241a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"770606ab-65d2-4537-a335-6953af47241a\") " pod="openstack/nova-scheduler-0" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.422173 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-566b5b7845-62h5v"] Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.423573 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-566b5b7845-62h5v" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.460974 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-566b5b7845-62h5v"] Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.524103 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6709a6f8-82d5-4b67-b4db-a35f9e88a664-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6709a6f8-82d5-4b67-b4db-a35f9e88a664\") " pod="openstack/nova-metadata-0" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.524142 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b01dd3cf-55b3-4e05-a75d-be2ae325b5fc-ovsdbserver-nb\") pod \"dnsmasq-dns-566b5b7845-62h5v\" (UID: \"b01dd3cf-55b3-4e05-a75d-be2ae325b5fc\") " pod="openstack/dnsmasq-dns-566b5b7845-62h5v" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.524171 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6709a6f8-82d5-4b67-b4db-a35f9e88a664-config-data\") pod \"nova-metadata-0\" (UID: \"6709a6f8-82d5-4b67-b4db-a35f9e88a664\") " pod="openstack/nova-metadata-0" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.524203 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6709a6f8-82d5-4b67-b4db-a35f9e88a664-logs\") pod \"nova-metadata-0\" (UID: \"6709a6f8-82d5-4b67-b4db-a35f9e88a664\") " pod="openstack/nova-metadata-0" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.524237 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/b01dd3cf-55b3-4e05-a75d-be2ae325b5fc-ovsdbserver-sb\") pod \"dnsmasq-dns-566b5b7845-62h5v\" (UID: \"b01dd3cf-55b3-4e05-a75d-be2ae325b5fc\") " pod="openstack/dnsmasq-dns-566b5b7845-62h5v" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.524260 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b01dd3cf-55b3-4e05-a75d-be2ae325b5fc-dns-svc\") pod \"dnsmasq-dns-566b5b7845-62h5v\" (UID: \"b01dd3cf-55b3-4e05-a75d-be2ae325b5fc\") " pod="openstack/dnsmasq-dns-566b5b7845-62h5v" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.525946 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.524554 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zmxnz\" (UniqueName: \"kubernetes.io/projected/b01dd3cf-55b3-4e05-a75d-be2ae325b5fc-kube-api-access-zmxnz\") pod \"dnsmasq-dns-566b5b7845-62h5v\" (UID: \"b01dd3cf-55b3-4e05-a75d-be2ae325b5fc\") " pod="openstack/dnsmasq-dns-566b5b7845-62h5v" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.529967 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/770606ab-65d2-4537-a335-6953af47241a-config-data\") pod \"nova-scheduler-0\" (UID: \"770606ab-65d2-4537-a335-6953af47241a\") " pod="openstack/nova-scheduler-0" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.530046 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lk4pz\" (UniqueName: \"kubernetes.io/projected/770606ab-65d2-4537-a335-6953af47241a-kube-api-access-lk4pz\") pod \"nova-scheduler-0\" (UID: \"770606ab-65d2-4537-a335-6953af47241a\") " pod="openstack/nova-scheduler-0" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.530319 4773 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7srwz\" (UniqueName: \"kubernetes.io/projected/6709a6f8-82d5-4b67-b4db-a35f9e88a664-kube-api-access-7srwz\") pod \"nova-metadata-0\" (UID: \"6709a6f8-82d5-4b67-b4db-a35f9e88a664\") " pod="openstack/nova-metadata-0" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.530385 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/770606ab-65d2-4537-a335-6953af47241a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"770606ab-65d2-4537-a335-6953af47241a\") " pod="openstack/nova-scheduler-0" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.530448 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b01dd3cf-55b3-4e05-a75d-be2ae325b5fc-config\") pod \"dnsmasq-dns-566b5b7845-62h5v\" (UID: \"b01dd3cf-55b3-4e05-a75d-be2ae325b5fc\") " pod="openstack/dnsmasq-dns-566b5b7845-62h5v" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.538097 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/770606ab-65d2-4537-a335-6953af47241a-config-data\") pod \"nova-scheduler-0\" (UID: \"770606ab-65d2-4537-a335-6953af47241a\") " pod="openstack/nova-scheduler-0" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.538601 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/770606ab-65d2-4537-a335-6953af47241a-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"770606ab-65d2-4537-a335-6953af47241a\") " pod="openstack/nova-scheduler-0" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.561621 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lk4pz\" (UniqueName: 
\"kubernetes.io/projected/770606ab-65d2-4537-a335-6953af47241a-kube-api-access-lk4pz\") pod \"nova-scheduler-0\" (UID: \"770606ab-65d2-4537-a335-6953af47241a\") " pod="openstack/nova-scheduler-0" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.634623 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zmxnz\" (UniqueName: \"kubernetes.io/projected/b01dd3cf-55b3-4e05-a75d-be2ae325b5fc-kube-api-access-zmxnz\") pod \"dnsmasq-dns-566b5b7845-62h5v\" (UID: \"b01dd3cf-55b3-4e05-a75d-be2ae325b5fc\") " pod="openstack/dnsmasq-dns-566b5b7845-62h5v" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.634699 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7srwz\" (UniqueName: \"kubernetes.io/projected/6709a6f8-82d5-4b67-b4db-a35f9e88a664-kube-api-access-7srwz\") pod \"nova-metadata-0\" (UID: \"6709a6f8-82d5-4b67-b4db-a35f9e88a664\") " pod="openstack/nova-metadata-0" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.634726 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b01dd3cf-55b3-4e05-a75d-be2ae325b5fc-config\") pod \"dnsmasq-dns-566b5b7845-62h5v\" (UID: \"b01dd3cf-55b3-4e05-a75d-be2ae325b5fc\") " pod="openstack/dnsmasq-dns-566b5b7845-62h5v" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.634757 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6709a6f8-82d5-4b67-b4db-a35f9e88a664-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6709a6f8-82d5-4b67-b4db-a35f9e88a664\") " pod="openstack/nova-metadata-0" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.634777 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b01dd3cf-55b3-4e05-a75d-be2ae325b5fc-ovsdbserver-nb\") pod 
\"dnsmasq-dns-566b5b7845-62h5v\" (UID: \"b01dd3cf-55b3-4e05-a75d-be2ae325b5fc\") " pod="openstack/dnsmasq-dns-566b5b7845-62h5v" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.634795 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6709a6f8-82d5-4b67-b4db-a35f9e88a664-config-data\") pod \"nova-metadata-0\" (UID: \"6709a6f8-82d5-4b67-b4db-a35f9e88a664\") " pod="openstack/nova-metadata-0" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.634816 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6709a6f8-82d5-4b67-b4db-a35f9e88a664-logs\") pod \"nova-metadata-0\" (UID: \"6709a6f8-82d5-4b67-b4db-a35f9e88a664\") " pod="openstack/nova-metadata-0" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.634838 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b01dd3cf-55b3-4e05-a75d-be2ae325b5fc-ovsdbserver-sb\") pod \"dnsmasq-dns-566b5b7845-62h5v\" (UID: \"b01dd3cf-55b3-4e05-a75d-be2ae325b5fc\") " pod="openstack/dnsmasq-dns-566b5b7845-62h5v" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.634857 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b01dd3cf-55b3-4e05-a75d-be2ae325b5fc-dns-svc\") pod \"dnsmasq-dns-566b5b7845-62h5v\" (UID: \"b01dd3cf-55b3-4e05-a75d-be2ae325b5fc\") " pod="openstack/dnsmasq-dns-566b5b7845-62h5v" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.635948 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b01dd3cf-55b3-4e05-a75d-be2ae325b5fc-dns-svc\") pod \"dnsmasq-dns-566b5b7845-62h5v\" (UID: \"b01dd3cf-55b3-4e05-a75d-be2ae325b5fc\") " pod="openstack/dnsmasq-dns-566b5b7845-62h5v" Jan 20 18:50:58 crc kubenswrapper[4773]: 
I0120 18:50:58.636223 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6709a6f8-82d5-4b67-b4db-a35f9e88a664-logs\") pod \"nova-metadata-0\" (UID: \"6709a6f8-82d5-4b67-b4db-a35f9e88a664\") " pod="openstack/nova-metadata-0" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.636300 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b01dd3cf-55b3-4e05-a75d-be2ae325b5fc-config\") pod \"dnsmasq-dns-566b5b7845-62h5v\" (UID: \"b01dd3cf-55b3-4e05-a75d-be2ae325b5fc\") " pod="openstack/dnsmasq-dns-566b5b7845-62h5v" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.636594 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b01dd3cf-55b3-4e05-a75d-be2ae325b5fc-ovsdbserver-nb\") pod \"dnsmasq-dns-566b5b7845-62h5v\" (UID: \"b01dd3cf-55b3-4e05-a75d-be2ae325b5fc\") " pod="openstack/dnsmasq-dns-566b5b7845-62h5v" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.636760 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b01dd3cf-55b3-4e05-a75d-be2ae325b5fc-ovsdbserver-sb\") pod \"dnsmasq-dns-566b5b7845-62h5v\" (UID: \"b01dd3cf-55b3-4e05-a75d-be2ae325b5fc\") " pod="openstack/dnsmasq-dns-566b5b7845-62h5v" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.641838 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6709a6f8-82d5-4b67-b4db-a35f9e88a664-config-data\") pod \"nova-metadata-0\" (UID: \"6709a6f8-82d5-4b67-b4db-a35f9e88a664\") " pod="openstack/nova-metadata-0" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.646630 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6709a6f8-82d5-4b67-b4db-a35f9e88a664-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"6709a6f8-82d5-4b67-b4db-a35f9e88a664\") " pod="openstack/nova-metadata-0" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.684631 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7srwz\" (UniqueName: \"kubernetes.io/projected/6709a6f8-82d5-4b67-b4db-a35f9e88a664-kube-api-access-7srwz\") pod \"nova-metadata-0\" (UID: \"6709a6f8-82d5-4b67-b4db-a35f9e88a664\") " pod="openstack/nova-metadata-0" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.684729 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zmxnz\" (UniqueName: \"kubernetes.io/projected/b01dd3cf-55b3-4e05-a75d-be2ae325b5fc-kube-api-access-zmxnz\") pod \"dnsmasq-dns-566b5b7845-62h5v\" (UID: \"b01dd3cf-55b3-4e05-a75d-be2ae325b5fc\") " pod="openstack/dnsmasq-dns-566b5b7845-62h5v" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.689570 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.718419 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.764644 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-566b5b7845-62h5v" Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.841198 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-p6rjg"] Jan 20 18:50:58 crc kubenswrapper[4773]: I0120 18:50:58.964333 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 20 18:50:59 crc kubenswrapper[4773]: I0120 18:50:59.029413 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"cc9522a8-e87a-485b-85e6-9548b4f7c835","Type":"ContainerStarted","Data":"c91e5fa12c0b7ce3e47c60fb183a0b0468674854812e9596d10c1c95981af1d7"} Jan 20 18:50:59 crc kubenswrapper[4773]: I0120 18:50:59.031858 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-p6rjg" event={"ID":"61de3b4b-bcb7-4521-92e6-af87d03407ee","Type":"ContainerStarted","Data":"11bde0588e2d0753560751b7629f6dce99bb8a4665740e87d409f7d0f08a5c63"} Jan 20 18:50:59 crc kubenswrapper[4773]: I0120 18:50:59.053399 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-qbmt7"] Jan 20 18:50:59 crc kubenswrapper[4773]: I0120 18:50:59.054745 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-qbmt7" Jan 20 18:50:59 crc kubenswrapper[4773]: I0120 18:50:59.057278 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Jan 20 18:50:59 crc kubenswrapper[4773]: I0120 18:50:59.057286 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Jan 20 18:50:59 crc kubenswrapper[4773]: I0120 18:50:59.066272 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-qbmt7"] Jan 20 18:50:59 crc kubenswrapper[4773]: I0120 18:50:59.079250 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 20 18:50:59 crc kubenswrapper[4773]: I0120 18:50:59.215589 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 20 18:50:59 crc kubenswrapper[4773]: I0120 18:50:59.243391 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 20 18:50:59 crc kubenswrapper[4773]: I0120 18:50:59.249771 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f9293b5-8288-4a19-b3ac-03d8026dbf06-config-data\") pod \"nova-cell1-conductor-db-sync-qbmt7\" (UID: \"9f9293b5-8288-4a19-b3ac-03d8026dbf06\") " pod="openstack/nova-cell1-conductor-db-sync-qbmt7" Jan 20 18:50:59 crc kubenswrapper[4773]: I0120 18:50:59.250565 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f9293b5-8288-4a19-b3ac-03d8026dbf06-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-qbmt7\" (UID: \"9f9293b5-8288-4a19-b3ac-03d8026dbf06\") " pod="openstack/nova-cell1-conductor-db-sync-qbmt7" Jan 20 18:50:59 crc kubenswrapper[4773]: I0120 18:50:59.250806 4773 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f9293b5-8288-4a19-b3ac-03d8026dbf06-scripts\") pod \"nova-cell1-conductor-db-sync-qbmt7\" (UID: \"9f9293b5-8288-4a19-b3ac-03d8026dbf06\") " pod="openstack/nova-cell1-conductor-db-sync-qbmt7" Jan 20 18:50:59 crc kubenswrapper[4773]: I0120 18:50:59.250892 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gzbk2\" (UniqueName: \"kubernetes.io/projected/9f9293b5-8288-4a19-b3ac-03d8026dbf06-kube-api-access-gzbk2\") pod \"nova-cell1-conductor-db-sync-qbmt7\" (UID: \"9f9293b5-8288-4a19-b3ac-03d8026dbf06\") " pod="openstack/nova-cell1-conductor-db-sync-qbmt7" Jan 20 18:50:59 crc kubenswrapper[4773]: I0120 18:50:59.352014 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f9293b5-8288-4a19-b3ac-03d8026dbf06-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-qbmt7\" (UID: \"9f9293b5-8288-4a19-b3ac-03d8026dbf06\") " pod="openstack/nova-cell1-conductor-db-sync-qbmt7" Jan 20 18:50:59 crc kubenswrapper[4773]: I0120 18:50:59.352088 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f9293b5-8288-4a19-b3ac-03d8026dbf06-scripts\") pod \"nova-cell1-conductor-db-sync-qbmt7\" (UID: \"9f9293b5-8288-4a19-b3ac-03d8026dbf06\") " pod="openstack/nova-cell1-conductor-db-sync-qbmt7" Jan 20 18:50:59 crc kubenswrapper[4773]: I0120 18:50:59.352132 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gzbk2\" (UniqueName: \"kubernetes.io/projected/9f9293b5-8288-4a19-b3ac-03d8026dbf06-kube-api-access-gzbk2\") pod \"nova-cell1-conductor-db-sync-qbmt7\" (UID: \"9f9293b5-8288-4a19-b3ac-03d8026dbf06\") " pod="openstack/nova-cell1-conductor-db-sync-qbmt7" Jan 20 18:50:59 crc kubenswrapper[4773]: I0120 
18:50:59.352203 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f9293b5-8288-4a19-b3ac-03d8026dbf06-config-data\") pod \"nova-cell1-conductor-db-sync-qbmt7\" (UID: \"9f9293b5-8288-4a19-b3ac-03d8026dbf06\") " pod="openstack/nova-cell1-conductor-db-sync-qbmt7" Jan 20 18:50:59 crc kubenswrapper[4773]: I0120 18:50:59.361913 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f9293b5-8288-4a19-b3ac-03d8026dbf06-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-qbmt7\" (UID: \"9f9293b5-8288-4a19-b3ac-03d8026dbf06\") " pod="openstack/nova-cell1-conductor-db-sync-qbmt7" Jan 20 18:50:59 crc kubenswrapper[4773]: I0120 18:50:59.362284 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f9293b5-8288-4a19-b3ac-03d8026dbf06-config-data\") pod \"nova-cell1-conductor-db-sync-qbmt7\" (UID: \"9f9293b5-8288-4a19-b3ac-03d8026dbf06\") " pod="openstack/nova-cell1-conductor-db-sync-qbmt7" Jan 20 18:50:59 crc kubenswrapper[4773]: I0120 18:50:59.372760 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f9293b5-8288-4a19-b3ac-03d8026dbf06-scripts\") pod \"nova-cell1-conductor-db-sync-qbmt7\" (UID: \"9f9293b5-8288-4a19-b3ac-03d8026dbf06\") " pod="openstack/nova-cell1-conductor-db-sync-qbmt7" Jan 20 18:50:59 crc kubenswrapper[4773]: I0120 18:50:59.380208 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gzbk2\" (UniqueName: \"kubernetes.io/projected/9f9293b5-8288-4a19-b3ac-03d8026dbf06-kube-api-access-gzbk2\") pod \"nova-cell1-conductor-db-sync-qbmt7\" (UID: \"9f9293b5-8288-4a19-b3ac-03d8026dbf06\") " pod="openstack/nova-cell1-conductor-db-sync-qbmt7" Jan 20 18:50:59 crc kubenswrapper[4773]: I0120 18:50:59.509423 4773 kubelet.go:2428] 
"SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-566b5b7845-62h5v"] Jan 20 18:50:59 crc kubenswrapper[4773]: I0120 18:50:59.680610 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-qbmt7" Jan 20 18:51:00 crc kubenswrapper[4773]: I0120 18:51:00.040352 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6709a6f8-82d5-4b67-b4db-a35f9e88a664","Type":"ContainerStarted","Data":"135e3a1a083b2b65bf6bedbd7f61f04220039e408cb36b8c474ab0c91ec914a7"} Jan 20 18:51:00 crc kubenswrapper[4773]: I0120 18:51:00.041327 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-566b5b7845-62h5v" event={"ID":"b01dd3cf-55b3-4e05-a75d-be2ae325b5fc","Type":"ContainerStarted","Data":"a26dde9426d7bf6401c218a90f41c5f6ca8484f70a01b6ffde63301f200a5f7e"} Jan 20 18:51:00 crc kubenswrapper[4773]: I0120 18:51:00.043013 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-p6rjg" event={"ID":"61de3b4b-bcb7-4521-92e6-af87d03407ee","Type":"ContainerStarted","Data":"8a98a5857f7bd2ab4f59a3057108741eeb668e092184b2f6b6f9df7ea5067a15"} Jan 20 18:51:00 crc kubenswrapper[4773]: I0120 18:51:00.045220 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"770606ab-65d2-4537-a335-6953af47241a","Type":"ContainerStarted","Data":"18a3bac927d957a20e3b22301a468180e092097c9b7ed08358a71dea44a2f4fb"} Jan 20 18:51:00 crc kubenswrapper[4773]: I0120 18:51:00.046162 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b8ddb59d-f815-43b0-8d46-31575ad7703f","Type":"ContainerStarted","Data":"2a35ed65f92f9e8ee824da1249f99dade421d53e6d45bbd8bdafdaa51537c79e"} Jan 20 18:51:00 crc kubenswrapper[4773]: I0120 18:51:00.066569 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-p6rjg" 
podStartSLOduration=3.066547582 podStartE2EDuration="3.066547582s" podCreationTimestamp="2026-01-20 18:50:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:51:00.057608768 +0000 UTC m=+1252.979421792" watchObservedRunningTime="2026-01-20 18:51:00.066547582 +0000 UTC m=+1252.988360616" Jan 20 18:51:00 crc kubenswrapper[4773]: I0120 18:51:00.162370 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-qbmt7"] Jan 20 18:51:01 crc kubenswrapper[4773]: I0120 18:51:01.057908 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-qbmt7" event={"ID":"9f9293b5-8288-4a19-b3ac-03d8026dbf06","Type":"ContainerStarted","Data":"c2e8a7f30db310da621535c0d420a0bac1190bc2a41a699499456b5aa096a1da"} Jan 20 18:51:01 crc kubenswrapper[4773]: I0120 18:51:01.935620 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 20 18:51:01 crc kubenswrapper[4773]: I0120 18:51:01.952879 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 20 18:51:02 crc kubenswrapper[4773]: I0120 18:51:02.068836 4773 generic.go:334] "Generic (PLEG): container finished" podID="b01dd3cf-55b3-4e05-a75d-be2ae325b5fc" containerID="deee03e78951a636e93f428533e6ccb2c3506024586cb7ed18161abe124ff263" exitCode=0 Jan 20 18:51:02 crc kubenswrapper[4773]: I0120 18:51:02.068895 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-566b5b7845-62h5v" event={"ID":"b01dd3cf-55b3-4e05-a75d-be2ae325b5fc","Type":"ContainerDied","Data":"deee03e78951a636e93f428533e6ccb2c3506024586cb7ed18161abe124ff263"} Jan 20 18:51:02 crc kubenswrapper[4773]: I0120 18:51:02.074182 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-qbmt7" 
event={"ID":"9f9293b5-8288-4a19-b3ac-03d8026dbf06","Type":"ContainerStarted","Data":"eb1a71418be4c383e084090dcdbe66bfe9ab929c68b106dc0659e9d31cfdbff6"} Jan 20 18:51:02 crc kubenswrapper[4773]: I0120 18:51:02.175051 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-qbmt7" podStartSLOduration=3.175033029 podStartE2EDuration="3.175033029s" podCreationTimestamp="2026-01-20 18:50:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:51:02.128780873 +0000 UTC m=+1255.050593897" watchObservedRunningTime="2026-01-20 18:51:02.175033029 +0000 UTC m=+1255.096846053" Jan 20 18:51:03 crc kubenswrapper[4773]: I0120 18:51:03.085959 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-566b5b7845-62h5v" event={"ID":"b01dd3cf-55b3-4e05-a75d-be2ae325b5fc","Type":"ContainerStarted","Data":"19b23825d4fe08e169e8236984b550734270fc9494468543004b6e47ef80eed1"} Jan 20 18:51:03 crc kubenswrapper[4773]: I0120 18:51:03.086050 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-566b5b7845-62h5v" Jan 20 18:51:03 crc kubenswrapper[4773]: I0120 18:51:03.105042 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-566b5b7845-62h5v" podStartSLOduration=5.10502068 podStartE2EDuration="5.10502068s" podCreationTimestamp="2026-01-20 18:50:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:51:03.104361615 +0000 UTC m=+1256.026174639" watchObservedRunningTime="2026-01-20 18:51:03.10502068 +0000 UTC m=+1256.026833704" Jan 20 18:51:07 crc kubenswrapper[4773]: I0120 18:51:07.122287 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"770606ab-65d2-4537-a335-6953af47241a","Type":"ContainerStarted","Data":"d3eb5774625308b3c5c8a9b8bf8152e39b3fdb0e54c1d18646bf90ab48ebc903"} Jan 20 18:51:07 crc kubenswrapper[4773]: I0120 18:51:07.124181 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b8ddb59d-f815-43b0-8d46-31575ad7703f","Type":"ContainerStarted","Data":"df867dbb67a40515bf2e917a26b8f035a0ceeae39ce2b877b7dc330bd0b26d0e"} Jan 20 18:51:07 crc kubenswrapper[4773]: I0120 18:51:07.124223 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b8ddb59d-f815-43b0-8d46-31575ad7703f","Type":"ContainerStarted","Data":"c8549044e1dee3b38135630de293aa60fb44321f9c7d2b5586fd3a805e16bfcc"} Jan 20 18:51:07 crc kubenswrapper[4773]: I0120 18:51:07.126469 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6709a6f8-82d5-4b67-b4db-a35f9e88a664","Type":"ContainerStarted","Data":"671cecfe6a557b6b0e42ce8a57830a4448592fed4709c2353e23526c05bb6a9c"} Jan 20 18:51:07 crc kubenswrapper[4773]: I0120 18:51:07.126510 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6709a6f8-82d5-4b67-b4db-a35f9e88a664","Type":"ContainerStarted","Data":"277ac84b5bc78b28a3aeda8f013775cbfac95bb8155862fb0fb7c6849fd81254"} Jan 20 18:51:07 crc kubenswrapper[4773]: I0120 18:51:07.126604 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="6709a6f8-82d5-4b67-b4db-a35f9e88a664" containerName="nova-metadata-log" containerID="cri-o://277ac84b5bc78b28a3aeda8f013775cbfac95bb8155862fb0fb7c6849fd81254" gracePeriod=30 Jan 20 18:51:07 crc kubenswrapper[4773]: I0120 18:51:07.126630 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="6709a6f8-82d5-4b67-b4db-a35f9e88a664" containerName="nova-metadata-metadata" 
containerID="cri-o://671cecfe6a557b6b0e42ce8a57830a4448592fed4709c2353e23526c05bb6a9c" gracePeriod=30 Jan 20 18:51:07 crc kubenswrapper[4773]: I0120 18:51:07.131434 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"cc9522a8-e87a-485b-85e6-9548b4f7c835","Type":"ContainerStarted","Data":"49b4d8e189097e49504e7ac626fc2c7b41cd127ac2c715bc575a36d1764b8b4e"} Jan 20 18:51:07 crc kubenswrapper[4773]: I0120 18:51:07.131558 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="cc9522a8-e87a-485b-85e6-9548b4f7c835" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://49b4d8e189097e49504e7ac626fc2c7b41cd127ac2c715bc575a36d1764b8b4e" gracePeriod=30 Jan 20 18:51:07 crc kubenswrapper[4773]: I0120 18:51:07.156752 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.060681417 podStartE2EDuration="9.156735342s" podCreationTimestamp="2026-01-20 18:50:58 +0000 UTC" firstStartedPulling="2026-01-20 18:50:59.222829514 +0000 UTC m=+1252.144642538" lastFinishedPulling="2026-01-20 18:51:05.318883439 +0000 UTC m=+1258.240696463" observedRunningTime="2026-01-20 18:51:07.153384822 +0000 UTC m=+1260.075197856" watchObservedRunningTime="2026-01-20 18:51:07.156735342 +0000 UTC m=+1260.078548366" Jan 20 18:51:07 crc kubenswrapper[4773]: I0120 18:51:07.180256 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.834797263 podStartE2EDuration="9.180236894s" podCreationTimestamp="2026-01-20 18:50:58 +0000 UTC" firstStartedPulling="2026-01-20 18:50:58.975334774 +0000 UTC m=+1251.897147798" lastFinishedPulling="2026-01-20 18:51:05.320774405 +0000 UTC m=+1258.242587429" observedRunningTime="2026-01-20 18:51:07.172471478 +0000 UTC m=+1260.094284522" watchObservedRunningTime="2026-01-20 18:51:07.180236894 +0000 
UTC m=+1260.102049918" Jan 20 18:51:07 crc kubenswrapper[4773]: I0120 18:51:07.200360 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.134334489 podStartE2EDuration="9.200340875s" podCreationTimestamp="2026-01-20 18:50:58 +0000 UTC" firstStartedPulling="2026-01-20 18:50:59.251841889 +0000 UTC m=+1252.173654913" lastFinishedPulling="2026-01-20 18:51:05.317848275 +0000 UTC m=+1258.239661299" observedRunningTime="2026-01-20 18:51:07.196700398 +0000 UTC m=+1260.118513422" watchObservedRunningTime="2026-01-20 18:51:07.200340875 +0000 UTC m=+1260.122153899" Jan 20 18:51:07 crc kubenswrapper[4773]: I0120 18:51:07.219522 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.9697726429999998 podStartE2EDuration="9.219498884s" podCreationTimestamp="2026-01-20 18:50:58 +0000 UTC" firstStartedPulling="2026-01-20 18:50:59.067956349 +0000 UTC m=+1251.989769363" lastFinishedPulling="2026-01-20 18:51:05.31768258 +0000 UTC m=+1258.239495604" observedRunningTime="2026-01-20 18:51:07.217557466 +0000 UTC m=+1260.139370490" watchObservedRunningTime="2026-01-20 18:51:07.219498884 +0000 UTC m=+1260.141311908" Jan 20 18:51:08 crc kubenswrapper[4773]: I0120 18:51:08.152305 4773 generic.go:334] "Generic (PLEG): container finished" podID="61de3b4b-bcb7-4521-92e6-af87d03407ee" containerID="8a98a5857f7bd2ab4f59a3057108741eeb668e092184b2f6b6f9df7ea5067a15" exitCode=0 Jan 20 18:51:08 crc kubenswrapper[4773]: I0120 18:51:08.152421 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-p6rjg" event={"ID":"61de3b4b-bcb7-4521-92e6-af87d03407ee","Type":"ContainerDied","Data":"8a98a5857f7bd2ab4f59a3057108741eeb668e092184b2f6b6f9df7ea5067a15"} Jan 20 18:51:08 crc kubenswrapper[4773]: I0120 18:51:08.154650 4773 generic.go:334] "Generic (PLEG): container finished" podID="6709a6f8-82d5-4b67-b4db-a35f9e88a664" 
containerID="671cecfe6a557b6b0e42ce8a57830a4448592fed4709c2353e23526c05bb6a9c" exitCode=0 Jan 20 18:51:08 crc kubenswrapper[4773]: I0120 18:51:08.154665 4773 generic.go:334] "Generic (PLEG): container finished" podID="6709a6f8-82d5-4b67-b4db-a35f9e88a664" containerID="277ac84b5bc78b28a3aeda8f013775cbfac95bb8155862fb0fb7c6849fd81254" exitCode=143 Jan 20 18:51:08 crc kubenswrapper[4773]: I0120 18:51:08.155465 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6709a6f8-82d5-4b67-b4db-a35f9e88a664","Type":"ContainerDied","Data":"671cecfe6a557b6b0e42ce8a57830a4448592fed4709c2353e23526c05bb6a9c"} Jan 20 18:51:08 crc kubenswrapper[4773]: I0120 18:51:08.155491 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6709a6f8-82d5-4b67-b4db-a35f9e88a664","Type":"ContainerDied","Data":"277ac84b5bc78b28a3aeda8f013775cbfac95bb8155862fb0fb7c6849fd81254"} Jan 20 18:51:08 crc kubenswrapper[4773]: I0120 18:51:08.359625 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Jan 20 18:51:08 crc kubenswrapper[4773]: I0120 18:51:08.484581 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 20 18:51:08 crc kubenswrapper[4773]: I0120 18:51:08.527101 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 20 18:51:08 crc kubenswrapper[4773]: I0120 18:51:08.527480 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 20 18:51:08 crc kubenswrapper[4773]: I0120 18:51:08.659827 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6709a6f8-82d5-4b67-b4db-a35f9e88a664-config-data\") pod \"6709a6f8-82d5-4b67-b4db-a35f9e88a664\" (UID: \"6709a6f8-82d5-4b67-b4db-a35f9e88a664\") " Jan 20 18:51:08 crc kubenswrapper[4773]: I0120 18:51:08.659944 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7srwz\" (UniqueName: \"kubernetes.io/projected/6709a6f8-82d5-4b67-b4db-a35f9e88a664-kube-api-access-7srwz\") pod \"6709a6f8-82d5-4b67-b4db-a35f9e88a664\" (UID: \"6709a6f8-82d5-4b67-b4db-a35f9e88a664\") " Jan 20 18:51:08 crc kubenswrapper[4773]: I0120 18:51:08.659971 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6709a6f8-82d5-4b67-b4db-a35f9e88a664-logs\") pod \"6709a6f8-82d5-4b67-b4db-a35f9e88a664\" (UID: \"6709a6f8-82d5-4b67-b4db-a35f9e88a664\") " Jan 20 18:51:08 crc kubenswrapper[4773]: I0120 18:51:08.660008 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6709a6f8-82d5-4b67-b4db-a35f9e88a664-combined-ca-bundle\") pod \"6709a6f8-82d5-4b67-b4db-a35f9e88a664\" (UID: \"6709a6f8-82d5-4b67-b4db-a35f9e88a664\") " Jan 20 18:51:08 crc kubenswrapper[4773]: I0120 18:51:08.661373 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/6709a6f8-82d5-4b67-b4db-a35f9e88a664-logs" (OuterVolumeSpecName: "logs") pod "6709a6f8-82d5-4b67-b4db-a35f9e88a664" (UID: "6709a6f8-82d5-4b67-b4db-a35f9e88a664"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:51:08 crc kubenswrapper[4773]: I0120 18:51:08.665837 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6709a6f8-82d5-4b67-b4db-a35f9e88a664-kube-api-access-7srwz" (OuterVolumeSpecName: "kube-api-access-7srwz") pod "6709a6f8-82d5-4b67-b4db-a35f9e88a664" (UID: "6709a6f8-82d5-4b67-b4db-a35f9e88a664"). InnerVolumeSpecName "kube-api-access-7srwz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:51:08 crc kubenswrapper[4773]: I0120 18:51:08.690967 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Jan 20 18:51:08 crc kubenswrapper[4773]: I0120 18:51:08.691004 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Jan 20 18:51:08 crc kubenswrapper[4773]: I0120 18:51:08.691084 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6709a6f8-82d5-4b67-b4db-a35f9e88a664-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6709a6f8-82d5-4b67-b4db-a35f9e88a664" (UID: "6709a6f8-82d5-4b67-b4db-a35f9e88a664"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:51:08 crc kubenswrapper[4773]: I0120 18:51:08.691104 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6709a6f8-82d5-4b67-b4db-a35f9e88a664-config-data" (OuterVolumeSpecName: "config-data") pod "6709a6f8-82d5-4b67-b4db-a35f9e88a664" (UID: "6709a6f8-82d5-4b67-b4db-a35f9e88a664"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:51:08 crc kubenswrapper[4773]: I0120 18:51:08.726621 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Jan 20 18:51:08 crc kubenswrapper[4773]: I0120 18:51:08.761616 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6709a6f8-82d5-4b67-b4db-a35f9e88a664-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 18:51:08 crc kubenswrapper[4773]: I0120 18:51:08.761655 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7srwz\" (UniqueName: \"kubernetes.io/projected/6709a6f8-82d5-4b67-b4db-a35f9e88a664-kube-api-access-7srwz\") on node \"crc\" DevicePath \"\"" Jan 20 18:51:08 crc kubenswrapper[4773]: I0120 18:51:08.761669 4773 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6709a6f8-82d5-4b67-b4db-a35f9e88a664-logs\") on node \"crc\" DevicePath \"\"" Jan 20 18:51:08 crc kubenswrapper[4773]: I0120 18:51:08.761679 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6709a6f8-82d5-4b67-b4db-a35f9e88a664-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 18:51:08 crc kubenswrapper[4773]: I0120 18:51:08.766108 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-566b5b7845-62h5v" Jan 20 18:51:08 crc kubenswrapper[4773]: I0120 18:51:08.828423 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d97fcdd8f-fjqmj"] Jan 20 18:51:08 crc kubenswrapper[4773]: I0120 18:51:08.829221 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6d97fcdd8f-fjqmj" podUID="93029dbe-6bb4-45aa-a72a-13e4ffc2537e" containerName="dnsmasq-dns" containerID="cri-o://9e423ba2df684bfe239dce19b30ffd19312d8fa87386a3c9a43e0fb199ccc323" gracePeriod=10 Jan 20 
18:51:09 crc kubenswrapper[4773]: I0120 18:51:09.167858 4773 generic.go:334] "Generic (PLEG): container finished" podID="93029dbe-6bb4-45aa-a72a-13e4ffc2537e" containerID="9e423ba2df684bfe239dce19b30ffd19312d8fa87386a3c9a43e0fb199ccc323" exitCode=0 Jan 20 18:51:09 crc kubenswrapper[4773]: I0120 18:51:09.167916 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d97fcdd8f-fjqmj" event={"ID":"93029dbe-6bb4-45aa-a72a-13e4ffc2537e","Type":"ContainerDied","Data":"9e423ba2df684bfe239dce19b30ffd19312d8fa87386a3c9a43e0fb199ccc323"} Jan 20 18:51:09 crc kubenswrapper[4773]: I0120 18:51:09.170145 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 20 18:51:09 crc kubenswrapper[4773]: I0120 18:51:09.172826 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"6709a6f8-82d5-4b67-b4db-a35f9e88a664","Type":"ContainerDied","Data":"135e3a1a083b2b65bf6bedbd7f61f04220039e408cb36b8c474ab0c91ec914a7"} Jan 20 18:51:09 crc kubenswrapper[4773]: I0120 18:51:09.172872 4773 scope.go:117] "RemoveContainer" containerID="671cecfe6a557b6b0e42ce8a57830a4448592fed4709c2353e23526c05bb6a9c" Jan 20 18:51:09 crc kubenswrapper[4773]: I0120 18:51:09.226653 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 20 18:51:09 crc kubenswrapper[4773]: I0120 18:51:09.227304 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Jan 20 18:51:09 crc kubenswrapper[4773]: I0120 18:51:09.231529 4773 scope.go:117] "RemoveContainer" containerID="277ac84b5bc78b28a3aeda8f013775cbfac95bb8155862fb0fb7c6849fd81254" Jan 20 18:51:09 crc kubenswrapper[4773]: I0120 18:51:09.241806 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Jan 20 18:51:09 crc kubenswrapper[4773]: I0120 18:51:09.283027 4773 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/nova-metadata-0"] Jan 20 18:51:09 crc kubenswrapper[4773]: E0120 18:51:09.283427 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6709a6f8-82d5-4b67-b4db-a35f9e88a664" containerName="nova-metadata-log" Jan 20 18:51:09 crc kubenswrapper[4773]: I0120 18:51:09.283443 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="6709a6f8-82d5-4b67-b4db-a35f9e88a664" containerName="nova-metadata-log" Jan 20 18:51:09 crc kubenswrapper[4773]: E0120 18:51:09.283477 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6709a6f8-82d5-4b67-b4db-a35f9e88a664" containerName="nova-metadata-metadata" Jan 20 18:51:09 crc kubenswrapper[4773]: I0120 18:51:09.283486 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="6709a6f8-82d5-4b67-b4db-a35f9e88a664" containerName="nova-metadata-metadata" Jan 20 18:51:09 crc kubenswrapper[4773]: I0120 18:51:09.283676 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="6709a6f8-82d5-4b67-b4db-a35f9e88a664" containerName="nova-metadata-log" Jan 20 18:51:09 crc kubenswrapper[4773]: I0120 18:51:09.283719 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="6709a6f8-82d5-4b67-b4db-a35f9e88a664" containerName="nova-metadata-metadata" Jan 20 18:51:09 crc kubenswrapper[4773]: I0120 18:51:09.284811 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 20 18:51:09 crc kubenswrapper[4773]: I0120 18:51:09.288149 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Jan 20 18:51:09 crc kubenswrapper[4773]: I0120 18:51:09.292543 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 20 18:51:09 crc kubenswrapper[4773]: I0120 18:51:09.302513 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Jan 20 18:51:09 crc kubenswrapper[4773]: I0120 18:51:09.389660 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d97fcdd8f-fjqmj" Jan 20 18:51:09 crc kubenswrapper[4773]: I0120 18:51:09.459116 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6709a6f8-82d5-4b67-b4db-a35f9e88a664" path="/var/lib/kubelet/pods/6709a6f8-82d5-4b67-b4db-a35f9e88a664/volumes" Jan 20 18:51:09 crc kubenswrapper[4773]: I0120 18:51:09.486143 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e93e7391-bd5a-45f1-b0cd-15f8f39ba094-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e93e7391-bd5a-45f1-b0cd-15f8f39ba094\") " pod="openstack/nova-metadata-0" Jan 20 18:51:09 crc kubenswrapper[4773]: I0120 18:51:09.486202 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e93e7391-bd5a-45f1-b0cd-15f8f39ba094-logs\") pod \"nova-metadata-0\" (UID: \"e93e7391-bd5a-45f1-b0cd-15f8f39ba094\") " pod="openstack/nova-metadata-0" Jan 20 18:51:09 crc kubenswrapper[4773]: I0120 18:51:09.486279 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwkk9\" (UniqueName: 
\"kubernetes.io/projected/e93e7391-bd5a-45f1-b0cd-15f8f39ba094-kube-api-access-rwkk9\") pod \"nova-metadata-0\" (UID: \"e93e7391-bd5a-45f1-b0cd-15f8f39ba094\") " pod="openstack/nova-metadata-0" Jan 20 18:51:09 crc kubenswrapper[4773]: I0120 18:51:09.486309 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e93e7391-bd5a-45f1-b0cd-15f8f39ba094-config-data\") pod \"nova-metadata-0\" (UID: \"e93e7391-bd5a-45f1-b0cd-15f8f39ba094\") " pod="openstack/nova-metadata-0" Jan 20 18:51:09 crc kubenswrapper[4773]: I0120 18:51:09.486340 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e93e7391-bd5a-45f1-b0cd-15f8f39ba094-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e93e7391-bd5a-45f1-b0cd-15f8f39ba094\") " pod="openstack/nova-metadata-0" Jan 20 18:51:09 crc kubenswrapper[4773]: I0120 18:51:09.587443 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-99tvf\" (UniqueName: \"kubernetes.io/projected/93029dbe-6bb4-45aa-a72a-13e4ffc2537e-kube-api-access-99tvf\") pod \"93029dbe-6bb4-45aa-a72a-13e4ffc2537e\" (UID: \"93029dbe-6bb4-45aa-a72a-13e4ffc2537e\") " Jan 20 18:51:09 crc kubenswrapper[4773]: I0120 18:51:09.587774 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/93029dbe-6bb4-45aa-a72a-13e4ffc2537e-ovsdbserver-nb\") pod \"93029dbe-6bb4-45aa-a72a-13e4ffc2537e\" (UID: \"93029dbe-6bb4-45aa-a72a-13e4ffc2537e\") " Jan 20 18:51:09 crc kubenswrapper[4773]: I0120 18:51:09.587845 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/93029dbe-6bb4-45aa-a72a-13e4ffc2537e-ovsdbserver-sb\") pod \"93029dbe-6bb4-45aa-a72a-13e4ffc2537e\" (UID: 
\"93029dbe-6bb4-45aa-a72a-13e4ffc2537e\") " Jan 20 18:51:09 crc kubenswrapper[4773]: I0120 18:51:09.587916 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93029dbe-6bb4-45aa-a72a-13e4ffc2537e-config\") pod \"93029dbe-6bb4-45aa-a72a-13e4ffc2537e\" (UID: \"93029dbe-6bb4-45aa-a72a-13e4ffc2537e\") " Jan 20 18:51:09 crc kubenswrapper[4773]: I0120 18:51:09.588011 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/93029dbe-6bb4-45aa-a72a-13e4ffc2537e-dns-svc\") pod \"93029dbe-6bb4-45aa-a72a-13e4ffc2537e\" (UID: \"93029dbe-6bb4-45aa-a72a-13e4ffc2537e\") " Jan 20 18:51:09 crc kubenswrapper[4773]: I0120 18:51:09.588321 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwkk9\" (UniqueName: \"kubernetes.io/projected/e93e7391-bd5a-45f1-b0cd-15f8f39ba094-kube-api-access-rwkk9\") pod \"nova-metadata-0\" (UID: \"e93e7391-bd5a-45f1-b0cd-15f8f39ba094\") " pod="openstack/nova-metadata-0" Jan 20 18:51:09 crc kubenswrapper[4773]: I0120 18:51:09.588438 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e93e7391-bd5a-45f1-b0cd-15f8f39ba094-config-data\") pod \"nova-metadata-0\" (UID: \"e93e7391-bd5a-45f1-b0cd-15f8f39ba094\") " pod="openstack/nova-metadata-0" Jan 20 18:51:09 crc kubenswrapper[4773]: I0120 18:51:09.588535 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e93e7391-bd5a-45f1-b0cd-15f8f39ba094-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e93e7391-bd5a-45f1-b0cd-15f8f39ba094\") " pod="openstack/nova-metadata-0" Jan 20 18:51:09 crc kubenswrapper[4773]: I0120 18:51:09.588695 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/e93e7391-bd5a-45f1-b0cd-15f8f39ba094-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e93e7391-bd5a-45f1-b0cd-15f8f39ba094\") " pod="openstack/nova-metadata-0" Jan 20 18:51:09 crc kubenswrapper[4773]: I0120 18:51:09.588790 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e93e7391-bd5a-45f1-b0cd-15f8f39ba094-logs\") pod \"nova-metadata-0\" (UID: \"e93e7391-bd5a-45f1-b0cd-15f8f39ba094\") " pod="openstack/nova-metadata-0" Jan 20 18:51:09 crc kubenswrapper[4773]: I0120 18:51:09.589266 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e93e7391-bd5a-45f1-b0cd-15f8f39ba094-logs\") pod \"nova-metadata-0\" (UID: \"e93e7391-bd5a-45f1-b0cd-15f8f39ba094\") " pod="openstack/nova-metadata-0" Jan 20 18:51:09 crc kubenswrapper[4773]: I0120 18:51:09.594268 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e93e7391-bd5a-45f1-b0cd-15f8f39ba094-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"e93e7391-bd5a-45f1-b0cd-15f8f39ba094\") " pod="openstack/nova-metadata-0" Jan 20 18:51:09 crc kubenswrapper[4773]: I0120 18:51:09.595062 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93029dbe-6bb4-45aa-a72a-13e4ffc2537e-kube-api-access-99tvf" (OuterVolumeSpecName: "kube-api-access-99tvf") pod "93029dbe-6bb4-45aa-a72a-13e4ffc2537e" (UID: "93029dbe-6bb4-45aa-a72a-13e4ffc2537e"). InnerVolumeSpecName "kube-api-access-99tvf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:51:09 crc kubenswrapper[4773]: I0120 18:51:09.603697 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e93e7391-bd5a-45f1-b0cd-15f8f39ba094-config-data\") pod \"nova-metadata-0\" (UID: \"e93e7391-bd5a-45f1-b0cd-15f8f39ba094\") " pod="openstack/nova-metadata-0" Jan 20 18:51:09 crc kubenswrapper[4773]: I0120 18:51:09.610131 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="b8ddb59d-f815-43b0-8d46-31575ad7703f" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.176:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 20 18:51:09 crc kubenswrapper[4773]: I0120 18:51:09.610367 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="b8ddb59d-f815-43b0-8d46-31575ad7703f" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.176:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 20 18:51:09 crc kubenswrapper[4773]: I0120 18:51:09.610718 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e93e7391-bd5a-45f1-b0cd-15f8f39ba094-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"e93e7391-bd5a-45f1-b0cd-15f8f39ba094\") " pod="openstack/nova-metadata-0" Jan 20 18:51:09 crc kubenswrapper[4773]: I0120 18:51:09.610746 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwkk9\" (UniqueName: \"kubernetes.io/projected/e93e7391-bd5a-45f1-b0cd-15f8f39ba094-kube-api-access-rwkk9\") pod \"nova-metadata-0\" (UID: \"e93e7391-bd5a-45f1-b0cd-15f8f39ba094\") " pod="openstack/nova-metadata-0" Jan 20 18:51:09 crc kubenswrapper[4773]: I0120 18:51:09.642918 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/93029dbe-6bb4-45aa-a72a-13e4ffc2537e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "93029dbe-6bb4-45aa-a72a-13e4ffc2537e" (UID: "93029dbe-6bb4-45aa-a72a-13e4ffc2537e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:51:09 crc kubenswrapper[4773]: I0120 18:51:09.643258 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93029dbe-6bb4-45aa-a72a-13e4ffc2537e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "93029dbe-6bb4-45aa-a72a-13e4ffc2537e" (UID: "93029dbe-6bb4-45aa-a72a-13e4ffc2537e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:51:09 crc kubenswrapper[4773]: I0120 18:51:09.648774 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93029dbe-6bb4-45aa-a72a-13e4ffc2537e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "93029dbe-6bb4-45aa-a72a-13e4ffc2537e" (UID: "93029dbe-6bb4-45aa-a72a-13e4ffc2537e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:51:09 crc kubenswrapper[4773]: I0120 18:51:09.650020 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/93029dbe-6bb4-45aa-a72a-13e4ffc2537e-config" (OuterVolumeSpecName: "config") pod "93029dbe-6bb4-45aa-a72a-13e4ffc2537e" (UID: "93029dbe-6bb4-45aa-a72a-13e4ffc2537e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:51:09 crc kubenswrapper[4773]: I0120 18:51:09.686194 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 20 18:51:09 crc kubenswrapper[4773]: I0120 18:51:09.694298 4773 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/93029dbe-6bb4-45aa-a72a-13e4ffc2537e-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 20 18:51:09 crc kubenswrapper[4773]: I0120 18:51:09.694486 4773 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/93029dbe-6bb4-45aa-a72a-13e4ffc2537e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 20 18:51:09 crc kubenswrapper[4773]: I0120 18:51:09.694586 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93029dbe-6bb4-45aa-a72a-13e4ffc2537e-config\") on node \"crc\" DevicePath \"\"" Jan 20 18:51:09 crc kubenswrapper[4773]: I0120 18:51:09.694678 4773 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/93029dbe-6bb4-45aa-a72a-13e4ffc2537e-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 20 18:51:09 crc kubenswrapper[4773]: I0120 18:51:09.694758 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-99tvf\" (UniqueName: \"kubernetes.io/projected/93029dbe-6bb4-45aa-a72a-13e4ffc2537e-kube-api-access-99tvf\") on node \"crc\" DevicePath \"\"" Jan 20 18:51:10 crc kubenswrapper[4773]: I0120 18:51:10.179060 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 20 18:51:10 crc kubenswrapper[4773]: W0120 18:51:10.193009 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode93e7391_bd5a_45f1_b0cd_15f8f39ba094.slice/crio-3530bfb5061bdf8a7a68265daea004d4e32c35437c51e4fd95a729d55e76c3e7 WatchSource:0}: Error finding container 3530bfb5061bdf8a7a68265daea004d4e32c35437c51e4fd95a729d55e76c3e7: Status 404 returned error can't find the container 
with id 3530bfb5061bdf8a7a68265daea004d4e32c35437c51e4fd95a729d55e76c3e7 Jan 20 18:51:10 crc kubenswrapper[4773]: I0120 18:51:10.220120 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6d97fcdd8f-fjqmj" event={"ID":"93029dbe-6bb4-45aa-a72a-13e4ffc2537e","Type":"ContainerDied","Data":"61e0e72231f57915e14eb84dde43eba2a10986a7eb1f548177c2131ae5e71eff"} Jan 20 18:51:10 crc kubenswrapper[4773]: I0120 18:51:10.220172 4773 scope.go:117] "RemoveContainer" containerID="9e423ba2df684bfe239dce19b30ffd19312d8fa87386a3c9a43e0fb199ccc323" Jan 20 18:51:10 crc kubenswrapper[4773]: I0120 18:51:10.220262 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6d97fcdd8f-fjqmj" Jan 20 18:51:10 crc kubenswrapper[4773]: I0120 18:51:10.307689 4773 scope.go:117] "RemoveContainer" containerID="5b0075b2e498f8ecabb538d3ba4a3dfe5c7da84ee028f9b1a8729de23849abd7" Jan 20 18:51:10 crc kubenswrapper[4773]: I0120 18:51:10.311749 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6d97fcdd8f-fjqmj"] Jan 20 18:51:10 crc kubenswrapper[4773]: I0120 18:51:10.330228 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6d97fcdd8f-fjqmj"] Jan 20 18:51:10 crc kubenswrapper[4773]: I0120 18:51:10.384517 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-p6rjg" Jan 20 18:51:10 crc kubenswrapper[4773]: I0120 18:51:10.515638 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61de3b4b-bcb7-4521-92e6-af87d03407ee-combined-ca-bundle\") pod \"61de3b4b-bcb7-4521-92e6-af87d03407ee\" (UID: \"61de3b4b-bcb7-4521-92e6-af87d03407ee\") " Jan 20 18:51:10 crc kubenswrapper[4773]: I0120 18:51:10.515696 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61de3b4b-bcb7-4521-92e6-af87d03407ee-scripts\") pod \"61de3b4b-bcb7-4521-92e6-af87d03407ee\" (UID: \"61de3b4b-bcb7-4521-92e6-af87d03407ee\") " Jan 20 18:51:10 crc kubenswrapper[4773]: I0120 18:51:10.515742 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zxx8p\" (UniqueName: \"kubernetes.io/projected/61de3b4b-bcb7-4521-92e6-af87d03407ee-kube-api-access-zxx8p\") pod \"61de3b4b-bcb7-4521-92e6-af87d03407ee\" (UID: \"61de3b4b-bcb7-4521-92e6-af87d03407ee\") " Jan 20 18:51:10 crc kubenswrapper[4773]: I0120 18:51:10.515789 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61de3b4b-bcb7-4521-92e6-af87d03407ee-config-data\") pod \"61de3b4b-bcb7-4521-92e6-af87d03407ee\" (UID: \"61de3b4b-bcb7-4521-92e6-af87d03407ee\") " Jan 20 18:51:10 crc kubenswrapper[4773]: I0120 18:51:10.520651 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61de3b4b-bcb7-4521-92e6-af87d03407ee-kube-api-access-zxx8p" (OuterVolumeSpecName: "kube-api-access-zxx8p") pod "61de3b4b-bcb7-4521-92e6-af87d03407ee" (UID: "61de3b4b-bcb7-4521-92e6-af87d03407ee"). InnerVolumeSpecName "kube-api-access-zxx8p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:51:10 crc kubenswrapper[4773]: I0120 18:51:10.521170 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61de3b4b-bcb7-4521-92e6-af87d03407ee-scripts" (OuterVolumeSpecName: "scripts") pod "61de3b4b-bcb7-4521-92e6-af87d03407ee" (UID: "61de3b4b-bcb7-4521-92e6-af87d03407ee"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:51:10 crc kubenswrapper[4773]: I0120 18:51:10.541981 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61de3b4b-bcb7-4521-92e6-af87d03407ee-config-data" (OuterVolumeSpecName: "config-data") pod "61de3b4b-bcb7-4521-92e6-af87d03407ee" (UID: "61de3b4b-bcb7-4521-92e6-af87d03407ee"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:51:10 crc kubenswrapper[4773]: I0120 18:51:10.548566 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61de3b4b-bcb7-4521-92e6-af87d03407ee-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "61de3b4b-bcb7-4521-92e6-af87d03407ee" (UID: "61de3b4b-bcb7-4521-92e6-af87d03407ee"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:51:10 crc kubenswrapper[4773]: I0120 18:51:10.617783 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61de3b4b-bcb7-4521-92e6-af87d03407ee-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 18:51:10 crc kubenswrapper[4773]: I0120 18:51:10.617819 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/61de3b4b-bcb7-4521-92e6-af87d03407ee-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 18:51:10 crc kubenswrapper[4773]: I0120 18:51:10.617837 4773 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61de3b4b-bcb7-4521-92e6-af87d03407ee-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 18:51:10 crc kubenswrapper[4773]: I0120 18:51:10.617848 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zxx8p\" (UniqueName: \"kubernetes.io/projected/61de3b4b-bcb7-4521-92e6-af87d03407ee-kube-api-access-zxx8p\") on node \"crc\" DevicePath \"\"" Jan 20 18:51:11 crc kubenswrapper[4773]: I0120 18:51:11.268912 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e93e7391-bd5a-45f1-b0cd-15f8f39ba094","Type":"ContainerStarted","Data":"7010035b377b66d4de67bd919469a71355b7ae52e07affd05148def6e9038f4b"} Jan 20 18:51:11 crc kubenswrapper[4773]: I0120 18:51:11.269278 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e93e7391-bd5a-45f1-b0cd-15f8f39ba094","Type":"ContainerStarted","Data":"c84468244a5ecd4949ff3fcccf264d620bae45d62f3bb7dafe3f87f5654822b0"} Jan 20 18:51:11 crc kubenswrapper[4773]: I0120 18:51:11.269296 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"e93e7391-bd5a-45f1-b0cd-15f8f39ba094","Type":"ContainerStarted","Data":"3530bfb5061bdf8a7a68265daea004d4e32c35437c51e4fd95a729d55e76c3e7"} Jan 20 18:51:11 crc kubenswrapper[4773]: I0120 18:51:11.270620 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-p6rjg" Jan 20 18:51:11 crc kubenswrapper[4773]: I0120 18:51:11.270611 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-p6rjg" event={"ID":"61de3b4b-bcb7-4521-92e6-af87d03407ee","Type":"ContainerDied","Data":"11bde0588e2d0753560751b7629f6dce99bb8a4665740e87d409f7d0f08a5c63"} Jan 20 18:51:11 crc kubenswrapper[4773]: I0120 18:51:11.270680 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="11bde0588e2d0753560751b7629f6dce99bb8a4665740e87d409f7d0f08a5c63" Jan 20 18:51:11 crc kubenswrapper[4773]: I0120 18:51:11.288486 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.288466848 podStartE2EDuration="2.288466848s" podCreationTimestamp="2026-01-20 18:51:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:51:11.287043514 +0000 UTC m=+1264.208856548" watchObservedRunningTime="2026-01-20 18:51:11.288466848 +0000 UTC m=+1264.210279882" Jan 20 18:51:11 crc kubenswrapper[4773]: I0120 18:51:11.474600 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93029dbe-6bb4-45aa-a72a-13e4ffc2537e" path="/var/lib/kubelet/pods/93029dbe-6bb4-45aa-a72a-13e4ffc2537e/volumes" Jan 20 18:51:11 crc kubenswrapper[4773]: I0120 18:51:11.572367 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 20 18:51:11 crc kubenswrapper[4773]: I0120 18:51:11.572590 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" 
podUID="b8ddb59d-f815-43b0-8d46-31575ad7703f" containerName="nova-api-log" containerID="cri-o://c8549044e1dee3b38135630de293aa60fb44321f9c7d2b5586fd3a805e16bfcc" gracePeriod=30 Jan 20 18:51:11 crc kubenswrapper[4773]: I0120 18:51:11.573066 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="b8ddb59d-f815-43b0-8d46-31575ad7703f" containerName="nova-api-api" containerID="cri-o://df867dbb67a40515bf2e917a26b8f035a0ceeae39ce2b877b7dc330bd0b26d0e" gracePeriod=30 Jan 20 18:51:11 crc kubenswrapper[4773]: I0120 18:51:11.585470 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 20 18:51:11 crc kubenswrapper[4773]: I0120 18:51:11.597764 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 20 18:51:12 crc kubenswrapper[4773]: I0120 18:51:12.280441 4773 generic.go:334] "Generic (PLEG): container finished" podID="9f9293b5-8288-4a19-b3ac-03d8026dbf06" containerID="eb1a71418be4c383e084090dcdbe66bfe9ab929c68b106dc0659e9d31cfdbff6" exitCode=0 Jan 20 18:51:12 crc kubenswrapper[4773]: I0120 18:51:12.280538 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-qbmt7" event={"ID":"9f9293b5-8288-4a19-b3ac-03d8026dbf06","Type":"ContainerDied","Data":"eb1a71418be4c383e084090dcdbe66bfe9ab929c68b106dc0659e9d31cfdbff6"} Jan 20 18:51:12 crc kubenswrapper[4773]: I0120 18:51:12.283036 4773 generic.go:334] "Generic (PLEG): container finished" podID="b8ddb59d-f815-43b0-8d46-31575ad7703f" containerID="c8549044e1dee3b38135630de293aa60fb44321f9c7d2b5586fd3a805e16bfcc" exitCode=143 Jan 20 18:51:12 crc kubenswrapper[4773]: I0120 18:51:12.283142 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b8ddb59d-f815-43b0-8d46-31575ad7703f","Type":"ContainerDied","Data":"c8549044e1dee3b38135630de293aa60fb44321f9c7d2b5586fd3a805e16bfcc"} Jan 20 18:51:12 crc kubenswrapper[4773]: I0120 
18:51:12.283408 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="770606ab-65d2-4537-a335-6953af47241a" containerName="nova-scheduler-scheduler" containerID="cri-o://d3eb5774625308b3c5c8a9b8bf8152e39b3fdb0e54c1d18646bf90ab48ebc903" gracePeriod=30 Jan 20 18:51:13 crc kubenswrapper[4773]: I0120 18:51:13.292796 4773 generic.go:334] "Generic (PLEG): container finished" podID="770606ab-65d2-4537-a335-6953af47241a" containerID="d3eb5774625308b3c5c8a9b8bf8152e39b3fdb0e54c1d18646bf90ab48ebc903" exitCode=0 Jan 20 18:51:13 crc kubenswrapper[4773]: I0120 18:51:13.292874 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"770606ab-65d2-4537-a335-6953af47241a","Type":"ContainerDied","Data":"d3eb5774625308b3c5c8a9b8bf8152e39b3fdb0e54c1d18646bf90ab48ebc903"} Jan 20 18:51:13 crc kubenswrapper[4773]: I0120 18:51:13.293859 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="e93e7391-bd5a-45f1-b0cd-15f8f39ba094" containerName="nova-metadata-log" containerID="cri-o://c84468244a5ecd4949ff3fcccf264d620bae45d62f3bb7dafe3f87f5654822b0" gracePeriod=30 Jan 20 18:51:13 crc kubenswrapper[4773]: I0120 18:51:13.294059 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="e93e7391-bd5a-45f1-b0cd-15f8f39ba094" containerName="nova-metadata-metadata" containerID="cri-o://7010035b377b66d4de67bd919469a71355b7ae52e07affd05148def6e9038f4b" gracePeriod=30 Jan 20 18:51:13 crc kubenswrapper[4773]: I0120 18:51:13.573082 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 20 18:51:13 crc kubenswrapper[4773]: I0120 18:51:13.672323 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/770606ab-65d2-4537-a335-6953af47241a-combined-ca-bundle\") pod \"770606ab-65d2-4537-a335-6953af47241a\" (UID: \"770606ab-65d2-4537-a335-6953af47241a\") " Jan 20 18:51:13 crc kubenswrapper[4773]: I0120 18:51:13.672414 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lk4pz\" (UniqueName: \"kubernetes.io/projected/770606ab-65d2-4537-a335-6953af47241a-kube-api-access-lk4pz\") pod \"770606ab-65d2-4537-a335-6953af47241a\" (UID: \"770606ab-65d2-4537-a335-6953af47241a\") " Jan 20 18:51:13 crc kubenswrapper[4773]: I0120 18:51:13.672507 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/770606ab-65d2-4537-a335-6953af47241a-config-data\") pod \"770606ab-65d2-4537-a335-6953af47241a\" (UID: \"770606ab-65d2-4537-a335-6953af47241a\") " Jan 20 18:51:13 crc kubenswrapper[4773]: I0120 18:51:13.678578 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/770606ab-65d2-4537-a335-6953af47241a-kube-api-access-lk4pz" (OuterVolumeSpecName: "kube-api-access-lk4pz") pod "770606ab-65d2-4537-a335-6953af47241a" (UID: "770606ab-65d2-4537-a335-6953af47241a"). InnerVolumeSpecName "kube-api-access-lk4pz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:51:13 crc kubenswrapper[4773]: I0120 18:51:13.699651 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/770606ab-65d2-4537-a335-6953af47241a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "770606ab-65d2-4537-a335-6953af47241a" (UID: "770606ab-65d2-4537-a335-6953af47241a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:51:13 crc kubenswrapper[4773]: I0120 18:51:13.707298 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/770606ab-65d2-4537-a335-6953af47241a-config-data" (OuterVolumeSpecName: "config-data") pod "770606ab-65d2-4537-a335-6953af47241a" (UID: "770606ab-65d2-4537-a335-6953af47241a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:51:13 crc kubenswrapper[4773]: I0120 18:51:13.715659 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-qbmt7" Jan 20 18:51:13 crc kubenswrapper[4773]: I0120 18:51:13.774657 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/770606ab-65d2-4537-a335-6953af47241a-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 18:51:13 crc kubenswrapper[4773]: I0120 18:51:13.774689 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/770606ab-65d2-4537-a335-6953af47241a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 18:51:13 crc kubenswrapper[4773]: I0120 18:51:13.774701 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lk4pz\" (UniqueName: \"kubernetes.io/projected/770606ab-65d2-4537-a335-6953af47241a-kube-api-access-lk4pz\") on node \"crc\" DevicePath \"\"" Jan 20 18:51:13 crc kubenswrapper[4773]: I0120 18:51:13.849870 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 20 18:51:13 crc kubenswrapper[4773]: I0120 18:51:13.876340 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f9293b5-8288-4a19-b3ac-03d8026dbf06-combined-ca-bundle\") pod \"9f9293b5-8288-4a19-b3ac-03d8026dbf06\" (UID: \"9f9293b5-8288-4a19-b3ac-03d8026dbf06\") " Jan 20 18:51:13 crc kubenswrapper[4773]: I0120 18:51:13.876391 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f9293b5-8288-4a19-b3ac-03d8026dbf06-config-data\") pod \"9f9293b5-8288-4a19-b3ac-03d8026dbf06\" (UID: \"9f9293b5-8288-4a19-b3ac-03d8026dbf06\") " Jan 20 18:51:13 crc kubenswrapper[4773]: I0120 18:51:13.876462 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gzbk2\" (UniqueName: \"kubernetes.io/projected/9f9293b5-8288-4a19-b3ac-03d8026dbf06-kube-api-access-gzbk2\") pod \"9f9293b5-8288-4a19-b3ac-03d8026dbf06\" (UID: \"9f9293b5-8288-4a19-b3ac-03d8026dbf06\") " Jan 20 18:51:13 crc kubenswrapper[4773]: I0120 18:51:13.876529 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9f9293b5-8288-4a19-b3ac-03d8026dbf06-scripts\") pod \"9f9293b5-8288-4a19-b3ac-03d8026dbf06\" (UID: \"9f9293b5-8288-4a19-b3ac-03d8026dbf06\") " Jan 20 18:51:13 crc kubenswrapper[4773]: I0120 18:51:13.880268 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f9293b5-8288-4a19-b3ac-03d8026dbf06-scripts" (OuterVolumeSpecName: "scripts") pod "9f9293b5-8288-4a19-b3ac-03d8026dbf06" (UID: "9f9293b5-8288-4a19-b3ac-03d8026dbf06"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:51:13 crc kubenswrapper[4773]: I0120 18:51:13.880793 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f9293b5-8288-4a19-b3ac-03d8026dbf06-kube-api-access-gzbk2" (OuterVolumeSpecName: "kube-api-access-gzbk2") pod "9f9293b5-8288-4a19-b3ac-03d8026dbf06" (UID: "9f9293b5-8288-4a19-b3ac-03d8026dbf06"). InnerVolumeSpecName "kube-api-access-gzbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:51:13 crc kubenswrapper[4773]: I0120 18:51:13.901235 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f9293b5-8288-4a19-b3ac-03d8026dbf06-config-data" (OuterVolumeSpecName: "config-data") pod "9f9293b5-8288-4a19-b3ac-03d8026dbf06" (UID: "9f9293b5-8288-4a19-b3ac-03d8026dbf06"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:51:13 crc kubenswrapper[4773]: I0120 18:51:13.902766 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f9293b5-8288-4a19-b3ac-03d8026dbf06-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9f9293b5-8288-4a19-b3ac-03d8026dbf06" (UID: "9f9293b5-8288-4a19-b3ac-03d8026dbf06"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:51:13 crc kubenswrapper[4773]: I0120 18:51:13.977888 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e93e7391-bd5a-45f1-b0cd-15f8f39ba094-logs\") pod \"e93e7391-bd5a-45f1-b0cd-15f8f39ba094\" (UID: \"e93e7391-bd5a-45f1-b0cd-15f8f39ba094\") " Jan 20 18:51:13 crc kubenswrapper[4773]: I0120 18:51:13.977984 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rwkk9\" (UniqueName: \"kubernetes.io/projected/e93e7391-bd5a-45f1-b0cd-15f8f39ba094-kube-api-access-rwkk9\") pod \"e93e7391-bd5a-45f1-b0cd-15f8f39ba094\" (UID: \"e93e7391-bd5a-45f1-b0cd-15f8f39ba094\") " Jan 20 18:51:13 crc kubenswrapper[4773]: I0120 18:51:13.978053 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e93e7391-bd5a-45f1-b0cd-15f8f39ba094-combined-ca-bundle\") pod \"e93e7391-bd5a-45f1-b0cd-15f8f39ba094\" (UID: \"e93e7391-bd5a-45f1-b0cd-15f8f39ba094\") " Jan 20 18:51:13 crc kubenswrapper[4773]: I0120 18:51:13.978126 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e93e7391-bd5a-45f1-b0cd-15f8f39ba094-config-data\") pod \"e93e7391-bd5a-45f1-b0cd-15f8f39ba094\" (UID: \"e93e7391-bd5a-45f1-b0cd-15f8f39ba094\") " Jan 20 18:51:13 crc kubenswrapper[4773]: I0120 18:51:13.978194 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e93e7391-bd5a-45f1-b0cd-15f8f39ba094-nova-metadata-tls-certs\") pod \"e93e7391-bd5a-45f1-b0cd-15f8f39ba094\" (UID: \"e93e7391-bd5a-45f1-b0cd-15f8f39ba094\") " Jan 20 18:51:13 crc kubenswrapper[4773]: I0120 18:51:13.978630 4773 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/9f9293b5-8288-4a19-b3ac-03d8026dbf06-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 18:51:13 crc kubenswrapper[4773]: I0120 18:51:13.978646 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9f9293b5-8288-4a19-b3ac-03d8026dbf06-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 18:51:13 crc kubenswrapper[4773]: I0120 18:51:13.978655 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9f9293b5-8288-4a19-b3ac-03d8026dbf06-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 18:51:13 crc kubenswrapper[4773]: I0120 18:51:13.978664 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gzbk2\" (UniqueName: \"kubernetes.io/projected/9f9293b5-8288-4a19-b3ac-03d8026dbf06-kube-api-access-gzbk2\") on node \"crc\" DevicePath \"\"" Jan 20 18:51:13 crc kubenswrapper[4773]: I0120 18:51:13.978960 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e93e7391-bd5a-45f1-b0cd-15f8f39ba094-logs" (OuterVolumeSpecName: "logs") pod "e93e7391-bd5a-45f1-b0cd-15f8f39ba094" (UID: "e93e7391-bd5a-45f1-b0cd-15f8f39ba094"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:51:13 crc kubenswrapper[4773]: I0120 18:51:13.981546 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e93e7391-bd5a-45f1-b0cd-15f8f39ba094-kube-api-access-rwkk9" (OuterVolumeSpecName: "kube-api-access-rwkk9") pod "e93e7391-bd5a-45f1-b0cd-15f8f39ba094" (UID: "e93e7391-bd5a-45f1-b0cd-15f8f39ba094"). InnerVolumeSpecName "kube-api-access-rwkk9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.002025 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e93e7391-bd5a-45f1-b0cd-15f8f39ba094-config-data" (OuterVolumeSpecName: "config-data") pod "e93e7391-bd5a-45f1-b0cd-15f8f39ba094" (UID: "e93e7391-bd5a-45f1-b0cd-15f8f39ba094"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.005282 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e93e7391-bd5a-45f1-b0cd-15f8f39ba094-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e93e7391-bd5a-45f1-b0cd-15f8f39ba094" (UID: "e93e7391-bd5a-45f1-b0cd-15f8f39ba094"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.027078 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e93e7391-bd5a-45f1-b0cd-15f8f39ba094-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "e93e7391-bd5a-45f1-b0cd-15f8f39ba094" (UID: "e93e7391-bd5a-45f1-b0cd-15f8f39ba094"). InnerVolumeSpecName "nova-metadata-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.080040 4773 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e93e7391-bd5a-45f1-b0cd-15f8f39ba094-logs\") on node \"crc\" DevicePath \"\"" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.080093 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rwkk9\" (UniqueName: \"kubernetes.io/projected/e93e7391-bd5a-45f1-b0cd-15f8f39ba094-kube-api-access-rwkk9\") on node \"crc\" DevicePath \"\"" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.080109 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e93e7391-bd5a-45f1-b0cd-15f8f39ba094-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.080120 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e93e7391-bd5a-45f1-b0cd-15f8f39ba094-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.080132 4773 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/e93e7391-bd5a-45f1-b0cd-15f8f39ba094-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.304984 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-qbmt7" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.304979 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-qbmt7" event={"ID":"9f9293b5-8288-4a19-b3ac-03d8026dbf06","Type":"ContainerDied","Data":"c2e8a7f30db310da621535c0d420a0bac1190bc2a41a699499456b5aa096a1da"} Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.305166 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c2e8a7f30db310da621535c0d420a0bac1190bc2a41a699499456b5aa096a1da" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.307622 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"770606ab-65d2-4537-a335-6953af47241a","Type":"ContainerDied","Data":"18a3bac927d957a20e3b22301a468180e092097c9b7ed08358a71dea44a2f4fb"} Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.307682 4773 scope.go:117] "RemoveContainer" containerID="d3eb5774625308b3c5c8a9b8bf8152e39b3fdb0e54c1d18646bf90ab48ebc903" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.307819 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.315437 4773 generic.go:334] "Generic (PLEG): container finished" podID="e93e7391-bd5a-45f1-b0cd-15f8f39ba094" containerID="7010035b377b66d4de67bd919469a71355b7ae52e07affd05148def6e9038f4b" exitCode=0 Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.315493 4773 generic.go:334] "Generic (PLEG): container finished" podID="e93e7391-bd5a-45f1-b0cd-15f8f39ba094" containerID="c84468244a5ecd4949ff3fcccf264d620bae45d62f3bb7dafe3f87f5654822b0" exitCode=143 Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.315518 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e93e7391-bd5a-45f1-b0cd-15f8f39ba094","Type":"ContainerDied","Data":"7010035b377b66d4de67bd919469a71355b7ae52e07affd05148def6e9038f4b"} Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.315547 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e93e7391-bd5a-45f1-b0cd-15f8f39ba094","Type":"ContainerDied","Data":"c84468244a5ecd4949ff3fcccf264d620bae45d62f3bb7dafe3f87f5654822b0"} Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.315561 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"e93e7391-bd5a-45f1-b0cd-15f8f39ba094","Type":"ContainerDied","Data":"3530bfb5061bdf8a7a68265daea004d4e32c35437c51e4fd95a729d55e76c3e7"} Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.315632 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.348759 4773 scope.go:117] "RemoveContainer" containerID="7010035b377b66d4de67bd919469a71355b7ae52e07affd05148def6e9038f4b" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.467527 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 20 18:51:14 crc kubenswrapper[4773]: E0120 18:51:14.468157 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f9293b5-8288-4a19-b3ac-03d8026dbf06" containerName="nova-cell1-conductor-db-sync" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.468175 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f9293b5-8288-4a19-b3ac-03d8026dbf06" containerName="nova-cell1-conductor-db-sync" Jan 20 18:51:14 crc kubenswrapper[4773]: E0120 18:51:14.468190 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93029dbe-6bb4-45aa-a72a-13e4ffc2537e" containerName="init" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.468196 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="93029dbe-6bb4-45aa-a72a-13e4ffc2537e" containerName="init" Jan 20 18:51:14 crc kubenswrapper[4773]: E0120 18:51:14.468206 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93029dbe-6bb4-45aa-a72a-13e4ffc2537e" containerName="dnsmasq-dns" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.468212 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="93029dbe-6bb4-45aa-a72a-13e4ffc2537e" containerName="dnsmasq-dns" Jan 20 18:51:14 crc kubenswrapper[4773]: E0120 18:51:14.468223 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e93e7391-bd5a-45f1-b0cd-15f8f39ba094" containerName="nova-metadata-metadata" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.468229 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="e93e7391-bd5a-45f1-b0cd-15f8f39ba094" containerName="nova-metadata-metadata" Jan 20 
18:51:14 crc kubenswrapper[4773]: E0120 18:51:14.468242 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e93e7391-bd5a-45f1-b0cd-15f8f39ba094" containerName="nova-metadata-log" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.468247 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="e93e7391-bd5a-45f1-b0cd-15f8f39ba094" containerName="nova-metadata-log" Jan 20 18:51:14 crc kubenswrapper[4773]: E0120 18:51:14.468259 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61de3b4b-bcb7-4521-92e6-af87d03407ee" containerName="nova-manage" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.468264 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="61de3b4b-bcb7-4521-92e6-af87d03407ee" containerName="nova-manage" Jan 20 18:51:14 crc kubenswrapper[4773]: E0120 18:51:14.468283 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="770606ab-65d2-4537-a335-6953af47241a" containerName="nova-scheduler-scheduler" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.468289 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="770606ab-65d2-4537-a335-6953af47241a" containerName="nova-scheduler-scheduler" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.468431 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="61de3b4b-bcb7-4521-92e6-af87d03407ee" containerName="nova-manage" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.468457 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f9293b5-8288-4a19-b3ac-03d8026dbf06" containerName="nova-cell1-conductor-db-sync" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.468465 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="770606ab-65d2-4537-a335-6953af47241a" containerName="nova-scheduler-scheduler" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.468480 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="93029dbe-6bb4-45aa-a72a-13e4ffc2537e" 
containerName="dnsmasq-dns" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.468495 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="e93e7391-bd5a-45f1-b0cd-15f8f39ba094" containerName="nova-metadata-log" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.468516 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="e93e7391-bd5a-45f1-b0cd-15f8f39ba094" containerName="nova-metadata-metadata" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.469081 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.477881 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.478005 4773 scope.go:117] "RemoveContainer" containerID="c84468244a5ecd4949ff3fcccf264d620bae45d62f3bb7dafe3f87f5654822b0" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.500222 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.521071 4773 scope.go:117] "RemoveContainer" containerID="7010035b377b66d4de67bd919469a71355b7ae52e07affd05148def6e9038f4b" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.521311 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 20 18:51:14 crc kubenswrapper[4773]: E0120 18:51:14.521890 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7010035b377b66d4de67bd919469a71355b7ae52e07affd05148def6e9038f4b\": container with ID starting with 7010035b377b66d4de67bd919469a71355b7ae52e07affd05148def6e9038f4b not found: ID does not exist" containerID="7010035b377b66d4de67bd919469a71355b7ae52e07affd05148def6e9038f4b" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.521916 4773 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7010035b377b66d4de67bd919469a71355b7ae52e07affd05148def6e9038f4b"} err="failed to get container status \"7010035b377b66d4de67bd919469a71355b7ae52e07affd05148def6e9038f4b\": rpc error: code = NotFound desc = could not find container \"7010035b377b66d4de67bd919469a71355b7ae52e07affd05148def6e9038f4b\": container with ID starting with 7010035b377b66d4de67bd919469a71355b7ae52e07affd05148def6e9038f4b not found: ID does not exist" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.521959 4773 scope.go:117] "RemoveContainer" containerID="c84468244a5ecd4949ff3fcccf264d620bae45d62f3bb7dafe3f87f5654822b0" Jan 20 18:51:14 crc kubenswrapper[4773]: E0120 18:51:14.522220 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c84468244a5ecd4949ff3fcccf264d620bae45d62f3bb7dafe3f87f5654822b0\": container with ID starting with c84468244a5ecd4949ff3fcccf264d620bae45d62f3bb7dafe3f87f5654822b0 not found: ID does not exist" containerID="c84468244a5ecd4949ff3fcccf264d620bae45d62f3bb7dafe3f87f5654822b0" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.522242 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c84468244a5ecd4949ff3fcccf264d620bae45d62f3bb7dafe3f87f5654822b0"} err="failed to get container status \"c84468244a5ecd4949ff3fcccf264d620bae45d62f3bb7dafe3f87f5654822b0\": rpc error: code = NotFound desc = could not find container \"c84468244a5ecd4949ff3fcccf264d620bae45d62f3bb7dafe3f87f5654822b0\": container with ID starting with c84468244a5ecd4949ff3fcccf264d620bae45d62f3bb7dafe3f87f5654822b0 not found: ID does not exist" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.522256 4773 scope.go:117] "RemoveContainer" containerID="7010035b377b66d4de67bd919469a71355b7ae52e07affd05148def6e9038f4b" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 
18:51:14.522502 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7010035b377b66d4de67bd919469a71355b7ae52e07affd05148def6e9038f4b"} err="failed to get container status \"7010035b377b66d4de67bd919469a71355b7ae52e07affd05148def6e9038f4b\": rpc error: code = NotFound desc = could not find container \"7010035b377b66d4de67bd919469a71355b7ae52e07affd05148def6e9038f4b\": container with ID starting with 7010035b377b66d4de67bd919469a71355b7ae52e07affd05148def6e9038f4b not found: ID does not exist" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.522516 4773 scope.go:117] "RemoveContainer" containerID="c84468244a5ecd4949ff3fcccf264d620bae45d62f3bb7dafe3f87f5654822b0" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.522697 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c84468244a5ecd4949ff3fcccf264d620bae45d62f3bb7dafe3f87f5654822b0"} err="failed to get container status \"c84468244a5ecd4949ff3fcccf264d620bae45d62f3bb7dafe3f87f5654822b0\": rpc error: code = NotFound desc = could not find container \"c84468244a5ecd4949ff3fcccf264d620bae45d62f3bb7dafe3f87f5654822b0\": container with ID starting with c84468244a5ecd4949ff3fcccf264d620bae45d62f3bb7dafe3f87f5654822b0 not found: ID does not exist" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.540257 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.554596 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.556490 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.560631 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.561139 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.572137 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.579165 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.596893 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rvdlt\" (UniqueName: \"kubernetes.io/projected/7970e552-0aac-436b-ba20-4810e82dcd20-kube-api-access-rvdlt\") pod \"nova-cell1-conductor-0\" (UID: \"7970e552-0aac-436b-ba20-4810e82dcd20\") " pod="openstack/nova-cell1-conductor-0" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.596981 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7970e552-0aac-436b-ba20-4810e82dcd20-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"7970e552-0aac-436b-ba20-4810e82dcd20\") " pod="openstack/nova-cell1-conductor-0" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.597004 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7970e552-0aac-436b-ba20-4810e82dcd20-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"7970e552-0aac-436b-ba20-4810e82dcd20\") " pod="openstack/nova-cell1-conductor-0" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.611335 4773 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/nova-metadata-0"] Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.613673 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.616179 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.616966 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.621113 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.698839 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a5e1be7-c022-49b6-aa10-d23451918579-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"8a5e1be7-c022-49b6-aa10-d23451918579\") " pod="openstack/nova-scheduler-0" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.698891 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rvdlt\" (UniqueName: \"kubernetes.io/projected/7970e552-0aac-436b-ba20-4810e82dcd20-kube-api-access-rvdlt\") pod \"nova-cell1-conductor-0\" (UID: \"7970e552-0aac-436b-ba20-4810e82dcd20\") " pod="openstack/nova-cell1-conductor-0" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.698944 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7970e552-0aac-436b-ba20-4810e82dcd20-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"7970e552-0aac-436b-ba20-4810e82dcd20\") " pod="openstack/nova-cell1-conductor-0" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.698966 4773 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7970e552-0aac-436b-ba20-4810e82dcd20-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"7970e552-0aac-436b-ba20-4810e82dcd20\") " pod="openstack/nova-cell1-conductor-0" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.699039 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a5e1be7-c022-49b6-aa10-d23451918579-config-data\") pod \"nova-scheduler-0\" (UID: \"8a5e1be7-c022-49b6-aa10-d23451918579\") " pod="openstack/nova-scheduler-0" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.699112 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4rnl\" (UniqueName: \"kubernetes.io/projected/8a5e1be7-c022-49b6-aa10-d23451918579-kube-api-access-n4rnl\") pod \"nova-scheduler-0\" (UID: \"8a5e1be7-c022-49b6-aa10-d23451918579\") " pod="openstack/nova-scheduler-0" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.703832 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7970e552-0aac-436b-ba20-4810e82dcd20-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"7970e552-0aac-436b-ba20-4810e82dcd20\") " pod="openstack/nova-cell1-conductor-0" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.705500 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7970e552-0aac-436b-ba20-4810e82dcd20-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"7970e552-0aac-436b-ba20-4810e82dcd20\") " pod="openstack/nova-cell1-conductor-0" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.715165 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rvdlt\" (UniqueName: 
\"kubernetes.io/projected/7970e552-0aac-436b-ba20-4810e82dcd20-kube-api-access-rvdlt\") pod \"nova-cell1-conductor-0\" (UID: \"7970e552-0aac-436b-ba20-4810e82dcd20\") " pod="openstack/nova-cell1-conductor-0" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.797531 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.800269 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4rnl\" (UniqueName: \"kubernetes.io/projected/8a5e1be7-c022-49b6-aa10-d23451918579-kube-api-access-n4rnl\") pod \"nova-scheduler-0\" (UID: \"8a5e1be7-c022-49b6-aa10-d23451918579\") " pod="openstack/nova-scheduler-0" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.800338 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb9c6096-2ce8-4b43-a638-50374d21d621-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"fb9c6096-2ce8-4b43-a638-50374d21d621\") " pod="openstack/nova-metadata-0" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.800375 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb9c6096-2ce8-4b43-a638-50374d21d621-logs\") pod \"nova-metadata-0\" (UID: \"fb9c6096-2ce8-4b43-a638-50374d21d621\") " pod="openstack/nova-metadata-0" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.800422 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a5e1be7-c022-49b6-aa10-d23451918579-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"8a5e1be7-c022-49b6-aa10-d23451918579\") " pod="openstack/nova-scheduler-0" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.800499 4773 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb9c6096-2ce8-4b43-a638-50374d21d621-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"fb9c6096-2ce8-4b43-a638-50374d21d621\") " pod="openstack/nova-metadata-0" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.800522 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb9c6096-2ce8-4b43-a638-50374d21d621-config-data\") pod \"nova-metadata-0\" (UID: \"fb9c6096-2ce8-4b43-a638-50374d21d621\") " pod="openstack/nova-metadata-0" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.800559 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a5e1be7-c022-49b6-aa10-d23451918579-config-data\") pod \"nova-scheduler-0\" (UID: \"8a5e1be7-c022-49b6-aa10-d23451918579\") " pod="openstack/nova-scheduler-0" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.800604 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tgw2z\" (UniqueName: \"kubernetes.io/projected/fb9c6096-2ce8-4b43-a638-50374d21d621-kube-api-access-tgw2z\") pod \"nova-metadata-0\" (UID: \"fb9c6096-2ce8-4b43-a638-50374d21d621\") " pod="openstack/nova-metadata-0" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.803563 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a5e1be7-c022-49b6-aa10-d23451918579-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"8a5e1be7-c022-49b6-aa10-d23451918579\") " pod="openstack/nova-scheduler-0" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.803910 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/8a5e1be7-c022-49b6-aa10-d23451918579-config-data\") pod \"nova-scheduler-0\" (UID: \"8a5e1be7-c022-49b6-aa10-d23451918579\") " pod="openstack/nova-scheduler-0" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.825177 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4rnl\" (UniqueName: \"kubernetes.io/projected/8a5e1be7-c022-49b6-aa10-d23451918579-kube-api-access-n4rnl\") pod \"nova-scheduler-0\" (UID: \"8a5e1be7-c022-49b6-aa10-d23451918579\") " pod="openstack/nova-scheduler-0" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.883164 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.903092 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb9c6096-2ce8-4b43-a638-50374d21d621-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"fb9c6096-2ce8-4b43-a638-50374d21d621\") " pod="openstack/nova-metadata-0" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.903404 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb9c6096-2ce8-4b43-a638-50374d21d621-config-data\") pod \"nova-metadata-0\" (UID: \"fb9c6096-2ce8-4b43-a638-50374d21d621\") " pod="openstack/nova-metadata-0" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.903469 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tgw2z\" (UniqueName: \"kubernetes.io/projected/fb9c6096-2ce8-4b43-a638-50374d21d621-kube-api-access-tgw2z\") pod \"nova-metadata-0\" (UID: \"fb9c6096-2ce8-4b43-a638-50374d21d621\") " pod="openstack/nova-metadata-0" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.903520 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb9c6096-2ce8-4b43-a638-50374d21d621-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"fb9c6096-2ce8-4b43-a638-50374d21d621\") " pod="openstack/nova-metadata-0" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.903545 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb9c6096-2ce8-4b43-a638-50374d21d621-logs\") pod \"nova-metadata-0\" (UID: \"fb9c6096-2ce8-4b43-a638-50374d21d621\") " pod="openstack/nova-metadata-0" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.904275 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb9c6096-2ce8-4b43-a638-50374d21d621-logs\") pod \"nova-metadata-0\" (UID: \"fb9c6096-2ce8-4b43-a638-50374d21d621\") " pod="openstack/nova-metadata-0" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.909067 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb9c6096-2ce8-4b43-a638-50374d21d621-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"fb9c6096-2ce8-4b43-a638-50374d21d621\") " pod="openstack/nova-metadata-0" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.913068 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb9c6096-2ce8-4b43-a638-50374d21d621-config-data\") pod \"nova-metadata-0\" (UID: \"fb9c6096-2ce8-4b43-a638-50374d21d621\") " pod="openstack/nova-metadata-0" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.913859 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb9c6096-2ce8-4b43-a638-50374d21d621-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"fb9c6096-2ce8-4b43-a638-50374d21d621\") " pod="openstack/nova-metadata-0" Jan 20 18:51:14 crc 
kubenswrapper[4773]: I0120 18:51:14.923220 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tgw2z\" (UniqueName: \"kubernetes.io/projected/fb9c6096-2ce8-4b43-a638-50374d21d621-kube-api-access-tgw2z\") pod \"nova-metadata-0\" (UID: \"fb9c6096-2ce8-4b43-a638-50374d21d621\") " pod="openstack/nova-metadata-0" Jan 20 18:51:14 crc kubenswrapper[4773]: I0120 18:51:14.931514 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 20 18:51:15 crc kubenswrapper[4773]: I0120 18:51:15.248806 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Jan 20 18:51:15 crc kubenswrapper[4773]: I0120 18:51:15.327750 4773 generic.go:334] "Generic (PLEG): container finished" podID="b8ddb59d-f815-43b0-8d46-31575ad7703f" containerID="df867dbb67a40515bf2e917a26b8f035a0ceeae39ce2b877b7dc330bd0b26d0e" exitCode=0 Jan 20 18:51:15 crc kubenswrapper[4773]: I0120 18:51:15.327793 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b8ddb59d-f815-43b0-8d46-31575ad7703f","Type":"ContainerDied","Data":"df867dbb67a40515bf2e917a26b8f035a0ceeae39ce2b877b7dc330bd0b26d0e"} Jan 20 18:51:15 crc kubenswrapper[4773]: I0120 18:51:15.331710 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"7970e552-0aac-436b-ba20-4810e82dcd20","Type":"ContainerStarted","Data":"2ab19617b49acc68e33070c92ef95686d716679192c7e1bb7c90563b37a8db42"} Jan 20 18:51:15 crc kubenswrapper[4773]: W0120 18:51:15.385332 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfb9c6096_2ce8_4b43_a638_50374d21d621.slice/crio-1fbd29fc0590b0cd2b67866d481ca79cacd0b7e614ff050ba774f3244fed5e4a WatchSource:0}: Error finding container 1fbd29fc0590b0cd2b67866d481ca79cacd0b7e614ff050ba774f3244fed5e4a: Status 404 returned error can't find the 
container with id 1fbd29fc0590b0cd2b67866d481ca79cacd0b7e614ff050ba774f3244fed5e4a Jan 20 18:51:15 crc kubenswrapper[4773]: I0120 18:51:15.387803 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 20 18:51:15 crc kubenswrapper[4773]: I0120 18:51:15.465389 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="770606ab-65d2-4537-a335-6953af47241a" path="/var/lib/kubelet/pods/770606ab-65d2-4537-a335-6953af47241a/volumes" Jan 20 18:51:15 crc kubenswrapper[4773]: I0120 18:51:15.466175 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e93e7391-bd5a-45f1-b0cd-15f8f39ba094" path="/var/lib/kubelet/pods/e93e7391-bd5a-45f1-b0cd-15f8f39ba094/volumes" Jan 20 18:51:15 crc kubenswrapper[4773]: I0120 18:51:15.466889 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 20 18:51:16 crc kubenswrapper[4773]: I0120 18:51:16.286605 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 20 18:51:16 crc kubenswrapper[4773]: I0120 18:51:16.341196 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"b8ddb59d-f815-43b0-8d46-31575ad7703f","Type":"ContainerDied","Data":"2a35ed65f92f9e8ee824da1249f99dade421d53e6d45bbd8bdafdaa51537c79e"} Jan 20 18:51:16 crc kubenswrapper[4773]: I0120 18:51:16.341232 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 20 18:51:16 crc kubenswrapper[4773]: I0120 18:51:16.341261 4773 scope.go:117] "RemoveContainer" containerID="df867dbb67a40515bf2e917a26b8f035a0ceeae39ce2b877b7dc330bd0b26d0e" Jan 20 18:51:16 crc kubenswrapper[4773]: I0120 18:51:16.343631 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fb9c6096-2ce8-4b43-a638-50374d21d621","Type":"ContainerStarted","Data":"1fbd29fc0590b0cd2b67866d481ca79cacd0b7e614ff050ba774f3244fed5e4a"} Jan 20 18:51:16 crc kubenswrapper[4773]: I0120 18:51:16.345528 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"7970e552-0aac-436b-ba20-4810e82dcd20","Type":"ContainerStarted","Data":"6c46258b994452cf856ed0515ba72c98f2dc18e32acd58b9888553b3e54d162a"} Jan 20 18:51:16 crc kubenswrapper[4773]: I0120 18:51:16.346409 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"8a5e1be7-c022-49b6-aa10-d23451918579","Type":"ContainerStarted","Data":"220ce0272e6c403af39eafd69462c79884cab6d380e200929ec876ec12a03a06"} Jan 20 18:51:16 crc kubenswrapper[4773]: I0120 18:51:16.362280 4773 scope.go:117] "RemoveContainer" containerID="c8549044e1dee3b38135630de293aa60fb44321f9c7d2b5586fd3a805e16bfcc" Jan 20 18:51:16 crc kubenswrapper[4773]: I0120 18:51:16.426510 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xkqxk\" (UniqueName: \"kubernetes.io/projected/b8ddb59d-f815-43b0-8d46-31575ad7703f-kube-api-access-xkqxk\") pod \"b8ddb59d-f815-43b0-8d46-31575ad7703f\" (UID: \"b8ddb59d-f815-43b0-8d46-31575ad7703f\") " Jan 20 18:51:16 crc kubenswrapper[4773]: I0120 18:51:16.426552 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b8ddb59d-f815-43b0-8d46-31575ad7703f-logs\") pod \"b8ddb59d-f815-43b0-8d46-31575ad7703f\" (UID: 
\"b8ddb59d-f815-43b0-8d46-31575ad7703f\") " Jan 20 18:51:16 crc kubenswrapper[4773]: I0120 18:51:16.426584 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8ddb59d-f815-43b0-8d46-31575ad7703f-combined-ca-bundle\") pod \"b8ddb59d-f815-43b0-8d46-31575ad7703f\" (UID: \"b8ddb59d-f815-43b0-8d46-31575ad7703f\") " Jan 20 18:51:16 crc kubenswrapper[4773]: I0120 18:51:16.426627 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8ddb59d-f815-43b0-8d46-31575ad7703f-config-data\") pod \"b8ddb59d-f815-43b0-8d46-31575ad7703f\" (UID: \"b8ddb59d-f815-43b0-8d46-31575ad7703f\") " Jan 20 18:51:16 crc kubenswrapper[4773]: I0120 18:51:16.427845 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b8ddb59d-f815-43b0-8d46-31575ad7703f-logs" (OuterVolumeSpecName: "logs") pod "b8ddb59d-f815-43b0-8d46-31575ad7703f" (UID: "b8ddb59d-f815-43b0-8d46-31575ad7703f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:51:16 crc kubenswrapper[4773]: I0120 18:51:16.430876 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8ddb59d-f815-43b0-8d46-31575ad7703f-kube-api-access-xkqxk" (OuterVolumeSpecName: "kube-api-access-xkqxk") pod "b8ddb59d-f815-43b0-8d46-31575ad7703f" (UID: "b8ddb59d-f815-43b0-8d46-31575ad7703f"). InnerVolumeSpecName "kube-api-access-xkqxk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:51:16 crc kubenswrapper[4773]: I0120 18:51:16.457661 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8ddb59d-f815-43b0-8d46-31575ad7703f-config-data" (OuterVolumeSpecName: "config-data") pod "b8ddb59d-f815-43b0-8d46-31575ad7703f" (UID: "b8ddb59d-f815-43b0-8d46-31575ad7703f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:51:16 crc kubenswrapper[4773]: I0120 18:51:16.462701 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8ddb59d-f815-43b0-8d46-31575ad7703f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b8ddb59d-f815-43b0-8d46-31575ad7703f" (UID: "b8ddb59d-f815-43b0-8d46-31575ad7703f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:51:16 crc kubenswrapper[4773]: I0120 18:51:16.528636 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xkqxk\" (UniqueName: \"kubernetes.io/projected/b8ddb59d-f815-43b0-8d46-31575ad7703f-kube-api-access-xkqxk\") on node \"crc\" DevicePath \"\"" Jan 20 18:51:16 crc kubenswrapper[4773]: I0120 18:51:16.528678 4773 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b8ddb59d-f815-43b0-8d46-31575ad7703f-logs\") on node \"crc\" DevicePath \"\"" Jan 20 18:51:16 crc kubenswrapper[4773]: I0120 18:51:16.528691 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b8ddb59d-f815-43b0-8d46-31575ad7703f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 18:51:16 crc kubenswrapper[4773]: I0120 18:51:16.528708 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b8ddb59d-f815-43b0-8d46-31575ad7703f-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 18:51:16 crc kubenswrapper[4773]: I0120 18:51:16.620374 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Jan 20 18:51:16 crc kubenswrapper[4773]: I0120 18:51:16.718043 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 20 18:51:16 crc kubenswrapper[4773]: I0120 18:51:16.733158 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/nova-api-0"] Jan 20 18:51:16 crc kubenswrapper[4773]: I0120 18:51:16.741742 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Jan 20 18:51:16 crc kubenswrapper[4773]: E0120 18:51:16.742349 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8ddb59d-f815-43b0-8d46-31575ad7703f" containerName="nova-api-api" Jan 20 18:51:16 crc kubenswrapper[4773]: I0120 18:51:16.742376 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8ddb59d-f815-43b0-8d46-31575ad7703f" containerName="nova-api-api" Jan 20 18:51:16 crc kubenswrapper[4773]: E0120 18:51:16.742403 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8ddb59d-f815-43b0-8d46-31575ad7703f" containerName="nova-api-log" Jan 20 18:51:16 crc kubenswrapper[4773]: I0120 18:51:16.742412 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8ddb59d-f815-43b0-8d46-31575ad7703f" containerName="nova-api-log" Jan 20 18:51:16 crc kubenswrapper[4773]: I0120 18:51:16.742695 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8ddb59d-f815-43b0-8d46-31575ad7703f" containerName="nova-api-log" Jan 20 18:51:16 crc kubenswrapper[4773]: I0120 18:51:16.742719 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8ddb59d-f815-43b0-8d46-31575ad7703f" containerName="nova-api-api" Jan 20 18:51:16 crc kubenswrapper[4773]: I0120 18:51:16.747340 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 20 18:51:16 crc kubenswrapper[4773]: I0120 18:51:16.751187 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 20 18:51:16 crc kubenswrapper[4773]: I0120 18:51:16.753036 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jan 20 18:51:16 crc kubenswrapper[4773]: I0120 18:51:16.833223 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79f3317e-9e4d-442d-a5b2-d9633262f332-config-data\") pod \"nova-api-0\" (UID: \"79f3317e-9e4d-442d-a5b2-d9633262f332\") " pod="openstack/nova-api-0" Jan 20 18:51:16 crc kubenswrapper[4773]: I0120 18:51:16.833262 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79f3317e-9e4d-442d-a5b2-d9633262f332-logs\") pod \"nova-api-0\" (UID: \"79f3317e-9e4d-442d-a5b2-d9633262f332\") " pod="openstack/nova-api-0" Jan 20 18:51:16 crc kubenswrapper[4773]: I0120 18:51:16.833317 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smh5x\" (UniqueName: \"kubernetes.io/projected/79f3317e-9e4d-442d-a5b2-d9633262f332-kube-api-access-smh5x\") pod \"nova-api-0\" (UID: \"79f3317e-9e4d-442d-a5b2-d9633262f332\") " pod="openstack/nova-api-0" Jan 20 18:51:16 crc kubenswrapper[4773]: I0120 18:51:16.833338 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79f3317e-9e4d-442d-a5b2-d9633262f332-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"79f3317e-9e4d-442d-a5b2-d9633262f332\") " pod="openstack/nova-api-0" Jan 20 18:51:16 crc kubenswrapper[4773]: I0120 18:51:16.934817 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/79f3317e-9e4d-442d-a5b2-d9633262f332-config-data\") pod \"nova-api-0\" (UID: \"79f3317e-9e4d-442d-a5b2-d9633262f332\") " pod="openstack/nova-api-0" Jan 20 18:51:16 crc kubenswrapper[4773]: I0120 18:51:16.934864 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79f3317e-9e4d-442d-a5b2-d9633262f332-logs\") pod \"nova-api-0\" (UID: \"79f3317e-9e4d-442d-a5b2-d9633262f332\") " pod="openstack/nova-api-0" Jan 20 18:51:16 crc kubenswrapper[4773]: I0120 18:51:16.934989 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-smh5x\" (UniqueName: \"kubernetes.io/projected/79f3317e-9e4d-442d-a5b2-d9633262f332-kube-api-access-smh5x\") pod \"nova-api-0\" (UID: \"79f3317e-9e4d-442d-a5b2-d9633262f332\") " pod="openstack/nova-api-0" Jan 20 18:51:16 crc kubenswrapper[4773]: I0120 18:51:16.935015 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79f3317e-9e4d-442d-a5b2-d9633262f332-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"79f3317e-9e4d-442d-a5b2-d9633262f332\") " pod="openstack/nova-api-0" Jan 20 18:51:16 crc kubenswrapper[4773]: I0120 18:51:16.935727 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79f3317e-9e4d-442d-a5b2-d9633262f332-logs\") pod \"nova-api-0\" (UID: \"79f3317e-9e4d-442d-a5b2-d9633262f332\") " pod="openstack/nova-api-0" Jan 20 18:51:16 crc kubenswrapper[4773]: I0120 18:51:16.943602 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79f3317e-9e4d-442d-a5b2-d9633262f332-config-data\") pod \"nova-api-0\" (UID: \"79f3317e-9e4d-442d-a5b2-d9633262f332\") " pod="openstack/nova-api-0" Jan 20 18:51:16 crc kubenswrapper[4773]: I0120 18:51:16.944397 4773 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79f3317e-9e4d-442d-a5b2-d9633262f332-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"79f3317e-9e4d-442d-a5b2-d9633262f332\") " pod="openstack/nova-api-0" Jan 20 18:51:16 crc kubenswrapper[4773]: I0120 18:51:16.956113 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-smh5x\" (UniqueName: \"kubernetes.io/projected/79f3317e-9e4d-442d-a5b2-d9633262f332-kube-api-access-smh5x\") pod \"nova-api-0\" (UID: \"79f3317e-9e4d-442d-a5b2-d9633262f332\") " pod="openstack/nova-api-0" Jan 20 18:51:17 crc kubenswrapper[4773]: I0120 18:51:17.066215 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 20 18:51:17 crc kubenswrapper[4773]: I0120 18:51:17.359163 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fb9c6096-2ce8-4b43-a638-50374d21d621","Type":"ContainerStarted","Data":"f9cd4e24c3b3532622a9b2a6bbadad7f6004a3ed29c6f5eb40b55e5b14709d14"} Jan 20 18:51:17 crc kubenswrapper[4773]: I0120 18:51:17.359206 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fb9c6096-2ce8-4b43-a638-50374d21d621","Type":"ContainerStarted","Data":"e7162f306d650a9c27fede8c9f54e40ee9d7eb098e20b1324e1902c7c106c43d"} Jan 20 18:51:17 crc kubenswrapper[4773]: I0120 18:51:17.363307 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"8a5e1be7-c022-49b6-aa10-d23451918579","Type":"ContainerStarted","Data":"52d885f9b8206758ecad4b33ca14687b40c94ac98c6714194249d8db0d6df79b"} Jan 20 18:51:17 crc kubenswrapper[4773]: I0120 18:51:17.366312 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Jan 20 18:51:17 crc kubenswrapper[4773]: I0120 18:51:17.387145 4773 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.387120615 podStartE2EDuration="3.387120615s" podCreationTimestamp="2026-01-20 18:51:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:51:17.381782037 +0000 UTC m=+1270.303595061" watchObservedRunningTime="2026-01-20 18:51:17.387120615 +0000 UTC m=+1270.308933639" Jan 20 18:51:17 crc kubenswrapper[4773]: I0120 18:51:17.401273 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.401230892 podStartE2EDuration="3.401230892s" podCreationTimestamp="2026-01-20 18:51:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:51:17.399004589 +0000 UTC m=+1270.320817633" watchObservedRunningTime="2026-01-20 18:51:17.401230892 +0000 UTC m=+1270.323043916" Jan 20 18:51:17 crc kubenswrapper[4773]: I0120 18:51:17.422874 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=3.422847219 podStartE2EDuration="3.422847219s" podCreationTimestamp="2026-01-20 18:51:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:51:17.414663833 +0000 UTC m=+1270.336476877" watchObservedRunningTime="2026-01-20 18:51:17.422847219 +0000 UTC m=+1270.344660243" Jan 20 18:51:17 crc kubenswrapper[4773]: I0120 18:51:17.457867 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8ddb59d-f815-43b0-8d46-31575ad7703f" path="/var/lib/kubelet/pods/b8ddb59d-f815-43b0-8d46-31575ad7703f/volumes" Jan 20 18:51:17 crc kubenswrapper[4773]: I0120 18:51:17.520213 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 20 18:51:17 crc 
kubenswrapper[4773]: W0120 18:51:17.528571 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod79f3317e_9e4d_442d_a5b2_d9633262f332.slice/crio-4073b0927686969a0c5927a5f8897ae5c028636891b4d7b257d94400258ebb80 WatchSource:0}: Error finding container 4073b0927686969a0c5927a5f8897ae5c028636891b4d7b257d94400258ebb80: Status 404 returned error can't find the container with id 4073b0927686969a0c5927a5f8897ae5c028636891b4d7b257d94400258ebb80 Jan 20 18:51:18 crc kubenswrapper[4773]: I0120 18:51:18.385643 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"79f3317e-9e4d-442d-a5b2-d9633262f332","Type":"ContainerStarted","Data":"b5a704a91d916bab76c0bb6a0b9b2f69a36e06e85e9f34c52f172ada992c6f25"} Jan 20 18:51:18 crc kubenswrapper[4773]: I0120 18:51:18.385998 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"79f3317e-9e4d-442d-a5b2-d9633262f332","Type":"ContainerStarted","Data":"4073b0927686969a0c5927a5f8897ae5c028636891b4d7b257d94400258ebb80"} Jan 20 18:51:19 crc kubenswrapper[4773]: I0120 18:51:19.394777 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"79f3317e-9e4d-442d-a5b2-d9633262f332","Type":"ContainerStarted","Data":"f3ba4f9c861ea417ded7870354bf000e38b9d7e757013613b836d589b01ab041"} Jan 20 18:51:19 crc kubenswrapper[4773]: I0120 18:51:19.431496 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.431474128 podStartE2EDuration="3.431474128s" podCreationTimestamp="2026-01-20 18:51:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:51:19.416706134 +0000 UTC m=+1272.338519158" watchObservedRunningTime="2026-01-20 18:51:19.431474128 +0000 UTC m=+1272.353287142" Jan 20 18:51:19 crc 
kubenswrapper[4773]: I0120 18:51:19.439872 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 20 18:51:19 crc kubenswrapper[4773]: I0120 18:51:19.440088 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="d3923c4a-e72e-4b8e-bdd8-c8f2b7c443ae" containerName="kube-state-metrics" containerID="cri-o://0bdff56d2b0b1edc8aba0e714c028d481d32cddd81d130ee599836ec70c75382" gracePeriod=30 Jan 20 18:51:19 crc kubenswrapper[4773]: I0120 18:51:19.883814 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Jan 20 18:51:19 crc kubenswrapper[4773]: I0120 18:51:19.921908 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 20 18:51:19 crc kubenswrapper[4773]: I0120 18:51:19.931695 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 20 18:51:19 crc kubenswrapper[4773]: I0120 18:51:19.931747 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Jan 20 18:51:19 crc kubenswrapper[4773]: I0120 18:51:19.987790 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jjqgh\" (UniqueName: \"kubernetes.io/projected/d3923c4a-e72e-4b8e-bdd8-c8f2b7c443ae-kube-api-access-jjqgh\") pod \"d3923c4a-e72e-4b8e-bdd8-c8f2b7c443ae\" (UID: \"d3923c4a-e72e-4b8e-bdd8-c8f2b7c443ae\") " Jan 20 18:51:19 crc kubenswrapper[4773]: I0120 18:51:19.993826 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3923c4a-e72e-4b8e-bdd8-c8f2b7c443ae-kube-api-access-jjqgh" (OuterVolumeSpecName: "kube-api-access-jjqgh") pod "d3923c4a-e72e-4b8e-bdd8-c8f2b7c443ae" (UID: "d3923c4a-e72e-4b8e-bdd8-c8f2b7c443ae"). InnerVolumeSpecName "kube-api-access-jjqgh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:51:20 crc kubenswrapper[4773]: I0120 18:51:20.090363 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jjqgh\" (UniqueName: \"kubernetes.io/projected/d3923c4a-e72e-4b8e-bdd8-c8f2b7c443ae-kube-api-access-jjqgh\") on node \"crc\" DevicePath \"\"" Jan 20 18:51:20 crc kubenswrapper[4773]: I0120 18:51:20.403502 4773 generic.go:334] "Generic (PLEG): container finished" podID="d3923c4a-e72e-4b8e-bdd8-c8f2b7c443ae" containerID="0bdff56d2b0b1edc8aba0e714c028d481d32cddd81d130ee599836ec70c75382" exitCode=2 Jan 20 18:51:20 crc kubenswrapper[4773]: I0120 18:51:20.403571 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 20 18:51:20 crc kubenswrapper[4773]: I0120 18:51:20.403561 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"d3923c4a-e72e-4b8e-bdd8-c8f2b7c443ae","Type":"ContainerDied","Data":"0bdff56d2b0b1edc8aba0e714c028d481d32cddd81d130ee599836ec70c75382"} Jan 20 18:51:20 crc kubenswrapper[4773]: I0120 18:51:20.403892 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"d3923c4a-e72e-4b8e-bdd8-c8f2b7c443ae","Type":"ContainerDied","Data":"381de174237c80b24a95594eb30259e92e84f6fa102ffa5688eefcf07e0ea711"} Jan 20 18:51:20 crc kubenswrapper[4773]: I0120 18:51:20.403916 4773 scope.go:117] "RemoveContainer" containerID="0bdff56d2b0b1edc8aba0e714c028d481d32cddd81d130ee599836ec70c75382" Jan 20 18:51:20 crc kubenswrapper[4773]: I0120 18:51:20.434385 4773 scope.go:117] "RemoveContainer" containerID="0bdff56d2b0b1edc8aba0e714c028d481d32cddd81d130ee599836ec70c75382" Jan 20 18:51:20 crc kubenswrapper[4773]: E0120 18:51:20.435249 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"0bdff56d2b0b1edc8aba0e714c028d481d32cddd81d130ee599836ec70c75382\": container with ID starting with 0bdff56d2b0b1edc8aba0e714c028d481d32cddd81d130ee599836ec70c75382 not found: ID does not exist" containerID="0bdff56d2b0b1edc8aba0e714c028d481d32cddd81d130ee599836ec70c75382" Jan 20 18:51:20 crc kubenswrapper[4773]: I0120 18:51:20.435285 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0bdff56d2b0b1edc8aba0e714c028d481d32cddd81d130ee599836ec70c75382"} err="failed to get container status \"0bdff56d2b0b1edc8aba0e714c028d481d32cddd81d130ee599836ec70c75382\": rpc error: code = NotFound desc = could not find container \"0bdff56d2b0b1edc8aba0e714c028d481d32cddd81d130ee599836ec70c75382\": container with ID starting with 0bdff56d2b0b1edc8aba0e714c028d481d32cddd81d130ee599836ec70c75382 not found: ID does not exist" Jan 20 18:51:20 crc kubenswrapper[4773]: I0120 18:51:20.435998 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 20 18:51:20 crc kubenswrapper[4773]: I0120 18:51:20.436254 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="efb19124-8662-46a5-8fb4-7fbaeba8885a" containerName="ceilometer-central-agent" containerID="cri-o://9d95ab46285f54702bbd86cfcc2a82ccf6dffd16c89bceb57cb3b449d92ca086" gracePeriod=30 Jan 20 18:51:20 crc kubenswrapper[4773]: I0120 18:51:20.436566 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="efb19124-8662-46a5-8fb4-7fbaeba8885a" containerName="proxy-httpd" containerID="cri-o://4f0d82a5c4812122ac869b3fce3bb56530dce291121a86af809addd34c660b7b" gracePeriod=30 Jan 20 18:51:20 crc kubenswrapper[4773]: I0120 18:51:20.436610 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="efb19124-8662-46a5-8fb4-7fbaeba8885a" containerName="sg-core" 
containerID="cri-o://1bf3529b66753bd46397574712a543ebfe5f08719dd9776591973ff48295495c" gracePeriod=30 Jan 20 18:51:20 crc kubenswrapper[4773]: I0120 18:51:20.436656 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="efb19124-8662-46a5-8fb4-7fbaeba8885a" containerName="ceilometer-notification-agent" containerID="cri-o://331cc9c4bf5916907acc17301a73252c053b2c503989e99a3181ee9b8431d643" gracePeriod=30 Jan 20 18:51:20 crc kubenswrapper[4773]: I0120 18:51:20.470421 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 20 18:51:20 crc kubenswrapper[4773]: I0120 18:51:20.486189 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 20 18:51:20 crc kubenswrapper[4773]: I0120 18:51:20.494698 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Jan 20 18:51:20 crc kubenswrapper[4773]: E0120 18:51:20.495215 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3923c4a-e72e-4b8e-bdd8-c8f2b7c443ae" containerName="kube-state-metrics" Jan 20 18:51:20 crc kubenswrapper[4773]: I0120 18:51:20.495236 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3923c4a-e72e-4b8e-bdd8-c8f2b7c443ae" containerName="kube-state-metrics" Jan 20 18:51:20 crc kubenswrapper[4773]: I0120 18:51:20.495463 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3923c4a-e72e-4b8e-bdd8-c8f2b7c443ae" containerName="kube-state-metrics" Jan 20 18:51:20 crc kubenswrapper[4773]: I0120 18:51:20.496088 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 20 18:51:20 crc kubenswrapper[4773]: I0120 18:51:20.502020 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 20 18:51:20 crc kubenswrapper[4773]: I0120 18:51:20.514968 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Jan 20 18:51:20 crc kubenswrapper[4773]: I0120 18:51:20.514968 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Jan 20 18:51:20 crc kubenswrapper[4773]: I0120 18:51:20.597430 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e2f1ada-ddef-454d-bdb7-fd695ee8f4ea-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"7e2f1ada-ddef-454d-bdb7-fd695ee8f4ea\") " pod="openstack/kube-state-metrics-0" Jan 20 18:51:20 crc kubenswrapper[4773]: I0120 18:51:20.597539 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/7e2f1ada-ddef-454d-bdb7-fd695ee8f4ea-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"7e2f1ada-ddef-454d-bdb7-fd695ee8f4ea\") " pod="openstack/kube-state-metrics-0" Jan 20 18:51:20 crc kubenswrapper[4773]: I0120 18:51:20.597562 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6rw2t\" (UniqueName: \"kubernetes.io/projected/7e2f1ada-ddef-454d-bdb7-fd695ee8f4ea-kube-api-access-6rw2t\") pod \"kube-state-metrics-0\" (UID: \"7e2f1ada-ddef-454d-bdb7-fd695ee8f4ea\") " pod="openstack/kube-state-metrics-0" Jan 20 18:51:20 crc kubenswrapper[4773]: I0120 18:51:20.597590 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/7e2f1ada-ddef-454d-bdb7-fd695ee8f4ea-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"7e2f1ada-ddef-454d-bdb7-fd695ee8f4ea\") " pod="openstack/kube-state-metrics-0" Jan 20 18:51:20 crc kubenswrapper[4773]: I0120 18:51:20.698735 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e2f1ada-ddef-454d-bdb7-fd695ee8f4ea-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"7e2f1ada-ddef-454d-bdb7-fd695ee8f4ea\") " pod="openstack/kube-state-metrics-0" Jan 20 18:51:20 crc kubenswrapper[4773]: I0120 18:51:20.698873 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/7e2f1ada-ddef-454d-bdb7-fd695ee8f4ea-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"7e2f1ada-ddef-454d-bdb7-fd695ee8f4ea\") " pod="openstack/kube-state-metrics-0" Jan 20 18:51:20 crc kubenswrapper[4773]: I0120 18:51:20.698910 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6rw2t\" (UniqueName: \"kubernetes.io/projected/7e2f1ada-ddef-454d-bdb7-fd695ee8f4ea-kube-api-access-6rw2t\") pod \"kube-state-metrics-0\" (UID: \"7e2f1ada-ddef-454d-bdb7-fd695ee8f4ea\") " pod="openstack/kube-state-metrics-0" Jan 20 18:51:20 crc kubenswrapper[4773]: I0120 18:51:20.698974 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7e2f1ada-ddef-454d-bdb7-fd695ee8f4ea-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"7e2f1ada-ddef-454d-bdb7-fd695ee8f4ea\") " pod="openstack/kube-state-metrics-0" Jan 20 18:51:20 crc kubenswrapper[4773]: I0120 18:51:20.703692 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/7e2f1ada-ddef-454d-bdb7-fd695ee8f4ea-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"7e2f1ada-ddef-454d-bdb7-fd695ee8f4ea\") " pod="openstack/kube-state-metrics-0" Jan 20 18:51:20 crc kubenswrapper[4773]: I0120 18:51:20.703717 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/7e2f1ada-ddef-454d-bdb7-fd695ee8f4ea-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"7e2f1ada-ddef-454d-bdb7-fd695ee8f4ea\") " pod="openstack/kube-state-metrics-0" Jan 20 18:51:20 crc kubenswrapper[4773]: I0120 18:51:20.704326 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/7e2f1ada-ddef-454d-bdb7-fd695ee8f4ea-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"7e2f1ada-ddef-454d-bdb7-fd695ee8f4ea\") " pod="openstack/kube-state-metrics-0" Jan 20 18:51:20 crc kubenswrapper[4773]: I0120 18:51:20.717589 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6rw2t\" (UniqueName: \"kubernetes.io/projected/7e2f1ada-ddef-454d-bdb7-fd695ee8f4ea-kube-api-access-6rw2t\") pod \"kube-state-metrics-0\" (UID: \"7e2f1ada-ddef-454d-bdb7-fd695ee8f4ea\") " pod="openstack/kube-state-metrics-0" Jan 20 18:51:20 crc kubenswrapper[4773]: I0120 18:51:20.834900 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Jan 20 18:51:21 crc kubenswrapper[4773]: I0120 18:51:21.309893 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Jan 20 18:51:21 crc kubenswrapper[4773]: W0120 18:51:21.323761 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7e2f1ada_ddef_454d_bdb7_fd695ee8f4ea.slice/crio-c9f8af5b58603c6582bf150d40caa0ee7feacc8f58bafc6f25940a83a573abde WatchSource:0}: Error finding container c9f8af5b58603c6582bf150d40caa0ee7feacc8f58bafc6f25940a83a573abde: Status 404 returned error can't find the container with id c9f8af5b58603c6582bf150d40caa0ee7feacc8f58bafc6f25940a83a573abde Jan 20 18:51:21 crc kubenswrapper[4773]: I0120 18:51:21.414212 4773 generic.go:334] "Generic (PLEG): container finished" podID="efb19124-8662-46a5-8fb4-7fbaeba8885a" containerID="4f0d82a5c4812122ac869b3fce3bb56530dce291121a86af809addd34c660b7b" exitCode=0 Jan 20 18:51:21 crc kubenswrapper[4773]: I0120 18:51:21.414241 4773 generic.go:334] "Generic (PLEG): container finished" podID="efb19124-8662-46a5-8fb4-7fbaeba8885a" containerID="1bf3529b66753bd46397574712a543ebfe5f08719dd9776591973ff48295495c" exitCode=2 Jan 20 18:51:21 crc kubenswrapper[4773]: I0120 18:51:21.414249 4773 generic.go:334] "Generic (PLEG): container finished" podID="efb19124-8662-46a5-8fb4-7fbaeba8885a" containerID="9d95ab46285f54702bbd86cfcc2a82ccf6dffd16c89bceb57cb3b449d92ca086" exitCode=0 Jan 20 18:51:21 crc kubenswrapper[4773]: I0120 18:51:21.414285 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"efb19124-8662-46a5-8fb4-7fbaeba8885a","Type":"ContainerDied","Data":"4f0d82a5c4812122ac869b3fce3bb56530dce291121a86af809addd34c660b7b"} Jan 20 18:51:21 crc kubenswrapper[4773]: I0120 18:51:21.414477 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"efb19124-8662-46a5-8fb4-7fbaeba8885a","Type":"ContainerDied","Data":"1bf3529b66753bd46397574712a543ebfe5f08719dd9776591973ff48295495c"} Jan 20 18:51:21 crc kubenswrapper[4773]: I0120 18:51:21.414492 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"efb19124-8662-46a5-8fb4-7fbaeba8885a","Type":"ContainerDied","Data":"9d95ab46285f54702bbd86cfcc2a82ccf6dffd16c89bceb57cb3b449d92ca086"} Jan 20 18:51:21 crc kubenswrapper[4773]: I0120 18:51:21.415591 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"7e2f1ada-ddef-454d-bdb7-fd695ee8f4ea","Type":"ContainerStarted","Data":"c9f8af5b58603c6582bf150d40caa0ee7feacc8f58bafc6f25940a83a573abde"} Jan 20 18:51:21 crc kubenswrapper[4773]: I0120 18:51:21.457176 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3923c4a-e72e-4b8e-bdd8-c8f2b7c443ae" path="/var/lib/kubelet/pods/d3923c4a-e72e-4b8e-bdd8-c8f2b7c443ae/volumes" Jan 20 18:51:22 crc kubenswrapper[4773]: I0120 18:51:22.426460 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"7e2f1ada-ddef-454d-bdb7-fd695ee8f4ea","Type":"ContainerStarted","Data":"569a5f5031a6b3558413fdead634a21c6b98e859bb254fb92034bc49c971f93b"} Jan 20 18:51:23 crc kubenswrapper[4773]: I0120 18:51:23.433581 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Jan 20 18:51:23 crc kubenswrapper[4773]: I0120 18:51:23.461106 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=2.811110049 podStartE2EDuration="3.461086931s" podCreationTimestamp="2026-01-20 18:51:20 +0000 UTC" firstStartedPulling="2026-01-20 18:51:21.326125419 +0000 UTC m=+1274.247938443" lastFinishedPulling="2026-01-20 18:51:21.976102301 +0000 UTC m=+1274.897915325" observedRunningTime="2026-01-20 18:51:23.450558088 +0000 UTC 
m=+1276.372371112" watchObservedRunningTime="2026-01-20 18:51:23.461086931 +0000 UTC m=+1276.382899965" Jan 20 18:51:24 crc kubenswrapper[4773]: I0120 18:51:24.824321 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Jan 20 18:51:24 crc kubenswrapper[4773]: I0120 18:51:24.884189 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Jan 20 18:51:24 crc kubenswrapper[4773]: I0120 18:51:24.910787 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Jan 20 18:51:24 crc kubenswrapper[4773]: I0120 18:51:24.932014 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 20 18:51:24 crc kubenswrapper[4773]: I0120 18:51:24.932082 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.434531 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.473826 4773 generic.go:334] "Generic (PLEG): container finished" podID="efb19124-8662-46a5-8fb4-7fbaeba8885a" containerID="331cc9c4bf5916907acc17301a73252c053b2c503989e99a3181ee9b8431d643" exitCode=0 Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.474650 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.474833 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"efb19124-8662-46a5-8fb4-7fbaeba8885a","Type":"ContainerDied","Data":"331cc9c4bf5916907acc17301a73252c053b2c503989e99a3181ee9b8431d643"} Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.474857 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"efb19124-8662-46a5-8fb4-7fbaeba8885a","Type":"ContainerDied","Data":"54c5a63008f7d55fe728b88afe044f2f142c0ed42ff2bb1fe5fb615542c76281"} Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.474872 4773 scope.go:117] "RemoveContainer" containerID="4f0d82a5c4812122ac869b3fce3bb56530dce291121a86af809addd34c660b7b" Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.533725 4773 scope.go:117] "RemoveContainer" containerID="1bf3529b66753bd46397574712a543ebfe5f08719dd9776591973ff48295495c" Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.534201 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.575607 4773 scope.go:117] "RemoveContainer" containerID="331cc9c4bf5916907acc17301a73252c053b2c503989e99a3181ee9b8431d643" Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.585570 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/efb19124-8662-46a5-8fb4-7fbaeba8885a-config-data\") pod \"efb19124-8662-46a5-8fb4-7fbaeba8885a\" (UID: \"efb19124-8662-46a5-8fb4-7fbaeba8885a\") " Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.585711 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/efb19124-8662-46a5-8fb4-7fbaeba8885a-run-httpd\") pod \"efb19124-8662-46a5-8fb4-7fbaeba8885a\" (UID: 
\"efb19124-8662-46a5-8fb4-7fbaeba8885a\") " Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.585760 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efb19124-8662-46a5-8fb4-7fbaeba8885a-combined-ca-bundle\") pod \"efb19124-8662-46a5-8fb4-7fbaeba8885a\" (UID: \"efb19124-8662-46a5-8fb4-7fbaeba8885a\") " Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.585888 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/efb19124-8662-46a5-8fb4-7fbaeba8885a-scripts\") pod \"efb19124-8662-46a5-8fb4-7fbaeba8885a\" (UID: \"efb19124-8662-46a5-8fb4-7fbaeba8885a\") " Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.586257 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/efb19124-8662-46a5-8fb4-7fbaeba8885a-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "efb19124-8662-46a5-8fb4-7fbaeba8885a" (UID: "efb19124-8662-46a5-8fb4-7fbaeba8885a"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.586550 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/efb19124-8662-46a5-8fb4-7fbaeba8885a-sg-core-conf-yaml\") pod \"efb19124-8662-46a5-8fb4-7fbaeba8885a\" (UID: \"efb19124-8662-46a5-8fb4-7fbaeba8885a\") " Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.586604 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rj2gd\" (UniqueName: \"kubernetes.io/projected/efb19124-8662-46a5-8fb4-7fbaeba8885a-kube-api-access-rj2gd\") pod \"efb19124-8662-46a5-8fb4-7fbaeba8885a\" (UID: \"efb19124-8662-46a5-8fb4-7fbaeba8885a\") " Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.586666 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/efb19124-8662-46a5-8fb4-7fbaeba8885a-log-httpd\") pod \"efb19124-8662-46a5-8fb4-7fbaeba8885a\" (UID: \"efb19124-8662-46a5-8fb4-7fbaeba8885a\") " Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.587881 4773 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/efb19124-8662-46a5-8fb4-7fbaeba8885a-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.588638 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/efb19124-8662-46a5-8fb4-7fbaeba8885a-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "efb19124-8662-46a5-8fb4-7fbaeba8885a" (UID: "efb19124-8662-46a5-8fb4-7fbaeba8885a"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.591220 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efb19124-8662-46a5-8fb4-7fbaeba8885a-kube-api-access-rj2gd" (OuterVolumeSpecName: "kube-api-access-rj2gd") pod "efb19124-8662-46a5-8fb4-7fbaeba8885a" (UID: "efb19124-8662-46a5-8fb4-7fbaeba8885a"). InnerVolumeSpecName "kube-api-access-rj2gd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.591318 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efb19124-8662-46a5-8fb4-7fbaeba8885a-scripts" (OuterVolumeSpecName: "scripts") pod "efb19124-8662-46a5-8fb4-7fbaeba8885a" (UID: "efb19124-8662-46a5-8fb4-7fbaeba8885a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.614658 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efb19124-8662-46a5-8fb4-7fbaeba8885a-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "efb19124-8662-46a5-8fb4-7fbaeba8885a" (UID: "efb19124-8662-46a5-8fb4-7fbaeba8885a"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.614733 4773 scope.go:117] "RemoveContainer" containerID="9d95ab46285f54702bbd86cfcc2a82ccf6dffd16c89bceb57cb3b449d92ca086" Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.685609 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efb19124-8662-46a5-8fb4-7fbaeba8885a-config-data" (OuterVolumeSpecName: "config-data") pod "efb19124-8662-46a5-8fb4-7fbaeba8885a" (UID: "efb19124-8662-46a5-8fb4-7fbaeba8885a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.686893 4773 scope.go:117] "RemoveContainer" containerID="4f0d82a5c4812122ac869b3fce3bb56530dce291121a86af809addd34c660b7b" Jan 20 18:51:25 crc kubenswrapper[4773]: E0120 18:51:25.687398 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f0d82a5c4812122ac869b3fce3bb56530dce291121a86af809addd34c660b7b\": container with ID starting with 4f0d82a5c4812122ac869b3fce3bb56530dce291121a86af809addd34c660b7b not found: ID does not exist" containerID="4f0d82a5c4812122ac869b3fce3bb56530dce291121a86af809addd34c660b7b" Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.687444 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f0d82a5c4812122ac869b3fce3bb56530dce291121a86af809addd34c660b7b"} err="failed to get container status \"4f0d82a5c4812122ac869b3fce3bb56530dce291121a86af809addd34c660b7b\": rpc error: code = NotFound desc = could not find container \"4f0d82a5c4812122ac869b3fce3bb56530dce291121a86af809addd34c660b7b\": container with ID starting with 4f0d82a5c4812122ac869b3fce3bb56530dce291121a86af809addd34c660b7b not found: ID does not exist" Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.687465 4773 scope.go:117] "RemoveContainer" containerID="1bf3529b66753bd46397574712a543ebfe5f08719dd9776591973ff48295495c" Jan 20 18:51:25 crc kubenswrapper[4773]: E0120 18:51:25.687830 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1bf3529b66753bd46397574712a543ebfe5f08719dd9776591973ff48295495c\": container with ID starting with 1bf3529b66753bd46397574712a543ebfe5f08719dd9776591973ff48295495c not found: ID does not exist" containerID="1bf3529b66753bd46397574712a543ebfe5f08719dd9776591973ff48295495c" Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.687868 
4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1bf3529b66753bd46397574712a543ebfe5f08719dd9776591973ff48295495c"} err="failed to get container status \"1bf3529b66753bd46397574712a543ebfe5f08719dd9776591973ff48295495c\": rpc error: code = NotFound desc = could not find container \"1bf3529b66753bd46397574712a543ebfe5f08719dd9776591973ff48295495c\": container with ID starting with 1bf3529b66753bd46397574712a543ebfe5f08719dd9776591973ff48295495c not found: ID does not exist" Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.687887 4773 scope.go:117] "RemoveContainer" containerID="331cc9c4bf5916907acc17301a73252c053b2c503989e99a3181ee9b8431d643" Jan 20 18:51:25 crc kubenswrapper[4773]: E0120 18:51:25.688115 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"331cc9c4bf5916907acc17301a73252c053b2c503989e99a3181ee9b8431d643\": container with ID starting with 331cc9c4bf5916907acc17301a73252c053b2c503989e99a3181ee9b8431d643 not found: ID does not exist" containerID="331cc9c4bf5916907acc17301a73252c053b2c503989e99a3181ee9b8431d643" Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.688138 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"331cc9c4bf5916907acc17301a73252c053b2c503989e99a3181ee9b8431d643"} err="failed to get container status \"331cc9c4bf5916907acc17301a73252c053b2c503989e99a3181ee9b8431d643\": rpc error: code = NotFound desc = could not find container \"331cc9c4bf5916907acc17301a73252c053b2c503989e99a3181ee9b8431d643\": container with ID starting with 331cc9c4bf5916907acc17301a73252c053b2c503989e99a3181ee9b8431d643 not found: ID does not exist" Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.688159 4773 scope.go:117] "RemoveContainer" containerID="9d95ab46285f54702bbd86cfcc2a82ccf6dffd16c89bceb57cb3b449d92ca086" Jan 20 18:51:25 crc kubenswrapper[4773]: E0120 
18:51:25.688642 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d95ab46285f54702bbd86cfcc2a82ccf6dffd16c89bceb57cb3b449d92ca086\": container with ID starting with 9d95ab46285f54702bbd86cfcc2a82ccf6dffd16c89bceb57cb3b449d92ca086 not found: ID does not exist" containerID="9d95ab46285f54702bbd86cfcc2a82ccf6dffd16c89bceb57cb3b449d92ca086" Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.688667 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d95ab46285f54702bbd86cfcc2a82ccf6dffd16c89bceb57cb3b449d92ca086"} err="failed to get container status \"9d95ab46285f54702bbd86cfcc2a82ccf6dffd16c89bceb57cb3b449d92ca086\": rpc error: code = NotFound desc = could not find container \"9d95ab46285f54702bbd86cfcc2a82ccf6dffd16c89bceb57cb3b449d92ca086\": container with ID starting with 9d95ab46285f54702bbd86cfcc2a82ccf6dffd16c89bceb57cb3b449d92ca086 not found: ID does not exist" Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.690105 4773 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/efb19124-8662-46a5-8fb4-7fbaeba8885a-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.690144 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rj2gd\" (UniqueName: \"kubernetes.io/projected/efb19124-8662-46a5-8fb4-7fbaeba8885a-kube-api-access-rj2gd\") on node \"crc\" DevicePath \"\"" Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.690162 4773 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/efb19124-8662-46a5-8fb4-7fbaeba8885a-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.690173 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/efb19124-8662-46a5-8fb4-7fbaeba8885a-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.690184 4773 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/efb19124-8662-46a5-8fb4-7fbaeba8885a-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.696520 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efb19124-8662-46a5-8fb4-7fbaeba8885a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "efb19124-8662-46a5-8fb4-7fbaeba8885a" (UID: "efb19124-8662-46a5-8fb4-7fbaeba8885a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.791925 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/efb19124-8662-46a5-8fb4-7fbaeba8885a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.814086 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.822647 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.864745 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 20 18:51:25 crc kubenswrapper[4773]: E0120 18:51:25.867765 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efb19124-8662-46a5-8fb4-7fbaeba8885a" containerName="ceilometer-notification-agent" Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.867871 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="efb19124-8662-46a5-8fb4-7fbaeba8885a" containerName="ceilometer-notification-agent" Jan 20 18:51:25 crc kubenswrapper[4773]: E0120 
18:51:25.868034 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efb19124-8662-46a5-8fb4-7fbaeba8885a" containerName="proxy-httpd" Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.880244 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="efb19124-8662-46a5-8fb4-7fbaeba8885a" containerName="proxy-httpd" Jan 20 18:51:25 crc kubenswrapper[4773]: E0120 18:51:25.880338 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efb19124-8662-46a5-8fb4-7fbaeba8885a" containerName="ceilometer-central-agent" Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.880346 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="efb19124-8662-46a5-8fb4-7fbaeba8885a" containerName="ceilometer-central-agent" Jan 20 18:51:25 crc kubenswrapper[4773]: E0120 18:51:25.880366 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efb19124-8662-46a5-8fb4-7fbaeba8885a" containerName="sg-core" Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.880372 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="efb19124-8662-46a5-8fb4-7fbaeba8885a" containerName="sg-core" Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.880653 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="efb19124-8662-46a5-8fb4-7fbaeba8885a" containerName="ceilometer-notification-agent" Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.880667 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="efb19124-8662-46a5-8fb4-7fbaeba8885a" containerName="ceilometer-central-agent" Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.880677 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="efb19124-8662-46a5-8fb4-7fbaeba8885a" containerName="proxy-httpd" Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.880695 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="efb19124-8662-46a5-8fb4-7fbaeba8885a" containerName="sg-core" Jan 20 18:51:25 crc 
kubenswrapper[4773]: I0120 18:51:25.882422 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.884436 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.884436 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.884590 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.888439 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.947098 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="fb9c6096-2ce8-4b43-a638-50374d21d621" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.184:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.947459 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="fb9c6096-2ce8-4b43-a638-50374d21d621" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.184:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.996398 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hhn7\" (UniqueName: \"kubernetes.io/projected/4b106f16-e8b7-4cc5-a5be-fba349150373-kube-api-access-7hhn7\") pod \"ceilometer-0\" (UID: \"4b106f16-e8b7-4cc5-a5be-fba349150373\") " pod="openstack/ceilometer-0" Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 
18:51:25.996453 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b106f16-e8b7-4cc5-a5be-fba349150373-log-httpd\") pod \"ceilometer-0\" (UID: \"4b106f16-e8b7-4cc5-a5be-fba349150373\") " pod="openstack/ceilometer-0" Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.996476 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4b106f16-e8b7-4cc5-a5be-fba349150373-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4b106f16-e8b7-4cc5-a5be-fba349150373\") " pod="openstack/ceilometer-0" Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.996540 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b106f16-e8b7-4cc5-a5be-fba349150373-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"4b106f16-e8b7-4cc5-a5be-fba349150373\") " pod="openstack/ceilometer-0" Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.996578 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b106f16-e8b7-4cc5-a5be-fba349150373-config-data\") pod \"ceilometer-0\" (UID: \"4b106f16-e8b7-4cc5-a5be-fba349150373\") " pod="openstack/ceilometer-0" Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.996625 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b106f16-e8b7-4cc5-a5be-fba349150373-scripts\") pod \"ceilometer-0\" (UID: \"4b106f16-e8b7-4cc5-a5be-fba349150373\") " pod="openstack/ceilometer-0" Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.996655 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/4b106f16-e8b7-4cc5-a5be-fba349150373-run-httpd\") pod \"ceilometer-0\" (UID: \"4b106f16-e8b7-4cc5-a5be-fba349150373\") " pod="openstack/ceilometer-0" Jan 20 18:51:25 crc kubenswrapper[4773]: I0120 18:51:25.996752 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b106f16-e8b7-4cc5-a5be-fba349150373-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4b106f16-e8b7-4cc5-a5be-fba349150373\") " pod="openstack/ceilometer-0" Jan 20 18:51:26 crc kubenswrapper[4773]: I0120 18:51:26.098463 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b106f16-e8b7-4cc5-a5be-fba349150373-scripts\") pod \"ceilometer-0\" (UID: \"4b106f16-e8b7-4cc5-a5be-fba349150373\") " pod="openstack/ceilometer-0" Jan 20 18:51:26 crc kubenswrapper[4773]: I0120 18:51:26.098516 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b106f16-e8b7-4cc5-a5be-fba349150373-run-httpd\") pod \"ceilometer-0\" (UID: \"4b106f16-e8b7-4cc5-a5be-fba349150373\") " pod="openstack/ceilometer-0" Jan 20 18:51:26 crc kubenswrapper[4773]: I0120 18:51:26.098624 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b106f16-e8b7-4cc5-a5be-fba349150373-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4b106f16-e8b7-4cc5-a5be-fba349150373\") " pod="openstack/ceilometer-0" Jan 20 18:51:26 crc kubenswrapper[4773]: I0120 18:51:26.098650 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7hhn7\" (UniqueName: \"kubernetes.io/projected/4b106f16-e8b7-4cc5-a5be-fba349150373-kube-api-access-7hhn7\") pod \"ceilometer-0\" (UID: \"4b106f16-e8b7-4cc5-a5be-fba349150373\") " pod="openstack/ceilometer-0" Jan 20 18:51:26 crc 
kubenswrapper[4773]: I0120 18:51:26.098672 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b106f16-e8b7-4cc5-a5be-fba349150373-log-httpd\") pod \"ceilometer-0\" (UID: \"4b106f16-e8b7-4cc5-a5be-fba349150373\") " pod="openstack/ceilometer-0" Jan 20 18:51:26 crc kubenswrapper[4773]: I0120 18:51:26.098689 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4b106f16-e8b7-4cc5-a5be-fba349150373-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4b106f16-e8b7-4cc5-a5be-fba349150373\") " pod="openstack/ceilometer-0" Jan 20 18:51:26 crc kubenswrapper[4773]: I0120 18:51:26.098733 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b106f16-e8b7-4cc5-a5be-fba349150373-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"4b106f16-e8b7-4cc5-a5be-fba349150373\") " pod="openstack/ceilometer-0" Jan 20 18:51:26 crc kubenswrapper[4773]: I0120 18:51:26.098762 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b106f16-e8b7-4cc5-a5be-fba349150373-config-data\") pod \"ceilometer-0\" (UID: \"4b106f16-e8b7-4cc5-a5be-fba349150373\") " pod="openstack/ceilometer-0" Jan 20 18:51:26 crc kubenswrapper[4773]: I0120 18:51:26.099139 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b106f16-e8b7-4cc5-a5be-fba349150373-run-httpd\") pod \"ceilometer-0\" (UID: \"4b106f16-e8b7-4cc5-a5be-fba349150373\") " pod="openstack/ceilometer-0" Jan 20 18:51:26 crc kubenswrapper[4773]: I0120 18:51:26.099428 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b106f16-e8b7-4cc5-a5be-fba349150373-log-httpd\") pod \"ceilometer-0\" 
(UID: \"4b106f16-e8b7-4cc5-a5be-fba349150373\") " pod="openstack/ceilometer-0" Jan 20 18:51:26 crc kubenswrapper[4773]: I0120 18:51:26.102677 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b106f16-e8b7-4cc5-a5be-fba349150373-scripts\") pod \"ceilometer-0\" (UID: \"4b106f16-e8b7-4cc5-a5be-fba349150373\") " pod="openstack/ceilometer-0" Jan 20 18:51:26 crc kubenswrapper[4773]: I0120 18:51:26.105771 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b106f16-e8b7-4cc5-a5be-fba349150373-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"4b106f16-e8b7-4cc5-a5be-fba349150373\") " pod="openstack/ceilometer-0" Jan 20 18:51:26 crc kubenswrapper[4773]: I0120 18:51:26.115674 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4b106f16-e8b7-4cc5-a5be-fba349150373-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"4b106f16-e8b7-4cc5-a5be-fba349150373\") " pod="openstack/ceilometer-0" Jan 20 18:51:26 crc kubenswrapper[4773]: I0120 18:51:26.115897 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b106f16-e8b7-4cc5-a5be-fba349150373-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"4b106f16-e8b7-4cc5-a5be-fba349150373\") " pod="openstack/ceilometer-0" Jan 20 18:51:26 crc kubenswrapper[4773]: I0120 18:51:26.116282 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b106f16-e8b7-4cc5-a5be-fba349150373-config-data\") pod \"ceilometer-0\" (UID: \"4b106f16-e8b7-4cc5-a5be-fba349150373\") " pod="openstack/ceilometer-0" Jan 20 18:51:26 crc kubenswrapper[4773]: I0120 18:51:26.124557 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hhn7\" (UniqueName: 
\"kubernetes.io/projected/4b106f16-e8b7-4cc5-a5be-fba349150373-kube-api-access-7hhn7\") pod \"ceilometer-0\" (UID: \"4b106f16-e8b7-4cc5-a5be-fba349150373\") " pod="openstack/ceilometer-0" Jan 20 18:51:26 crc kubenswrapper[4773]: I0120 18:51:26.208534 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 20 18:51:26 crc kubenswrapper[4773]: I0120 18:51:26.718716 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 20 18:51:26 crc kubenswrapper[4773]: W0120 18:51:26.726846 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4b106f16_e8b7_4cc5_a5be_fba349150373.slice/crio-f478295b705d02de0b4e43a9ee5250be4bde7baa055afba6f38995533ab32694 WatchSource:0}: Error finding container f478295b705d02de0b4e43a9ee5250be4bde7baa055afba6f38995533ab32694: Status 404 returned error can't find the container with id f478295b705d02de0b4e43a9ee5250be4bde7baa055afba6f38995533ab32694 Jan 20 18:51:27 crc kubenswrapper[4773]: I0120 18:51:27.067065 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 20 18:51:27 crc kubenswrapper[4773]: I0120 18:51:27.067103 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 20 18:51:27 crc kubenswrapper[4773]: I0120 18:51:27.455720 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efb19124-8662-46a5-8fb4-7fbaeba8885a" path="/var/lib/kubelet/pods/efb19124-8662-46a5-8fb4-7fbaeba8885a/volumes" Jan 20 18:51:27 crc kubenswrapper[4773]: I0120 18:51:27.503217 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4b106f16-e8b7-4cc5-a5be-fba349150373","Type":"ContainerStarted","Data":"9de41394a05fdc9ab26765ec68b9f8b0371bba9ab8fcbd772a318539f5ba620c"} Jan 20 18:51:27 crc kubenswrapper[4773]: I0120 18:51:27.503270 4773 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4b106f16-e8b7-4cc5-a5be-fba349150373","Type":"ContainerStarted","Data":"f478295b705d02de0b4e43a9ee5250be4bde7baa055afba6f38995533ab32694"} Jan 20 18:51:28 crc kubenswrapper[4773]: I0120 18:51:28.148206 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="79f3317e-9e4d-442d-a5b2-d9633262f332" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.185:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 20 18:51:28 crc kubenswrapper[4773]: I0120 18:51:28.148232 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="79f3317e-9e4d-442d-a5b2-d9633262f332" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.185:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 20 18:51:29 crc kubenswrapper[4773]: I0120 18:51:29.549176 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4b106f16-e8b7-4cc5-a5be-fba349150373","Type":"ContainerStarted","Data":"774f3b5786f1b1467746155b541a61f7b9bbc3aeb7beb7a87ce5c1f8e0024caf"} Jan 20 18:51:30 crc kubenswrapper[4773]: I0120 18:51:30.559311 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4b106f16-e8b7-4cc5-a5be-fba349150373","Type":"ContainerStarted","Data":"97703318fcd07ad3b43017e69f9c12cf98bdaf40302156bd8ae74b9e57732860"} Jan 20 18:51:30 crc kubenswrapper[4773]: I0120 18:51:30.844954 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Jan 20 18:51:31 crc kubenswrapper[4773]: I0120 18:51:31.569803 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"4b106f16-e8b7-4cc5-a5be-fba349150373","Type":"ContainerStarted","Data":"656ef6ae8b50cc2d4e503379e308e9a4f7a7c41991b5b9f249c6f7f975d0899e"} Jan 20 18:51:31 crc kubenswrapper[4773]: I0120 18:51:31.571147 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 20 18:51:31 crc kubenswrapper[4773]: I0120 18:51:31.597334 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.300629086 podStartE2EDuration="6.597314149s" podCreationTimestamp="2026-01-20 18:51:25 +0000 UTC" firstStartedPulling="2026-01-20 18:51:26.730518065 +0000 UTC m=+1279.652331089" lastFinishedPulling="2026-01-20 18:51:31.027203128 +0000 UTC m=+1283.949016152" observedRunningTime="2026-01-20 18:51:31.586454019 +0000 UTC m=+1284.508267043" watchObservedRunningTime="2026-01-20 18:51:31.597314149 +0000 UTC m=+1284.519127173" Jan 20 18:51:34 crc kubenswrapper[4773]: I0120 18:51:34.937894 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Jan 20 18:51:34 crc kubenswrapper[4773]: I0120 18:51:34.938361 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Jan 20 18:51:34 crc kubenswrapper[4773]: I0120 18:51:34.942565 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Jan 20 18:51:34 crc kubenswrapper[4773]: I0120 18:51:34.944038 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Jan 20 18:51:37 crc kubenswrapper[4773]: I0120 18:51:37.071035 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Jan 20 18:51:37 crc kubenswrapper[4773]: I0120 18:51:37.072065 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Jan 20 18:51:37 crc kubenswrapper[4773]: I0120 18:51:37.072638 4773 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Jan 20 18:51:37 crc kubenswrapper[4773]: I0120 18:51:37.072673 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Jan 20 18:51:37 crc kubenswrapper[4773]: I0120 18:51:37.075345 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Jan 20 18:51:37 crc kubenswrapper[4773]: I0120 18:51:37.078574 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Jan 20 18:51:37 crc kubenswrapper[4773]: I0120 18:51:37.307192 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5b856c5697-n5s7s"] Jan 20 18:51:37 crc kubenswrapper[4773]: I0120 18:51:37.308793 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b856c5697-n5s7s" Jan 20 18:51:37 crc kubenswrapper[4773]: I0120 18:51:37.338877 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b856c5697-n5s7s"] Jan 20 18:51:37 crc kubenswrapper[4773]: I0120 18:51:37.410290 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cb02b4c0-80ac-4860-8877-f507f8bc2028-ovsdbserver-sb\") pod \"dnsmasq-dns-5b856c5697-n5s7s\" (UID: \"cb02b4c0-80ac-4860-8877-f507f8bc2028\") " pod="openstack/dnsmasq-dns-5b856c5697-n5s7s" Jan 20 18:51:37 crc kubenswrapper[4773]: I0120 18:51:37.410371 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cb02b4c0-80ac-4860-8877-f507f8bc2028-ovsdbserver-nb\") pod \"dnsmasq-dns-5b856c5697-n5s7s\" (UID: \"cb02b4c0-80ac-4860-8877-f507f8bc2028\") " pod="openstack/dnsmasq-dns-5b856c5697-n5s7s" Jan 20 18:51:37 crc kubenswrapper[4773]: I0120 18:51:37.410436 4773 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6sxl\" (UniqueName: \"kubernetes.io/projected/cb02b4c0-80ac-4860-8877-f507f8bc2028-kube-api-access-l6sxl\") pod \"dnsmasq-dns-5b856c5697-n5s7s\" (UID: \"cb02b4c0-80ac-4860-8877-f507f8bc2028\") " pod="openstack/dnsmasq-dns-5b856c5697-n5s7s" Jan 20 18:51:37 crc kubenswrapper[4773]: I0120 18:51:37.410474 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb02b4c0-80ac-4860-8877-f507f8bc2028-config\") pod \"dnsmasq-dns-5b856c5697-n5s7s\" (UID: \"cb02b4c0-80ac-4860-8877-f507f8bc2028\") " pod="openstack/dnsmasq-dns-5b856c5697-n5s7s" Jan 20 18:51:37 crc kubenswrapper[4773]: I0120 18:51:37.410513 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cb02b4c0-80ac-4860-8877-f507f8bc2028-dns-svc\") pod \"dnsmasq-dns-5b856c5697-n5s7s\" (UID: \"cb02b4c0-80ac-4860-8877-f507f8bc2028\") " pod="openstack/dnsmasq-dns-5b856c5697-n5s7s" Jan 20 18:51:37 crc kubenswrapper[4773]: I0120 18:51:37.512962 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6sxl\" (UniqueName: \"kubernetes.io/projected/cb02b4c0-80ac-4860-8877-f507f8bc2028-kube-api-access-l6sxl\") pod \"dnsmasq-dns-5b856c5697-n5s7s\" (UID: \"cb02b4c0-80ac-4860-8877-f507f8bc2028\") " pod="openstack/dnsmasq-dns-5b856c5697-n5s7s" Jan 20 18:51:37 crc kubenswrapper[4773]: I0120 18:51:37.517344 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb02b4c0-80ac-4860-8877-f507f8bc2028-config\") pod \"dnsmasq-dns-5b856c5697-n5s7s\" (UID: \"cb02b4c0-80ac-4860-8877-f507f8bc2028\") " pod="openstack/dnsmasq-dns-5b856c5697-n5s7s" Jan 20 18:51:37 crc kubenswrapper[4773]: I0120 18:51:37.517400 4773 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cb02b4c0-80ac-4860-8877-f507f8bc2028-dns-svc\") pod \"dnsmasq-dns-5b856c5697-n5s7s\" (UID: \"cb02b4c0-80ac-4860-8877-f507f8bc2028\") " pod="openstack/dnsmasq-dns-5b856c5697-n5s7s" Jan 20 18:51:37 crc kubenswrapper[4773]: I0120 18:51:37.517603 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cb02b4c0-80ac-4860-8877-f507f8bc2028-ovsdbserver-sb\") pod \"dnsmasq-dns-5b856c5697-n5s7s\" (UID: \"cb02b4c0-80ac-4860-8877-f507f8bc2028\") " pod="openstack/dnsmasq-dns-5b856c5697-n5s7s" Jan 20 18:51:37 crc kubenswrapper[4773]: I0120 18:51:37.517659 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cb02b4c0-80ac-4860-8877-f507f8bc2028-ovsdbserver-nb\") pod \"dnsmasq-dns-5b856c5697-n5s7s\" (UID: \"cb02b4c0-80ac-4860-8877-f507f8bc2028\") " pod="openstack/dnsmasq-dns-5b856c5697-n5s7s" Jan 20 18:51:37 crc kubenswrapper[4773]: I0120 18:51:37.519115 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb02b4c0-80ac-4860-8877-f507f8bc2028-config\") pod \"dnsmasq-dns-5b856c5697-n5s7s\" (UID: \"cb02b4c0-80ac-4860-8877-f507f8bc2028\") " pod="openstack/dnsmasq-dns-5b856c5697-n5s7s" Jan 20 18:51:37 crc kubenswrapper[4773]: I0120 18:51:37.519504 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cb02b4c0-80ac-4860-8877-f507f8bc2028-ovsdbserver-sb\") pod \"dnsmasq-dns-5b856c5697-n5s7s\" (UID: \"cb02b4c0-80ac-4860-8877-f507f8bc2028\") " pod="openstack/dnsmasq-dns-5b856c5697-n5s7s" Jan 20 18:51:37 crc kubenswrapper[4773]: I0120 18:51:37.519802 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/cb02b4c0-80ac-4860-8877-f507f8bc2028-ovsdbserver-nb\") pod \"dnsmasq-dns-5b856c5697-n5s7s\" (UID: \"cb02b4c0-80ac-4860-8877-f507f8bc2028\") " pod="openstack/dnsmasq-dns-5b856c5697-n5s7s" Jan 20 18:51:37 crc kubenswrapper[4773]: I0120 18:51:37.525614 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cb02b4c0-80ac-4860-8877-f507f8bc2028-dns-svc\") pod \"dnsmasq-dns-5b856c5697-n5s7s\" (UID: \"cb02b4c0-80ac-4860-8877-f507f8bc2028\") " pod="openstack/dnsmasq-dns-5b856c5697-n5s7s" Jan 20 18:51:37 crc kubenswrapper[4773]: I0120 18:51:37.547476 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6sxl\" (UniqueName: \"kubernetes.io/projected/cb02b4c0-80ac-4860-8877-f507f8bc2028-kube-api-access-l6sxl\") pod \"dnsmasq-dns-5b856c5697-n5s7s\" (UID: \"cb02b4c0-80ac-4860-8877-f507f8bc2028\") " pod="openstack/dnsmasq-dns-5b856c5697-n5s7s" Jan 20 18:51:37 crc kubenswrapper[4773]: I0120 18:51:37.620674 4773 generic.go:334] "Generic (PLEG): container finished" podID="cc9522a8-e87a-485b-85e6-9548b4f7c835" containerID="49b4d8e189097e49504e7ac626fc2c7b41cd127ac2c715bc575a36d1764b8b4e" exitCode=137 Jan 20 18:51:37 crc kubenswrapper[4773]: I0120 18:51:37.621431 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"cc9522a8-e87a-485b-85e6-9548b4f7c835","Type":"ContainerDied","Data":"49b4d8e189097e49504e7ac626fc2c7b41cd127ac2c715bc575a36d1764b8b4e"} Jan 20 18:51:37 crc kubenswrapper[4773]: I0120 18:51:37.636503 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5b856c5697-n5s7s" Jan 20 18:51:37 crc kubenswrapper[4773]: I0120 18:51:37.757020 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 20 18:51:37 crc kubenswrapper[4773]: I0120 18:51:37.830519 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc9522a8-e87a-485b-85e6-9548b4f7c835-combined-ca-bundle\") pod \"cc9522a8-e87a-485b-85e6-9548b4f7c835\" (UID: \"cc9522a8-e87a-485b-85e6-9548b4f7c835\") " Jan 20 18:51:37 crc kubenswrapper[4773]: I0120 18:51:37.831191 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-npbs9\" (UniqueName: \"kubernetes.io/projected/cc9522a8-e87a-485b-85e6-9548b4f7c835-kube-api-access-npbs9\") pod \"cc9522a8-e87a-485b-85e6-9548b4f7c835\" (UID: \"cc9522a8-e87a-485b-85e6-9548b4f7c835\") " Jan 20 18:51:37 crc kubenswrapper[4773]: I0120 18:51:37.831299 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc9522a8-e87a-485b-85e6-9548b4f7c835-config-data\") pod \"cc9522a8-e87a-485b-85e6-9548b4f7c835\" (UID: \"cc9522a8-e87a-485b-85e6-9548b4f7c835\") " Jan 20 18:51:37 crc kubenswrapper[4773]: I0120 18:51:37.838248 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc9522a8-e87a-485b-85e6-9548b4f7c835-kube-api-access-npbs9" (OuterVolumeSpecName: "kube-api-access-npbs9") pod "cc9522a8-e87a-485b-85e6-9548b4f7c835" (UID: "cc9522a8-e87a-485b-85e6-9548b4f7c835"). InnerVolumeSpecName "kube-api-access-npbs9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:51:37 crc kubenswrapper[4773]: I0120 18:51:37.875133 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc9522a8-e87a-485b-85e6-9548b4f7c835-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "cc9522a8-e87a-485b-85e6-9548b4f7c835" (UID: "cc9522a8-e87a-485b-85e6-9548b4f7c835"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:51:37 crc kubenswrapper[4773]: I0120 18:51:37.891366 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc9522a8-e87a-485b-85e6-9548b4f7c835-config-data" (OuterVolumeSpecName: "config-data") pod "cc9522a8-e87a-485b-85e6-9548b4f7c835" (UID: "cc9522a8-e87a-485b-85e6-9548b4f7c835"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:51:37 crc kubenswrapper[4773]: I0120 18:51:37.954228 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cc9522a8-e87a-485b-85e6-9548b4f7c835-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 18:51:37 crc kubenswrapper[4773]: I0120 18:51:37.954278 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/cc9522a8-e87a-485b-85e6-9548b4f7c835-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 18:51:37 crc kubenswrapper[4773]: I0120 18:51:37.954295 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-npbs9\" (UniqueName: \"kubernetes.io/projected/cc9522a8-e87a-485b-85e6-9548b4f7c835-kube-api-access-npbs9\") on node \"crc\" DevicePath \"\"" Jan 20 18:51:38 crc kubenswrapper[4773]: I0120 18:51:38.090791 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5b856c5697-n5s7s"] Jan 20 18:51:38 crc kubenswrapper[4773]: I0120 18:51:38.630373 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"cc9522a8-e87a-485b-85e6-9548b4f7c835","Type":"ContainerDied","Data":"c91e5fa12c0b7ce3e47c60fb183a0b0468674854812e9596d10c1c95981af1d7"} Jan 20 18:51:38 crc kubenswrapper[4773]: I0120 18:51:38.630425 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 20 18:51:38 crc kubenswrapper[4773]: I0120 18:51:38.630769 4773 scope.go:117] "RemoveContainer" containerID="49b4d8e189097e49504e7ac626fc2c7b41cd127ac2c715bc575a36d1764b8b4e" Jan 20 18:51:38 crc kubenswrapper[4773]: I0120 18:51:38.632037 4773 generic.go:334] "Generic (PLEG): container finished" podID="cb02b4c0-80ac-4860-8877-f507f8bc2028" containerID="69bef84f465e5a23bb9b67d0db87c263ac661afd9f941c9238b37b4e986cc9dd" exitCode=0 Jan 20 18:51:38 crc kubenswrapper[4773]: I0120 18:51:38.632082 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b856c5697-n5s7s" event={"ID":"cb02b4c0-80ac-4860-8877-f507f8bc2028","Type":"ContainerDied","Data":"69bef84f465e5a23bb9b67d0db87c263ac661afd9f941c9238b37b4e986cc9dd"} Jan 20 18:51:38 crc kubenswrapper[4773]: I0120 18:51:38.633426 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b856c5697-n5s7s" event={"ID":"cb02b4c0-80ac-4860-8877-f507f8bc2028","Type":"ContainerStarted","Data":"d35e4b0ceb4787bad4a95ece01b001352569b9bbc51780aec8a8c24c4fa207e2"} Jan 20 18:51:38 crc kubenswrapper[4773]: I0120 18:51:38.793987 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 20 18:51:38 crc kubenswrapper[4773]: I0120 18:51:38.802180 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 20 18:51:38 crc kubenswrapper[4773]: I0120 18:51:38.833167 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 20 18:51:38 crc kubenswrapper[4773]: E0120 18:51:38.833704 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc9522a8-e87a-485b-85e6-9548b4f7c835" containerName="nova-cell1-novncproxy-novncproxy" Jan 20 18:51:38 crc kubenswrapper[4773]: I0120 18:51:38.833797 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc9522a8-e87a-485b-85e6-9548b4f7c835" 
containerName="nova-cell1-novncproxy-novncproxy" Jan 20 18:51:38 crc kubenswrapper[4773]: I0120 18:51:38.834044 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc9522a8-e87a-485b-85e6-9548b4f7c835" containerName="nova-cell1-novncproxy-novncproxy" Jan 20 18:51:38 crc kubenswrapper[4773]: I0120 18:51:38.834673 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 20 18:51:38 crc kubenswrapper[4773]: I0120 18:51:38.837439 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Jan 20 18:51:38 crc kubenswrapper[4773]: I0120 18:51:38.837508 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Jan 20 18:51:38 crc kubenswrapper[4773]: I0120 18:51:38.837445 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Jan 20 18:51:38 crc kubenswrapper[4773]: I0120 18:51:38.846647 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 20 18:51:38 crc kubenswrapper[4773]: I0120 18:51:38.973309 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c5b56a3-1c91-4347-ae44-63f05c35e134-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"7c5b56a3-1c91-4347-ae44-63f05c35e134\") " pod="openstack/nova-cell1-novncproxy-0" Jan 20 18:51:38 crc kubenswrapper[4773]: I0120 18:51:38.973725 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c5b56a3-1c91-4347-ae44-63f05c35e134-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"7c5b56a3-1c91-4347-ae44-63f05c35e134\") " pod="openstack/nova-cell1-novncproxy-0" Jan 20 18:51:38 crc 
kubenswrapper[4773]: I0120 18:51:38.973778 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c5b56a3-1c91-4347-ae44-63f05c35e134-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"7c5b56a3-1c91-4347-ae44-63f05c35e134\") " pod="openstack/nova-cell1-novncproxy-0" Jan 20 18:51:38 crc kubenswrapper[4773]: I0120 18:51:38.973814 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dsppl\" (UniqueName: \"kubernetes.io/projected/7c5b56a3-1c91-4347-ae44-63f05c35e134-kube-api-access-dsppl\") pod \"nova-cell1-novncproxy-0\" (UID: \"7c5b56a3-1c91-4347-ae44-63f05c35e134\") " pod="openstack/nova-cell1-novncproxy-0" Jan 20 18:51:38 crc kubenswrapper[4773]: I0120 18:51:38.973906 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c5b56a3-1c91-4347-ae44-63f05c35e134-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"7c5b56a3-1c91-4347-ae44-63f05c35e134\") " pod="openstack/nova-cell1-novncproxy-0" Jan 20 18:51:39 crc kubenswrapper[4773]: I0120 18:51:39.075494 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c5b56a3-1c91-4347-ae44-63f05c35e134-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"7c5b56a3-1c91-4347-ae44-63f05c35e134\") " pod="openstack/nova-cell1-novncproxy-0" Jan 20 18:51:39 crc kubenswrapper[4773]: I0120 18:51:39.075554 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c5b56a3-1c91-4347-ae44-63f05c35e134-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"7c5b56a3-1c91-4347-ae44-63f05c35e134\") " pod="openstack/nova-cell1-novncproxy-0" Jan 20 18:51:39 crc 
kubenswrapper[4773]: I0120 18:51:39.075593 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c5b56a3-1c91-4347-ae44-63f05c35e134-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"7c5b56a3-1c91-4347-ae44-63f05c35e134\") " pod="openstack/nova-cell1-novncproxy-0" Jan 20 18:51:39 crc kubenswrapper[4773]: I0120 18:51:39.075623 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dsppl\" (UniqueName: \"kubernetes.io/projected/7c5b56a3-1c91-4347-ae44-63f05c35e134-kube-api-access-dsppl\") pod \"nova-cell1-novncproxy-0\" (UID: \"7c5b56a3-1c91-4347-ae44-63f05c35e134\") " pod="openstack/nova-cell1-novncproxy-0" Jan 20 18:51:39 crc kubenswrapper[4773]: I0120 18:51:39.075668 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c5b56a3-1c91-4347-ae44-63f05c35e134-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"7c5b56a3-1c91-4347-ae44-63f05c35e134\") " pod="openstack/nova-cell1-novncproxy-0" Jan 20 18:51:39 crc kubenswrapper[4773]: I0120 18:51:39.081409 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c5b56a3-1c91-4347-ae44-63f05c35e134-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"7c5b56a3-1c91-4347-ae44-63f05c35e134\") " pod="openstack/nova-cell1-novncproxy-0" Jan 20 18:51:39 crc kubenswrapper[4773]: I0120 18:51:39.081465 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c5b56a3-1c91-4347-ae44-63f05c35e134-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"7c5b56a3-1c91-4347-ae44-63f05c35e134\") " pod="openstack/nova-cell1-novncproxy-0" Jan 20 18:51:39 crc kubenswrapper[4773]: I0120 18:51:39.081617 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c5b56a3-1c91-4347-ae44-63f05c35e134-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"7c5b56a3-1c91-4347-ae44-63f05c35e134\") " pod="openstack/nova-cell1-novncproxy-0" Jan 20 18:51:39 crc kubenswrapper[4773]: I0120 18:51:39.083373 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c5b56a3-1c91-4347-ae44-63f05c35e134-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"7c5b56a3-1c91-4347-ae44-63f05c35e134\") " pod="openstack/nova-cell1-novncproxy-0" Jan 20 18:51:39 crc kubenswrapper[4773]: I0120 18:51:39.092468 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dsppl\" (UniqueName: \"kubernetes.io/projected/7c5b56a3-1c91-4347-ae44-63f05c35e134-kube-api-access-dsppl\") pod \"nova-cell1-novncproxy-0\" (UID: \"7c5b56a3-1c91-4347-ae44-63f05c35e134\") " pod="openstack/nova-cell1-novncproxy-0" Jan 20 18:51:39 crc kubenswrapper[4773]: I0120 18:51:39.151527 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Jan 20 18:51:39 crc kubenswrapper[4773]: I0120 18:51:39.461483 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc9522a8-e87a-485b-85e6-9548b4f7c835" path="/var/lib/kubelet/pods/cc9522a8-e87a-485b-85e6-9548b4f7c835/volumes" Jan 20 18:51:39 crc kubenswrapper[4773]: I0120 18:51:39.615105 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Jan 20 18:51:39 crc kubenswrapper[4773]: W0120 18:51:39.616395 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7c5b56a3_1c91_4347_ae44_63f05c35e134.slice/crio-58a0561bbd00f574f91b3fa0bc9f124493510e5035f39a846bffc51386fbf23e WatchSource:0}: Error finding container 58a0561bbd00f574f91b3fa0bc9f124493510e5035f39a846bffc51386fbf23e: Status 404 returned error can't find the container with id 58a0561bbd00f574f91b3fa0bc9f124493510e5035f39a846bffc51386fbf23e Jan 20 18:51:39 crc kubenswrapper[4773]: I0120 18:51:39.645073 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b856c5697-n5s7s" event={"ID":"cb02b4c0-80ac-4860-8877-f507f8bc2028","Type":"ContainerStarted","Data":"ac3896caf2212fe902aa7a23816384e518b7b696af2e446e0c73bfd22421a6c7"} Jan 20 18:51:39 crc kubenswrapper[4773]: I0120 18:51:39.645311 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5b856c5697-n5s7s" Jan 20 18:51:39 crc kubenswrapper[4773]: I0120 18:51:39.646692 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"7c5b56a3-1c91-4347-ae44-63f05c35e134","Type":"ContainerStarted","Data":"58a0561bbd00f574f91b3fa0bc9f124493510e5035f39a846bffc51386fbf23e"} Jan 20 18:51:39 crc kubenswrapper[4773]: I0120 18:51:39.672056 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5b856c5697-n5s7s" 
podStartSLOduration=2.672037634 podStartE2EDuration="2.672037634s" podCreationTimestamp="2026-01-20 18:51:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:51:39.661067022 +0000 UTC m=+1292.582880076" watchObservedRunningTime="2026-01-20 18:51:39.672037634 +0000 UTC m=+1292.593850658" Jan 20 18:51:39 crc kubenswrapper[4773]: I0120 18:51:39.738832 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 20 18:51:39 crc kubenswrapper[4773]: I0120 18:51:39.739045 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="79f3317e-9e4d-442d-a5b2-d9633262f332" containerName="nova-api-log" containerID="cri-o://b5a704a91d916bab76c0bb6a0b9b2f69a36e06e85e9f34c52f172ada992c6f25" gracePeriod=30 Jan 20 18:51:39 crc kubenswrapper[4773]: I0120 18:51:39.739795 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="79f3317e-9e4d-442d-a5b2-d9633262f332" containerName="nova-api-api" containerID="cri-o://f3ba4f9c861ea417ded7870354bf000e38b9d7e757013613b836d589b01ab041" gracePeriod=30 Jan 20 18:51:39 crc kubenswrapper[4773]: I0120 18:51:39.932783 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 20 18:51:39 crc kubenswrapper[4773]: I0120 18:51:39.933299 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4b106f16-e8b7-4cc5-a5be-fba349150373" containerName="ceilometer-central-agent" containerID="cri-o://9de41394a05fdc9ab26765ec68b9f8b0371bba9ab8fcbd772a318539f5ba620c" gracePeriod=30 Jan 20 18:51:39 crc kubenswrapper[4773]: I0120 18:51:39.933328 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4b106f16-e8b7-4cc5-a5be-fba349150373" containerName="proxy-httpd" 
containerID="cri-o://656ef6ae8b50cc2d4e503379e308e9a4f7a7c41991b5b9f249c6f7f975d0899e" gracePeriod=30 Jan 20 18:51:39 crc kubenswrapper[4773]: I0120 18:51:39.933372 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4b106f16-e8b7-4cc5-a5be-fba349150373" containerName="sg-core" containerID="cri-o://97703318fcd07ad3b43017e69f9c12cf98bdaf40302156bd8ae74b9e57732860" gracePeriod=30 Jan 20 18:51:39 crc kubenswrapper[4773]: I0120 18:51:39.933404 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="4b106f16-e8b7-4cc5-a5be-fba349150373" containerName="ceilometer-notification-agent" containerID="cri-o://774f3b5786f1b1467746155b541a61f7b9bbc3aeb7beb7a87ce5c1f8e0024caf" gracePeriod=30 Jan 20 18:51:40 crc kubenswrapper[4773]: I0120 18:51:40.660044 4773 generic.go:334] "Generic (PLEG): container finished" podID="4b106f16-e8b7-4cc5-a5be-fba349150373" containerID="656ef6ae8b50cc2d4e503379e308e9a4f7a7c41991b5b9f249c6f7f975d0899e" exitCode=0 Jan 20 18:51:40 crc kubenswrapper[4773]: I0120 18:51:40.660083 4773 generic.go:334] "Generic (PLEG): container finished" podID="4b106f16-e8b7-4cc5-a5be-fba349150373" containerID="97703318fcd07ad3b43017e69f9c12cf98bdaf40302156bd8ae74b9e57732860" exitCode=2 Jan 20 18:51:40 crc kubenswrapper[4773]: I0120 18:51:40.660096 4773 generic.go:334] "Generic (PLEG): container finished" podID="4b106f16-e8b7-4cc5-a5be-fba349150373" containerID="9de41394a05fdc9ab26765ec68b9f8b0371bba9ab8fcbd772a318539f5ba620c" exitCode=0 Jan 20 18:51:40 crc kubenswrapper[4773]: I0120 18:51:40.660135 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4b106f16-e8b7-4cc5-a5be-fba349150373","Type":"ContainerDied","Data":"656ef6ae8b50cc2d4e503379e308e9a4f7a7c41991b5b9f249c6f7f975d0899e"} Jan 20 18:51:40 crc kubenswrapper[4773]: I0120 18:51:40.660159 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ceilometer-0" event={"ID":"4b106f16-e8b7-4cc5-a5be-fba349150373","Type":"ContainerDied","Data":"97703318fcd07ad3b43017e69f9c12cf98bdaf40302156bd8ae74b9e57732860"} Jan 20 18:51:40 crc kubenswrapper[4773]: I0120 18:51:40.660169 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4b106f16-e8b7-4cc5-a5be-fba349150373","Type":"ContainerDied","Data":"9de41394a05fdc9ab26765ec68b9f8b0371bba9ab8fcbd772a318539f5ba620c"} Jan 20 18:51:40 crc kubenswrapper[4773]: I0120 18:51:40.661894 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"7c5b56a3-1c91-4347-ae44-63f05c35e134","Type":"ContainerStarted","Data":"5b7dd7e03c625a8c1bfca8bb788d4e764c64e7659c84950ff30cc14e669adc77"} Jan 20 18:51:40 crc kubenswrapper[4773]: I0120 18:51:40.665009 4773 generic.go:334] "Generic (PLEG): container finished" podID="79f3317e-9e4d-442d-a5b2-d9633262f332" containerID="b5a704a91d916bab76c0bb6a0b9b2f69a36e06e85e9f34c52f172ada992c6f25" exitCode=143 Jan 20 18:51:40 crc kubenswrapper[4773]: I0120 18:51:40.665078 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"79f3317e-9e4d-442d-a5b2-d9633262f332","Type":"ContainerDied","Data":"b5a704a91d916bab76c0bb6a0b9b2f69a36e06e85e9f34c52f172ada992c6f25"} Jan 20 18:51:40 crc kubenswrapper[4773]: I0120 18:51:40.681287 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.681270981 podStartE2EDuration="2.681270981s" podCreationTimestamp="2026-01-20 18:51:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:51:40.678420573 +0000 UTC m=+1293.600233597" watchObservedRunningTime="2026-01-20 18:51:40.681270981 +0000 UTC m=+1293.603084005" Jan 20 18:51:41 crc kubenswrapper[4773]: I0120 18:51:41.586631 4773 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 20 18:51:41 crc kubenswrapper[4773]: I0120 18:51:41.678731 4773 generic.go:334] "Generic (PLEG): container finished" podID="4b106f16-e8b7-4cc5-a5be-fba349150373" containerID="774f3b5786f1b1467746155b541a61f7b9bbc3aeb7beb7a87ce5c1f8e0024caf" exitCode=0 Jan 20 18:51:41 crc kubenswrapper[4773]: I0120 18:51:41.678780 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4b106f16-e8b7-4cc5-a5be-fba349150373","Type":"ContainerDied","Data":"774f3b5786f1b1467746155b541a61f7b9bbc3aeb7beb7a87ce5c1f8e0024caf"} Jan 20 18:51:41 crc kubenswrapper[4773]: I0120 18:51:41.678810 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 20 18:51:41 crc kubenswrapper[4773]: I0120 18:51:41.678838 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"4b106f16-e8b7-4cc5-a5be-fba349150373","Type":"ContainerDied","Data":"f478295b705d02de0b4e43a9ee5250be4bde7baa055afba6f38995533ab32694"} Jan 20 18:51:41 crc kubenswrapper[4773]: I0120 18:51:41.678862 4773 scope.go:117] "RemoveContainer" containerID="656ef6ae8b50cc2d4e503379e308e9a4f7a7c41991b5b9f249c6f7f975d0899e" Jan 20 18:51:41 crc kubenswrapper[4773]: I0120 18:51:41.698823 4773 scope.go:117] "RemoveContainer" containerID="97703318fcd07ad3b43017e69f9c12cf98bdaf40302156bd8ae74b9e57732860" Jan 20 18:51:41 crc kubenswrapper[4773]: I0120 18:51:41.725149 4773 scope.go:117] "RemoveContainer" containerID="774f3b5786f1b1467746155b541a61f7b9bbc3aeb7beb7a87ce5c1f8e0024caf" Jan 20 18:51:41 crc kubenswrapper[4773]: I0120 18:51:41.747400 4773 scope.go:117] "RemoveContainer" containerID="9de41394a05fdc9ab26765ec68b9f8b0371bba9ab8fcbd772a318539f5ba620c" Jan 20 18:51:41 crc kubenswrapper[4773]: I0120 18:51:41.749389 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/4b106f16-e8b7-4cc5-a5be-fba349150373-config-data\") pod \"4b106f16-e8b7-4cc5-a5be-fba349150373\" (UID: \"4b106f16-e8b7-4cc5-a5be-fba349150373\") " Jan 20 18:51:41 crc kubenswrapper[4773]: I0120 18:51:41.749495 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b106f16-e8b7-4cc5-a5be-fba349150373-run-httpd\") pod \"4b106f16-e8b7-4cc5-a5be-fba349150373\" (UID: \"4b106f16-e8b7-4cc5-a5be-fba349150373\") " Jan 20 18:51:41 crc kubenswrapper[4773]: I0120 18:51:41.749552 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b106f16-e8b7-4cc5-a5be-fba349150373-ceilometer-tls-certs\") pod \"4b106f16-e8b7-4cc5-a5be-fba349150373\" (UID: \"4b106f16-e8b7-4cc5-a5be-fba349150373\") " Jan 20 18:51:41 crc kubenswrapper[4773]: I0120 18:51:41.749609 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b106f16-e8b7-4cc5-a5be-fba349150373-combined-ca-bundle\") pod \"4b106f16-e8b7-4cc5-a5be-fba349150373\" (UID: \"4b106f16-e8b7-4cc5-a5be-fba349150373\") " Jan 20 18:51:41 crc kubenswrapper[4773]: I0120 18:51:41.749645 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4b106f16-e8b7-4cc5-a5be-fba349150373-sg-core-conf-yaml\") pod \"4b106f16-e8b7-4cc5-a5be-fba349150373\" (UID: \"4b106f16-e8b7-4cc5-a5be-fba349150373\") " Jan 20 18:51:41 crc kubenswrapper[4773]: I0120 18:51:41.749708 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7hhn7\" (UniqueName: \"kubernetes.io/projected/4b106f16-e8b7-4cc5-a5be-fba349150373-kube-api-access-7hhn7\") pod \"4b106f16-e8b7-4cc5-a5be-fba349150373\" (UID: \"4b106f16-e8b7-4cc5-a5be-fba349150373\") " Jan 20 18:51:41 crc 
kubenswrapper[4773]: I0120 18:51:41.749801 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b106f16-e8b7-4cc5-a5be-fba349150373-log-httpd\") pod \"4b106f16-e8b7-4cc5-a5be-fba349150373\" (UID: \"4b106f16-e8b7-4cc5-a5be-fba349150373\") " Jan 20 18:51:41 crc kubenswrapper[4773]: I0120 18:51:41.749829 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b106f16-e8b7-4cc5-a5be-fba349150373-scripts\") pod \"4b106f16-e8b7-4cc5-a5be-fba349150373\" (UID: \"4b106f16-e8b7-4cc5-a5be-fba349150373\") " Jan 20 18:51:41 crc kubenswrapper[4773]: I0120 18:51:41.750238 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b106f16-e8b7-4cc5-a5be-fba349150373-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "4b106f16-e8b7-4cc5-a5be-fba349150373" (UID: "4b106f16-e8b7-4cc5-a5be-fba349150373"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:51:41 crc kubenswrapper[4773]: I0120 18:51:41.750383 4773 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b106f16-e8b7-4cc5-a5be-fba349150373-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 18:51:41 crc kubenswrapper[4773]: I0120 18:51:41.750805 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b106f16-e8b7-4cc5-a5be-fba349150373-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "4b106f16-e8b7-4cc5-a5be-fba349150373" (UID: "4b106f16-e8b7-4cc5-a5be-fba349150373"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:51:41 crc kubenswrapper[4773]: I0120 18:51:41.761190 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b106f16-e8b7-4cc5-a5be-fba349150373-scripts" (OuterVolumeSpecName: "scripts") pod "4b106f16-e8b7-4cc5-a5be-fba349150373" (UID: "4b106f16-e8b7-4cc5-a5be-fba349150373"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:51:41 crc kubenswrapper[4773]: I0120 18:51:41.761235 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b106f16-e8b7-4cc5-a5be-fba349150373-kube-api-access-7hhn7" (OuterVolumeSpecName: "kube-api-access-7hhn7") pod "4b106f16-e8b7-4cc5-a5be-fba349150373" (UID: "4b106f16-e8b7-4cc5-a5be-fba349150373"). InnerVolumeSpecName "kube-api-access-7hhn7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:51:41 crc kubenswrapper[4773]: I0120 18:51:41.807771 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b106f16-e8b7-4cc5-a5be-fba349150373-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "4b106f16-e8b7-4cc5-a5be-fba349150373" (UID: "4b106f16-e8b7-4cc5-a5be-fba349150373"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:51:41 crc kubenswrapper[4773]: I0120 18:51:41.811010 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b106f16-e8b7-4cc5-a5be-fba349150373-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "4b106f16-e8b7-4cc5-a5be-fba349150373" (UID: "4b106f16-e8b7-4cc5-a5be-fba349150373"). InnerVolumeSpecName "ceilometer-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:51:41 crc kubenswrapper[4773]: I0120 18:51:41.822084 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b106f16-e8b7-4cc5-a5be-fba349150373-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4b106f16-e8b7-4cc5-a5be-fba349150373" (UID: "4b106f16-e8b7-4cc5-a5be-fba349150373"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:51:41 crc kubenswrapper[4773]: I0120 18:51:41.854496 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b106f16-e8b7-4cc5-a5be-fba349150373-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 18:51:41 crc kubenswrapper[4773]: I0120 18:51:41.854527 4773 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/4b106f16-e8b7-4cc5-a5be-fba349150373-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 20 18:51:41 crc kubenswrapper[4773]: I0120 18:51:41.854536 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7hhn7\" (UniqueName: \"kubernetes.io/projected/4b106f16-e8b7-4cc5-a5be-fba349150373-kube-api-access-7hhn7\") on node \"crc\" DevicePath \"\"" Jan 20 18:51:41 crc kubenswrapper[4773]: I0120 18:51:41.854546 4773 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/4b106f16-e8b7-4cc5-a5be-fba349150373-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 18:51:41 crc kubenswrapper[4773]: I0120 18:51:41.854554 4773 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4b106f16-e8b7-4cc5-a5be-fba349150373-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 18:51:41 crc kubenswrapper[4773]: I0120 18:51:41.854563 4773 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/4b106f16-e8b7-4cc5-a5be-fba349150373-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 18:51:41 crc kubenswrapper[4773]: I0120 18:51:41.856717 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b106f16-e8b7-4cc5-a5be-fba349150373-config-data" (OuterVolumeSpecName: "config-data") pod "4b106f16-e8b7-4cc5-a5be-fba349150373" (UID: "4b106f16-e8b7-4cc5-a5be-fba349150373"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:51:41 crc kubenswrapper[4773]: I0120 18:51:41.926973 4773 scope.go:117] "RemoveContainer" containerID="656ef6ae8b50cc2d4e503379e308e9a4f7a7c41991b5b9f249c6f7f975d0899e" Jan 20 18:51:41 crc kubenswrapper[4773]: E0120 18:51:41.927505 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"656ef6ae8b50cc2d4e503379e308e9a4f7a7c41991b5b9f249c6f7f975d0899e\": container with ID starting with 656ef6ae8b50cc2d4e503379e308e9a4f7a7c41991b5b9f249c6f7f975d0899e not found: ID does not exist" containerID="656ef6ae8b50cc2d4e503379e308e9a4f7a7c41991b5b9f249c6f7f975d0899e" Jan 20 18:51:41 crc kubenswrapper[4773]: I0120 18:51:41.927573 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"656ef6ae8b50cc2d4e503379e308e9a4f7a7c41991b5b9f249c6f7f975d0899e"} err="failed to get container status \"656ef6ae8b50cc2d4e503379e308e9a4f7a7c41991b5b9f249c6f7f975d0899e\": rpc error: code = NotFound desc = could not find container \"656ef6ae8b50cc2d4e503379e308e9a4f7a7c41991b5b9f249c6f7f975d0899e\": container with ID starting with 656ef6ae8b50cc2d4e503379e308e9a4f7a7c41991b5b9f249c6f7f975d0899e not found: ID does not exist" Jan 20 18:51:41 crc kubenswrapper[4773]: I0120 18:51:41.927614 4773 scope.go:117] "RemoveContainer" containerID="97703318fcd07ad3b43017e69f9c12cf98bdaf40302156bd8ae74b9e57732860" Jan 20 18:51:41 crc 
kubenswrapper[4773]: E0120 18:51:41.928023 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97703318fcd07ad3b43017e69f9c12cf98bdaf40302156bd8ae74b9e57732860\": container with ID starting with 97703318fcd07ad3b43017e69f9c12cf98bdaf40302156bd8ae74b9e57732860 not found: ID does not exist" containerID="97703318fcd07ad3b43017e69f9c12cf98bdaf40302156bd8ae74b9e57732860" Jan 20 18:51:41 crc kubenswrapper[4773]: I0120 18:51:41.928074 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97703318fcd07ad3b43017e69f9c12cf98bdaf40302156bd8ae74b9e57732860"} err="failed to get container status \"97703318fcd07ad3b43017e69f9c12cf98bdaf40302156bd8ae74b9e57732860\": rpc error: code = NotFound desc = could not find container \"97703318fcd07ad3b43017e69f9c12cf98bdaf40302156bd8ae74b9e57732860\": container with ID starting with 97703318fcd07ad3b43017e69f9c12cf98bdaf40302156bd8ae74b9e57732860 not found: ID does not exist" Jan 20 18:51:41 crc kubenswrapper[4773]: I0120 18:51:41.928129 4773 scope.go:117] "RemoveContainer" containerID="774f3b5786f1b1467746155b541a61f7b9bbc3aeb7beb7a87ce5c1f8e0024caf" Jan 20 18:51:41 crc kubenswrapper[4773]: E0120 18:51:41.928442 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"774f3b5786f1b1467746155b541a61f7b9bbc3aeb7beb7a87ce5c1f8e0024caf\": container with ID starting with 774f3b5786f1b1467746155b541a61f7b9bbc3aeb7beb7a87ce5c1f8e0024caf not found: ID does not exist" containerID="774f3b5786f1b1467746155b541a61f7b9bbc3aeb7beb7a87ce5c1f8e0024caf" Jan 20 18:51:41 crc kubenswrapper[4773]: I0120 18:51:41.928481 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"774f3b5786f1b1467746155b541a61f7b9bbc3aeb7beb7a87ce5c1f8e0024caf"} err="failed to get container status 
\"774f3b5786f1b1467746155b541a61f7b9bbc3aeb7beb7a87ce5c1f8e0024caf\": rpc error: code = NotFound desc = could not find container \"774f3b5786f1b1467746155b541a61f7b9bbc3aeb7beb7a87ce5c1f8e0024caf\": container with ID starting with 774f3b5786f1b1467746155b541a61f7b9bbc3aeb7beb7a87ce5c1f8e0024caf not found: ID does not exist" Jan 20 18:51:41 crc kubenswrapper[4773]: I0120 18:51:41.928503 4773 scope.go:117] "RemoveContainer" containerID="9de41394a05fdc9ab26765ec68b9f8b0371bba9ab8fcbd772a318539f5ba620c" Jan 20 18:51:41 crc kubenswrapper[4773]: E0120 18:51:41.928844 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9de41394a05fdc9ab26765ec68b9f8b0371bba9ab8fcbd772a318539f5ba620c\": container with ID starting with 9de41394a05fdc9ab26765ec68b9f8b0371bba9ab8fcbd772a318539f5ba620c not found: ID does not exist" containerID="9de41394a05fdc9ab26765ec68b9f8b0371bba9ab8fcbd772a318539f5ba620c" Jan 20 18:51:41 crc kubenswrapper[4773]: I0120 18:51:41.928867 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9de41394a05fdc9ab26765ec68b9f8b0371bba9ab8fcbd772a318539f5ba620c"} err="failed to get container status \"9de41394a05fdc9ab26765ec68b9f8b0371bba9ab8fcbd772a318539f5ba620c\": rpc error: code = NotFound desc = could not find container \"9de41394a05fdc9ab26765ec68b9f8b0371bba9ab8fcbd772a318539f5ba620c\": container with ID starting with 9de41394a05fdc9ab26765ec68b9f8b0371bba9ab8fcbd772a318539f5ba620c not found: ID does not exist" Jan 20 18:51:41 crc kubenswrapper[4773]: I0120 18:51:41.956309 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b106f16-e8b7-4cc5-a5be-fba349150373-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 18:51:42 crc kubenswrapper[4773]: I0120 18:51:42.013400 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 20 18:51:42 crc 
kubenswrapper[4773]: I0120 18:51:42.025530 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 20 18:51:42 crc kubenswrapper[4773]: I0120 18:51:42.035433 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 20 18:51:42 crc kubenswrapper[4773]: E0120 18:51:42.035794 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b106f16-e8b7-4cc5-a5be-fba349150373" containerName="ceilometer-central-agent" Jan 20 18:51:42 crc kubenswrapper[4773]: I0120 18:51:42.035814 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b106f16-e8b7-4cc5-a5be-fba349150373" containerName="ceilometer-central-agent" Jan 20 18:51:42 crc kubenswrapper[4773]: E0120 18:51:42.035831 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b106f16-e8b7-4cc5-a5be-fba349150373" containerName="sg-core" Jan 20 18:51:42 crc kubenswrapper[4773]: I0120 18:51:42.035838 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b106f16-e8b7-4cc5-a5be-fba349150373" containerName="sg-core" Jan 20 18:51:42 crc kubenswrapper[4773]: E0120 18:51:42.035850 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b106f16-e8b7-4cc5-a5be-fba349150373" containerName="ceilometer-notification-agent" Jan 20 18:51:42 crc kubenswrapper[4773]: I0120 18:51:42.035856 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b106f16-e8b7-4cc5-a5be-fba349150373" containerName="ceilometer-notification-agent" Jan 20 18:51:42 crc kubenswrapper[4773]: E0120 18:51:42.035870 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b106f16-e8b7-4cc5-a5be-fba349150373" containerName="proxy-httpd" Jan 20 18:51:42 crc kubenswrapper[4773]: I0120 18:51:42.035875 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b106f16-e8b7-4cc5-a5be-fba349150373" containerName="proxy-httpd" Jan 20 18:51:42 crc kubenswrapper[4773]: I0120 18:51:42.036122 4773 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="4b106f16-e8b7-4cc5-a5be-fba349150373" containerName="ceilometer-notification-agent" Jan 20 18:51:42 crc kubenswrapper[4773]: I0120 18:51:42.036142 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b106f16-e8b7-4cc5-a5be-fba349150373" containerName="ceilometer-central-agent" Jan 20 18:51:42 crc kubenswrapper[4773]: I0120 18:51:42.036150 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b106f16-e8b7-4cc5-a5be-fba349150373" containerName="sg-core" Jan 20 18:51:42 crc kubenswrapper[4773]: I0120 18:51:42.036172 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b106f16-e8b7-4cc5-a5be-fba349150373" containerName="proxy-httpd" Jan 20 18:51:42 crc kubenswrapper[4773]: I0120 18:51:42.039481 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 20 18:51:42 crc kubenswrapper[4773]: I0120 18:51:42.044497 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 20 18:51:42 crc kubenswrapper[4773]: I0120 18:51:42.044525 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Jan 20 18:51:42 crc kubenswrapper[4773]: I0120 18:51:42.044767 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 20 18:51:42 crc kubenswrapper[4773]: I0120 18:51:42.045965 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 20 18:51:42 crc kubenswrapper[4773]: I0120 18:51:42.161437 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/020e7117-149f-4a0d-aa81-a324df9db850-log-httpd\") pod \"ceilometer-0\" (UID: \"020e7117-149f-4a0d-aa81-a324df9db850\") " pod="openstack/ceilometer-0" Jan 20 18:51:42 crc kubenswrapper[4773]: I0120 18:51:42.161486 4773 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/020e7117-149f-4a0d-aa81-a324df9db850-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"020e7117-149f-4a0d-aa81-a324df9db850\") " pod="openstack/ceilometer-0" Jan 20 18:51:42 crc kubenswrapper[4773]: I0120 18:51:42.161562 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/020e7117-149f-4a0d-aa81-a324df9db850-run-httpd\") pod \"ceilometer-0\" (UID: \"020e7117-149f-4a0d-aa81-a324df9db850\") " pod="openstack/ceilometer-0" Jan 20 18:51:42 crc kubenswrapper[4773]: I0120 18:51:42.161671 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/020e7117-149f-4a0d-aa81-a324df9db850-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"020e7117-149f-4a0d-aa81-a324df9db850\") " pod="openstack/ceilometer-0" Jan 20 18:51:42 crc kubenswrapper[4773]: I0120 18:51:42.161719 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhs6b\" (UniqueName: \"kubernetes.io/projected/020e7117-149f-4a0d-aa81-a324df9db850-kube-api-access-qhs6b\") pod \"ceilometer-0\" (UID: \"020e7117-149f-4a0d-aa81-a324df9db850\") " pod="openstack/ceilometer-0" Jan 20 18:51:42 crc kubenswrapper[4773]: I0120 18:51:42.161844 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/020e7117-149f-4a0d-aa81-a324df9db850-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"020e7117-149f-4a0d-aa81-a324df9db850\") " pod="openstack/ceilometer-0" Jan 20 18:51:42 crc kubenswrapper[4773]: I0120 18:51:42.161988 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/020e7117-149f-4a0d-aa81-a324df9db850-config-data\") pod \"ceilometer-0\" (UID: \"020e7117-149f-4a0d-aa81-a324df9db850\") " pod="openstack/ceilometer-0" Jan 20 18:51:42 crc kubenswrapper[4773]: I0120 18:51:42.162168 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/020e7117-149f-4a0d-aa81-a324df9db850-scripts\") pod \"ceilometer-0\" (UID: \"020e7117-149f-4a0d-aa81-a324df9db850\") " pod="openstack/ceilometer-0" Jan 20 18:51:42 crc kubenswrapper[4773]: I0120 18:51:42.263890 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/020e7117-149f-4a0d-aa81-a324df9db850-scripts\") pod \"ceilometer-0\" (UID: \"020e7117-149f-4a0d-aa81-a324df9db850\") " pod="openstack/ceilometer-0" Jan 20 18:51:42 crc kubenswrapper[4773]: I0120 18:51:42.264305 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/020e7117-149f-4a0d-aa81-a324df9db850-log-httpd\") pod \"ceilometer-0\" (UID: \"020e7117-149f-4a0d-aa81-a324df9db850\") " pod="openstack/ceilometer-0" Jan 20 18:51:42 crc kubenswrapper[4773]: I0120 18:51:42.264343 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/020e7117-149f-4a0d-aa81-a324df9db850-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"020e7117-149f-4a0d-aa81-a324df9db850\") " pod="openstack/ceilometer-0" Jan 20 18:51:42 crc kubenswrapper[4773]: I0120 18:51:42.264477 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/020e7117-149f-4a0d-aa81-a324df9db850-run-httpd\") pod \"ceilometer-0\" (UID: \"020e7117-149f-4a0d-aa81-a324df9db850\") " pod="openstack/ceilometer-0" Jan 20 18:51:42 crc kubenswrapper[4773]: I0120 18:51:42.264523 4773 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/020e7117-149f-4a0d-aa81-a324df9db850-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"020e7117-149f-4a0d-aa81-a324df9db850\") " pod="openstack/ceilometer-0" Jan 20 18:51:42 crc kubenswrapper[4773]: I0120 18:51:42.264547 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhs6b\" (UniqueName: \"kubernetes.io/projected/020e7117-149f-4a0d-aa81-a324df9db850-kube-api-access-qhs6b\") pod \"ceilometer-0\" (UID: \"020e7117-149f-4a0d-aa81-a324df9db850\") " pod="openstack/ceilometer-0" Jan 20 18:51:42 crc kubenswrapper[4773]: I0120 18:51:42.264591 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/020e7117-149f-4a0d-aa81-a324df9db850-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"020e7117-149f-4a0d-aa81-a324df9db850\") " pod="openstack/ceilometer-0" Jan 20 18:51:42 crc kubenswrapper[4773]: I0120 18:51:42.264634 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/020e7117-149f-4a0d-aa81-a324df9db850-config-data\") pod \"ceilometer-0\" (UID: \"020e7117-149f-4a0d-aa81-a324df9db850\") " pod="openstack/ceilometer-0" Jan 20 18:51:42 crc kubenswrapper[4773]: I0120 18:51:42.265641 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/020e7117-149f-4a0d-aa81-a324df9db850-log-httpd\") pod \"ceilometer-0\" (UID: \"020e7117-149f-4a0d-aa81-a324df9db850\") " pod="openstack/ceilometer-0" Jan 20 18:51:42 crc kubenswrapper[4773]: I0120 18:51:42.266017 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/020e7117-149f-4a0d-aa81-a324df9db850-run-httpd\") pod \"ceilometer-0\" (UID: 
\"020e7117-149f-4a0d-aa81-a324df9db850\") " pod="openstack/ceilometer-0" Jan 20 18:51:42 crc kubenswrapper[4773]: I0120 18:51:42.269390 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/020e7117-149f-4a0d-aa81-a324df9db850-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"020e7117-149f-4a0d-aa81-a324df9db850\") " pod="openstack/ceilometer-0" Jan 20 18:51:42 crc kubenswrapper[4773]: I0120 18:51:42.269709 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/020e7117-149f-4a0d-aa81-a324df9db850-config-data\") pod \"ceilometer-0\" (UID: \"020e7117-149f-4a0d-aa81-a324df9db850\") " pod="openstack/ceilometer-0" Jan 20 18:51:42 crc kubenswrapper[4773]: I0120 18:51:42.271397 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/020e7117-149f-4a0d-aa81-a324df9db850-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"020e7117-149f-4a0d-aa81-a324df9db850\") " pod="openstack/ceilometer-0" Jan 20 18:51:42 crc kubenswrapper[4773]: I0120 18:51:42.271410 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/020e7117-149f-4a0d-aa81-a324df9db850-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"020e7117-149f-4a0d-aa81-a324df9db850\") " pod="openstack/ceilometer-0" Jan 20 18:51:42 crc kubenswrapper[4773]: I0120 18:51:42.271540 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/020e7117-149f-4a0d-aa81-a324df9db850-scripts\") pod \"ceilometer-0\" (UID: \"020e7117-149f-4a0d-aa81-a324df9db850\") " pod="openstack/ceilometer-0" Jan 20 18:51:42 crc kubenswrapper[4773]: I0120 18:51:42.286448 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhs6b\" (UniqueName: 
\"kubernetes.io/projected/020e7117-149f-4a0d-aa81-a324df9db850-kube-api-access-qhs6b\") pod \"ceilometer-0\" (UID: \"020e7117-149f-4a0d-aa81-a324df9db850\") " pod="openstack/ceilometer-0" Jan 20 18:51:42 crc kubenswrapper[4773]: I0120 18:51:42.359120 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Jan 20 18:51:42 crc kubenswrapper[4773]: I0120 18:51:42.799261 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 20 18:51:43 crc kubenswrapper[4773]: I0120 18:51:43.365784 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 20 18:51:43 crc kubenswrapper[4773]: I0120 18:51:43.466276 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b106f16-e8b7-4cc5-a5be-fba349150373" path="/var/lib/kubelet/pods/4b106f16-e8b7-4cc5-a5be-fba349150373/volumes" Jan 20 18:51:43 crc kubenswrapper[4773]: I0120 18:51:43.495298 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79f3317e-9e4d-442d-a5b2-d9633262f332-logs\") pod \"79f3317e-9e4d-442d-a5b2-d9633262f332\" (UID: \"79f3317e-9e4d-442d-a5b2-d9633262f332\") " Jan 20 18:51:43 crc kubenswrapper[4773]: I0120 18:51:43.495389 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-smh5x\" (UniqueName: \"kubernetes.io/projected/79f3317e-9e4d-442d-a5b2-d9633262f332-kube-api-access-smh5x\") pod \"79f3317e-9e4d-442d-a5b2-d9633262f332\" (UID: \"79f3317e-9e4d-442d-a5b2-d9633262f332\") " Jan 20 18:51:43 crc kubenswrapper[4773]: I0120 18:51:43.495456 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79f3317e-9e4d-442d-a5b2-d9633262f332-combined-ca-bundle\") pod \"79f3317e-9e4d-442d-a5b2-d9633262f332\" (UID: \"79f3317e-9e4d-442d-a5b2-d9633262f332\") " Jan 20 
18:51:43 crc kubenswrapper[4773]: I0120 18:51:43.495474 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79f3317e-9e4d-442d-a5b2-d9633262f332-config-data\") pod \"79f3317e-9e4d-442d-a5b2-d9633262f332\" (UID: \"79f3317e-9e4d-442d-a5b2-d9633262f332\") " Jan 20 18:51:43 crc kubenswrapper[4773]: I0120 18:51:43.496679 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/79f3317e-9e4d-442d-a5b2-d9633262f332-logs" (OuterVolumeSpecName: "logs") pod "79f3317e-9e4d-442d-a5b2-d9633262f332" (UID: "79f3317e-9e4d-442d-a5b2-d9633262f332"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:51:43 crc kubenswrapper[4773]: I0120 18:51:43.500832 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/79f3317e-9e4d-442d-a5b2-d9633262f332-kube-api-access-smh5x" (OuterVolumeSpecName: "kube-api-access-smh5x") pod "79f3317e-9e4d-442d-a5b2-d9633262f332" (UID: "79f3317e-9e4d-442d-a5b2-d9633262f332"). InnerVolumeSpecName "kube-api-access-smh5x". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:51:43 crc kubenswrapper[4773]: I0120 18:51:43.524262 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79f3317e-9e4d-442d-a5b2-d9633262f332-config-data" (OuterVolumeSpecName: "config-data") pod "79f3317e-9e4d-442d-a5b2-d9633262f332" (UID: "79f3317e-9e4d-442d-a5b2-d9633262f332"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:51:43 crc kubenswrapper[4773]: I0120 18:51:43.536162 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/79f3317e-9e4d-442d-a5b2-d9633262f332-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "79f3317e-9e4d-442d-a5b2-d9633262f332" (UID: "79f3317e-9e4d-442d-a5b2-d9633262f332"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:51:43 crc kubenswrapper[4773]: I0120 18:51:43.597080 4773 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/79f3317e-9e4d-442d-a5b2-d9633262f332-logs\") on node \"crc\" DevicePath \"\"" Jan 20 18:51:43 crc kubenswrapper[4773]: I0120 18:51:43.597109 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-smh5x\" (UniqueName: \"kubernetes.io/projected/79f3317e-9e4d-442d-a5b2-d9633262f332-kube-api-access-smh5x\") on node \"crc\" DevicePath \"\"" Jan 20 18:51:43 crc kubenswrapper[4773]: I0120 18:51:43.597122 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/79f3317e-9e4d-442d-a5b2-d9633262f332-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 18:51:43 crc kubenswrapper[4773]: I0120 18:51:43.597131 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/79f3317e-9e4d-442d-a5b2-d9633262f332-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 18:51:43 crc kubenswrapper[4773]: I0120 18:51:43.697696 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"020e7117-149f-4a0d-aa81-a324df9db850","Type":"ContainerStarted","Data":"37639be04fa67337c4343ed582b7cba817862498c8f653737abe8b0ad324ee80"} Jan 20 18:51:43 crc kubenswrapper[4773]: I0120 18:51:43.697743 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"020e7117-149f-4a0d-aa81-a324df9db850","Type":"ContainerStarted","Data":"69446a8d0d2a42de6e148590cfbcb0a1f5f08dfbfef8edbc94698b1b5257bf49"} Jan 20 18:51:43 crc kubenswrapper[4773]: I0120 18:51:43.700086 4773 generic.go:334] "Generic (PLEG): container finished" podID="79f3317e-9e4d-442d-a5b2-d9633262f332" containerID="f3ba4f9c861ea417ded7870354bf000e38b9d7e757013613b836d589b01ab041" 
exitCode=0 Jan 20 18:51:43 crc kubenswrapper[4773]: I0120 18:51:43.700136 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"79f3317e-9e4d-442d-a5b2-d9633262f332","Type":"ContainerDied","Data":"f3ba4f9c861ea417ded7870354bf000e38b9d7e757013613b836d589b01ab041"} Jan 20 18:51:43 crc kubenswrapper[4773]: I0120 18:51:43.700152 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"79f3317e-9e4d-442d-a5b2-d9633262f332","Type":"ContainerDied","Data":"4073b0927686969a0c5927a5f8897ae5c028636891b4d7b257d94400258ebb80"} Jan 20 18:51:43 crc kubenswrapper[4773]: I0120 18:51:43.700162 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 20 18:51:43 crc kubenswrapper[4773]: I0120 18:51:43.700168 4773 scope.go:117] "RemoveContainer" containerID="f3ba4f9c861ea417ded7870354bf000e38b9d7e757013613b836d589b01ab041" Jan 20 18:51:43 crc kubenswrapper[4773]: I0120 18:51:43.720749 4773 scope.go:117] "RemoveContainer" containerID="b5a704a91d916bab76c0bb6a0b9b2f69a36e06e85e9f34c52f172ada992c6f25" Jan 20 18:51:43 crc kubenswrapper[4773]: I0120 18:51:43.727351 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 20 18:51:43 crc kubenswrapper[4773]: I0120 18:51:43.735595 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Jan 20 18:51:43 crc kubenswrapper[4773]: I0120 18:51:43.751807 4773 scope.go:117] "RemoveContainer" containerID="f3ba4f9c861ea417ded7870354bf000e38b9d7e757013613b836d589b01ab041" Jan 20 18:51:43 crc kubenswrapper[4773]: E0120 18:51:43.752292 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3ba4f9c861ea417ded7870354bf000e38b9d7e757013613b836d589b01ab041\": container with ID starting with f3ba4f9c861ea417ded7870354bf000e38b9d7e757013613b836d589b01ab041 not found: ID does not exist" 
containerID="f3ba4f9c861ea417ded7870354bf000e38b9d7e757013613b836d589b01ab041" Jan 20 18:51:43 crc kubenswrapper[4773]: I0120 18:51:43.752336 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3ba4f9c861ea417ded7870354bf000e38b9d7e757013613b836d589b01ab041"} err="failed to get container status \"f3ba4f9c861ea417ded7870354bf000e38b9d7e757013613b836d589b01ab041\": rpc error: code = NotFound desc = could not find container \"f3ba4f9c861ea417ded7870354bf000e38b9d7e757013613b836d589b01ab041\": container with ID starting with f3ba4f9c861ea417ded7870354bf000e38b9d7e757013613b836d589b01ab041 not found: ID does not exist" Jan 20 18:51:43 crc kubenswrapper[4773]: I0120 18:51:43.752363 4773 scope.go:117] "RemoveContainer" containerID="b5a704a91d916bab76c0bb6a0b9b2f69a36e06e85e9f34c52f172ada992c6f25" Jan 20 18:51:43 crc kubenswrapper[4773]: E0120 18:51:43.753120 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b5a704a91d916bab76c0bb6a0b9b2f69a36e06e85e9f34c52f172ada992c6f25\": container with ID starting with b5a704a91d916bab76c0bb6a0b9b2f69a36e06e85e9f34c52f172ada992c6f25 not found: ID does not exist" containerID="b5a704a91d916bab76c0bb6a0b9b2f69a36e06e85e9f34c52f172ada992c6f25" Jan 20 18:51:43 crc kubenswrapper[4773]: I0120 18:51:43.753141 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5a704a91d916bab76c0bb6a0b9b2f69a36e06e85e9f34c52f172ada992c6f25"} err="failed to get container status \"b5a704a91d916bab76c0bb6a0b9b2f69a36e06e85e9f34c52f172ada992c6f25\": rpc error: code = NotFound desc = could not find container \"b5a704a91d916bab76c0bb6a0b9b2f69a36e06e85e9f34c52f172ada992c6f25\": container with ID starting with b5a704a91d916bab76c0bb6a0b9b2f69a36e06e85e9f34c52f172ada992c6f25 not found: ID does not exist" Jan 20 18:51:43 crc kubenswrapper[4773]: I0120 18:51:43.760140 4773 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Jan 20 18:51:43 crc kubenswrapper[4773]: E0120 18:51:43.760628 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79f3317e-9e4d-442d-a5b2-d9633262f332" containerName="nova-api-api" Jan 20 18:51:43 crc kubenswrapper[4773]: I0120 18:51:43.760650 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="79f3317e-9e4d-442d-a5b2-d9633262f332" containerName="nova-api-api" Jan 20 18:51:43 crc kubenswrapper[4773]: E0120 18:51:43.760691 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="79f3317e-9e4d-442d-a5b2-d9633262f332" containerName="nova-api-log" Jan 20 18:51:43 crc kubenswrapper[4773]: I0120 18:51:43.760700 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="79f3317e-9e4d-442d-a5b2-d9633262f332" containerName="nova-api-log" Jan 20 18:51:43 crc kubenswrapper[4773]: I0120 18:51:43.760916 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="79f3317e-9e4d-442d-a5b2-d9633262f332" containerName="nova-api-log" Jan 20 18:51:43 crc kubenswrapper[4773]: I0120 18:51:43.760959 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="79f3317e-9e4d-442d-a5b2-d9633262f332" containerName="nova-api-api" Jan 20 18:51:43 crc kubenswrapper[4773]: I0120 18:51:43.761884 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 20 18:51:43 crc kubenswrapper[4773]: I0120 18:51:43.767879 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 20 18:51:43 crc kubenswrapper[4773]: I0120 18:51:43.776283 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Jan 20 18:51:43 crc kubenswrapper[4773]: I0120 18:51:43.776913 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jan 20 18:51:43 crc kubenswrapper[4773]: I0120 18:51:43.777161 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Jan 20 18:51:43 crc kubenswrapper[4773]: I0120 18:51:43.903121 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2dac51db-1574-4ccc-bb9a-7c42548d90d3-logs\") pod \"nova-api-0\" (UID: \"2dac51db-1574-4ccc-bb9a-7c42548d90d3\") " pod="openstack/nova-api-0" Jan 20 18:51:43 crc kubenswrapper[4773]: I0120 18:51:43.903200 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2dac51db-1574-4ccc-bb9a-7c42548d90d3-config-data\") pod \"nova-api-0\" (UID: \"2dac51db-1574-4ccc-bb9a-7c42548d90d3\") " pod="openstack/nova-api-0" Jan 20 18:51:43 crc kubenswrapper[4773]: I0120 18:51:43.903436 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2dac51db-1574-4ccc-bb9a-7c42548d90d3-public-tls-certs\") pod \"nova-api-0\" (UID: \"2dac51db-1574-4ccc-bb9a-7c42548d90d3\") " pod="openstack/nova-api-0" Jan 20 18:51:43 crc kubenswrapper[4773]: I0120 18:51:43.903564 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/2dac51db-1574-4ccc-bb9a-7c42548d90d3-internal-tls-certs\") pod \"nova-api-0\" (UID: \"2dac51db-1574-4ccc-bb9a-7c42548d90d3\") " pod="openstack/nova-api-0" Jan 20 18:51:43 crc kubenswrapper[4773]: I0120 18:51:43.903620 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2dac51db-1574-4ccc-bb9a-7c42548d90d3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2dac51db-1574-4ccc-bb9a-7c42548d90d3\") " pod="openstack/nova-api-0" Jan 20 18:51:43 crc kubenswrapper[4773]: I0120 18:51:43.903683 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmtxc\" (UniqueName: \"kubernetes.io/projected/2dac51db-1574-4ccc-bb9a-7c42548d90d3-kube-api-access-cmtxc\") pod \"nova-api-0\" (UID: \"2dac51db-1574-4ccc-bb9a-7c42548d90d3\") " pod="openstack/nova-api-0" Jan 20 18:51:44 crc kubenswrapper[4773]: I0120 18:51:44.005187 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2dac51db-1574-4ccc-bb9a-7c42548d90d3-public-tls-certs\") pod \"nova-api-0\" (UID: \"2dac51db-1574-4ccc-bb9a-7c42548d90d3\") " pod="openstack/nova-api-0" Jan 20 18:51:44 crc kubenswrapper[4773]: I0120 18:51:44.005543 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2dac51db-1574-4ccc-bb9a-7c42548d90d3-internal-tls-certs\") pod \"nova-api-0\" (UID: \"2dac51db-1574-4ccc-bb9a-7c42548d90d3\") " pod="openstack/nova-api-0" Jan 20 18:51:44 crc kubenswrapper[4773]: I0120 18:51:44.005571 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2dac51db-1574-4ccc-bb9a-7c42548d90d3-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2dac51db-1574-4ccc-bb9a-7c42548d90d3\") " 
pod="openstack/nova-api-0" Jan 20 18:51:44 crc kubenswrapper[4773]: I0120 18:51:44.005602 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cmtxc\" (UniqueName: \"kubernetes.io/projected/2dac51db-1574-4ccc-bb9a-7c42548d90d3-kube-api-access-cmtxc\") pod \"nova-api-0\" (UID: \"2dac51db-1574-4ccc-bb9a-7c42548d90d3\") " pod="openstack/nova-api-0" Jan 20 18:51:44 crc kubenswrapper[4773]: I0120 18:51:44.005636 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2dac51db-1574-4ccc-bb9a-7c42548d90d3-logs\") pod \"nova-api-0\" (UID: \"2dac51db-1574-4ccc-bb9a-7c42548d90d3\") " pod="openstack/nova-api-0" Jan 20 18:51:44 crc kubenswrapper[4773]: I0120 18:51:44.005672 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2dac51db-1574-4ccc-bb9a-7c42548d90d3-config-data\") pod \"nova-api-0\" (UID: \"2dac51db-1574-4ccc-bb9a-7c42548d90d3\") " pod="openstack/nova-api-0" Jan 20 18:51:44 crc kubenswrapper[4773]: I0120 18:51:44.006218 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2dac51db-1574-4ccc-bb9a-7c42548d90d3-logs\") pod \"nova-api-0\" (UID: \"2dac51db-1574-4ccc-bb9a-7c42548d90d3\") " pod="openstack/nova-api-0" Jan 20 18:51:44 crc kubenswrapper[4773]: I0120 18:51:44.009643 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2dac51db-1574-4ccc-bb9a-7c42548d90d3-config-data\") pod \"nova-api-0\" (UID: \"2dac51db-1574-4ccc-bb9a-7c42548d90d3\") " pod="openstack/nova-api-0" Jan 20 18:51:44 crc kubenswrapper[4773]: I0120 18:51:44.009679 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2dac51db-1574-4ccc-bb9a-7c42548d90d3-combined-ca-bundle\") pod 
\"nova-api-0\" (UID: \"2dac51db-1574-4ccc-bb9a-7c42548d90d3\") " pod="openstack/nova-api-0" Jan 20 18:51:44 crc kubenswrapper[4773]: I0120 18:51:44.010110 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2dac51db-1574-4ccc-bb9a-7c42548d90d3-public-tls-certs\") pod \"nova-api-0\" (UID: \"2dac51db-1574-4ccc-bb9a-7c42548d90d3\") " pod="openstack/nova-api-0" Jan 20 18:51:44 crc kubenswrapper[4773]: I0120 18:51:44.010397 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2dac51db-1574-4ccc-bb9a-7c42548d90d3-internal-tls-certs\") pod \"nova-api-0\" (UID: \"2dac51db-1574-4ccc-bb9a-7c42548d90d3\") " pod="openstack/nova-api-0" Jan 20 18:51:44 crc kubenswrapper[4773]: I0120 18:51:44.026275 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmtxc\" (UniqueName: \"kubernetes.io/projected/2dac51db-1574-4ccc-bb9a-7c42548d90d3-kube-api-access-cmtxc\") pod \"nova-api-0\" (UID: \"2dac51db-1574-4ccc-bb9a-7c42548d90d3\") " pod="openstack/nova-api-0" Jan 20 18:51:44 crc kubenswrapper[4773]: I0120 18:51:44.089837 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 20 18:51:44 crc kubenswrapper[4773]: I0120 18:51:44.152573 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Jan 20 18:51:44 crc kubenswrapper[4773]: I0120 18:51:44.594269 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 20 18:51:44 crc kubenswrapper[4773]: I0120 18:51:44.717837 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2dac51db-1574-4ccc-bb9a-7c42548d90d3","Type":"ContainerStarted","Data":"be5b8861809388d82da03a233f610c1c0601dcd3da676a63a2eea6fbb34fe0ac"} Jan 20 18:51:44 crc kubenswrapper[4773]: I0120 18:51:44.721448 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"020e7117-149f-4a0d-aa81-a324df9db850","Type":"ContainerStarted","Data":"52e37d56e55649463d21a5ee03a0affc59f9b6f7acea87d0e99f08561b5bb305"} Jan 20 18:51:45 crc kubenswrapper[4773]: I0120 18:51:45.460492 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="79f3317e-9e4d-442d-a5b2-d9633262f332" path="/var/lib/kubelet/pods/79f3317e-9e4d-442d-a5b2-d9633262f332/volumes" Jan 20 18:51:45 crc kubenswrapper[4773]: I0120 18:51:45.735073 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2dac51db-1574-4ccc-bb9a-7c42548d90d3","Type":"ContainerStarted","Data":"344e5ae008f07185d952898ce38b939b918d5bd6e65f28e4bdd85c43dd0ace0a"} Jan 20 18:51:45 crc kubenswrapper[4773]: I0120 18:51:45.735324 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2dac51db-1574-4ccc-bb9a-7c42548d90d3","Type":"ContainerStarted","Data":"5a9203dd4f0d1f79bea5f9b8f067d743dd5e44cee0af5f4f652de389aba6a660"} Jan 20 18:51:45 crc kubenswrapper[4773]: I0120 18:51:45.738666 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"020e7117-149f-4a0d-aa81-a324df9db850","Type":"ContainerStarted","Data":"f09054a284067e5b25f62a6df3ceace720d880ef94266ed85a89680db580bec0"} Jan 20 18:51:45 crc kubenswrapper[4773]: I0120 18:51:45.763552 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.7635358979999998 podStartE2EDuration="2.763535898s" podCreationTimestamp="2026-01-20 18:51:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:51:45.755353683 +0000 UTC m=+1298.677166717" watchObservedRunningTime="2026-01-20 18:51:45.763535898 +0000 UTC m=+1298.685348922" Jan 20 18:51:46 crc kubenswrapper[4773]: I0120 18:51:46.766069 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"020e7117-149f-4a0d-aa81-a324df9db850","Type":"ContainerStarted","Data":"6a115b4e7aa3521d1dcf6d1b5cdf12eb1355072e7aae73459e22f14c9325fda6"} Jan 20 18:51:46 crc kubenswrapper[4773]: I0120 18:51:46.766720 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 20 18:51:46 crc kubenswrapper[4773]: I0120 18:51:46.793184 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=1.207076082 podStartE2EDuration="4.793165353s" podCreationTimestamp="2026-01-20 18:51:42 +0000 UTC" firstStartedPulling="2026-01-20 18:51:42.80452896 +0000 UTC m=+1295.726341984" lastFinishedPulling="2026-01-20 18:51:46.390618221 +0000 UTC m=+1299.312431255" observedRunningTime="2026-01-20 18:51:46.786729489 +0000 UTC m=+1299.708542523" watchObservedRunningTime="2026-01-20 18:51:46.793165353 +0000 UTC m=+1299.714978377" Jan 20 18:51:47 crc kubenswrapper[4773]: I0120 18:51:47.638147 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5b856c5697-n5s7s" Jan 20 18:51:47 crc 
kubenswrapper[4773]: I0120 18:51:47.702403 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-566b5b7845-62h5v"] Jan 20 18:51:47 crc kubenswrapper[4773]: I0120 18:51:47.702768 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-566b5b7845-62h5v" podUID="b01dd3cf-55b3-4e05-a75d-be2ae325b5fc" containerName="dnsmasq-dns" containerID="cri-o://19b23825d4fe08e169e8236984b550734270fc9494468543004b6e47ef80eed1" gracePeriod=10 Jan 20 18:51:48 crc kubenswrapper[4773]: I0120 18:51:48.202785 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-566b5b7845-62h5v" Jan 20 18:51:48 crc kubenswrapper[4773]: I0120 18:51:48.274135 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b01dd3cf-55b3-4e05-a75d-be2ae325b5fc-dns-svc\") pod \"b01dd3cf-55b3-4e05-a75d-be2ae325b5fc\" (UID: \"b01dd3cf-55b3-4e05-a75d-be2ae325b5fc\") " Jan 20 18:51:48 crc kubenswrapper[4773]: I0120 18:51:48.274225 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b01dd3cf-55b3-4e05-a75d-be2ae325b5fc-ovsdbserver-sb\") pod \"b01dd3cf-55b3-4e05-a75d-be2ae325b5fc\" (UID: \"b01dd3cf-55b3-4e05-a75d-be2ae325b5fc\") " Jan 20 18:51:48 crc kubenswrapper[4773]: I0120 18:51:48.274309 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b01dd3cf-55b3-4e05-a75d-be2ae325b5fc-config\") pod \"b01dd3cf-55b3-4e05-a75d-be2ae325b5fc\" (UID: \"b01dd3cf-55b3-4e05-a75d-be2ae325b5fc\") " Jan 20 18:51:48 crc kubenswrapper[4773]: I0120 18:51:48.274429 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zmxnz\" (UniqueName: \"kubernetes.io/projected/b01dd3cf-55b3-4e05-a75d-be2ae325b5fc-kube-api-access-zmxnz\") 
pod \"b01dd3cf-55b3-4e05-a75d-be2ae325b5fc\" (UID: \"b01dd3cf-55b3-4e05-a75d-be2ae325b5fc\") " Jan 20 18:51:48 crc kubenswrapper[4773]: I0120 18:51:48.274458 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b01dd3cf-55b3-4e05-a75d-be2ae325b5fc-ovsdbserver-nb\") pod \"b01dd3cf-55b3-4e05-a75d-be2ae325b5fc\" (UID: \"b01dd3cf-55b3-4e05-a75d-be2ae325b5fc\") " Jan 20 18:51:48 crc kubenswrapper[4773]: I0120 18:51:48.285724 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b01dd3cf-55b3-4e05-a75d-be2ae325b5fc-kube-api-access-zmxnz" (OuterVolumeSpecName: "kube-api-access-zmxnz") pod "b01dd3cf-55b3-4e05-a75d-be2ae325b5fc" (UID: "b01dd3cf-55b3-4e05-a75d-be2ae325b5fc"). InnerVolumeSpecName "kube-api-access-zmxnz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:51:48 crc kubenswrapper[4773]: I0120 18:51:48.330768 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b01dd3cf-55b3-4e05-a75d-be2ae325b5fc-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b01dd3cf-55b3-4e05-a75d-be2ae325b5fc" (UID: "b01dd3cf-55b3-4e05-a75d-be2ae325b5fc"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:51:48 crc kubenswrapper[4773]: I0120 18:51:48.331711 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b01dd3cf-55b3-4e05-a75d-be2ae325b5fc-config" (OuterVolumeSpecName: "config") pod "b01dd3cf-55b3-4e05-a75d-be2ae325b5fc" (UID: "b01dd3cf-55b3-4e05-a75d-be2ae325b5fc"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:51:48 crc kubenswrapper[4773]: I0120 18:51:48.333735 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b01dd3cf-55b3-4e05-a75d-be2ae325b5fc-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b01dd3cf-55b3-4e05-a75d-be2ae325b5fc" (UID: "b01dd3cf-55b3-4e05-a75d-be2ae325b5fc"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:51:48 crc kubenswrapper[4773]: I0120 18:51:48.339271 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b01dd3cf-55b3-4e05-a75d-be2ae325b5fc-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b01dd3cf-55b3-4e05-a75d-be2ae325b5fc" (UID: "b01dd3cf-55b3-4e05-a75d-be2ae325b5fc"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:51:48 crc kubenswrapper[4773]: I0120 18:51:48.376672 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zmxnz\" (UniqueName: \"kubernetes.io/projected/b01dd3cf-55b3-4e05-a75d-be2ae325b5fc-kube-api-access-zmxnz\") on node \"crc\" DevicePath \"\"" Jan 20 18:51:48 crc kubenswrapper[4773]: I0120 18:51:48.376893 4773 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b01dd3cf-55b3-4e05-a75d-be2ae325b5fc-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 20 18:51:48 crc kubenswrapper[4773]: I0120 18:51:48.377013 4773 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b01dd3cf-55b3-4e05-a75d-be2ae325b5fc-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 20 18:51:48 crc kubenswrapper[4773]: I0120 18:51:48.377111 4773 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b01dd3cf-55b3-4e05-a75d-be2ae325b5fc-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 20 
18:51:48 crc kubenswrapper[4773]: I0120 18:51:48.377182 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b01dd3cf-55b3-4e05-a75d-be2ae325b5fc-config\") on node \"crc\" DevicePath \"\"" Jan 20 18:51:48 crc kubenswrapper[4773]: I0120 18:51:48.788006 4773 generic.go:334] "Generic (PLEG): container finished" podID="b01dd3cf-55b3-4e05-a75d-be2ae325b5fc" containerID="19b23825d4fe08e169e8236984b550734270fc9494468543004b6e47ef80eed1" exitCode=0 Jan 20 18:51:48 crc kubenswrapper[4773]: I0120 18:51:48.788089 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-566b5b7845-62h5v" Jan 20 18:51:48 crc kubenswrapper[4773]: I0120 18:51:48.788109 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-566b5b7845-62h5v" event={"ID":"b01dd3cf-55b3-4e05-a75d-be2ae325b5fc","Type":"ContainerDied","Data":"19b23825d4fe08e169e8236984b550734270fc9494468543004b6e47ef80eed1"} Jan 20 18:51:48 crc kubenswrapper[4773]: I0120 18:51:48.788518 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-566b5b7845-62h5v" event={"ID":"b01dd3cf-55b3-4e05-a75d-be2ae325b5fc","Type":"ContainerDied","Data":"a26dde9426d7bf6401c218a90f41c5f6ca8484f70a01b6ffde63301f200a5f7e"} Jan 20 18:51:48 crc kubenswrapper[4773]: I0120 18:51:48.788542 4773 scope.go:117] "RemoveContainer" containerID="19b23825d4fe08e169e8236984b550734270fc9494468543004b6e47ef80eed1" Jan 20 18:51:48 crc kubenswrapper[4773]: I0120 18:51:48.827921 4773 scope.go:117] "RemoveContainer" containerID="deee03e78951a636e93f428533e6ccb2c3506024586cb7ed18161abe124ff263" Jan 20 18:51:48 crc kubenswrapper[4773]: I0120 18:51:48.834464 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-566b5b7845-62h5v"] Jan 20 18:51:48 crc kubenswrapper[4773]: I0120 18:51:48.841918 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/dnsmasq-dns-566b5b7845-62h5v"] Jan 20 18:51:48 crc kubenswrapper[4773]: I0120 18:51:48.862208 4773 scope.go:117] "RemoveContainer" containerID="19b23825d4fe08e169e8236984b550734270fc9494468543004b6e47ef80eed1" Jan 20 18:51:48 crc kubenswrapper[4773]: E0120 18:51:48.862622 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19b23825d4fe08e169e8236984b550734270fc9494468543004b6e47ef80eed1\": container with ID starting with 19b23825d4fe08e169e8236984b550734270fc9494468543004b6e47ef80eed1 not found: ID does not exist" containerID="19b23825d4fe08e169e8236984b550734270fc9494468543004b6e47ef80eed1" Jan 20 18:51:48 crc kubenswrapper[4773]: I0120 18:51:48.862669 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19b23825d4fe08e169e8236984b550734270fc9494468543004b6e47ef80eed1"} err="failed to get container status \"19b23825d4fe08e169e8236984b550734270fc9494468543004b6e47ef80eed1\": rpc error: code = NotFound desc = could not find container \"19b23825d4fe08e169e8236984b550734270fc9494468543004b6e47ef80eed1\": container with ID starting with 19b23825d4fe08e169e8236984b550734270fc9494468543004b6e47ef80eed1 not found: ID does not exist" Jan 20 18:51:48 crc kubenswrapper[4773]: I0120 18:51:48.862695 4773 scope.go:117] "RemoveContainer" containerID="deee03e78951a636e93f428533e6ccb2c3506024586cb7ed18161abe124ff263" Jan 20 18:51:48 crc kubenswrapper[4773]: E0120 18:51:48.862895 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"deee03e78951a636e93f428533e6ccb2c3506024586cb7ed18161abe124ff263\": container with ID starting with deee03e78951a636e93f428533e6ccb2c3506024586cb7ed18161abe124ff263 not found: ID does not exist" containerID="deee03e78951a636e93f428533e6ccb2c3506024586cb7ed18161abe124ff263" Jan 20 18:51:48 crc kubenswrapper[4773]: I0120 18:51:48.862984 4773 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"deee03e78951a636e93f428533e6ccb2c3506024586cb7ed18161abe124ff263"} err="failed to get container status \"deee03e78951a636e93f428533e6ccb2c3506024586cb7ed18161abe124ff263\": rpc error: code = NotFound desc = could not find container \"deee03e78951a636e93f428533e6ccb2c3506024586cb7ed18161abe124ff263\": container with ID starting with deee03e78951a636e93f428533e6ccb2c3506024586cb7ed18161abe124ff263 not found: ID does not exist" Jan 20 18:51:49 crc kubenswrapper[4773]: I0120 18:51:49.152462 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Jan 20 18:51:49 crc kubenswrapper[4773]: I0120 18:51:49.169018 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Jan 20 18:51:49 crc kubenswrapper[4773]: I0120 18:51:49.456547 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b01dd3cf-55b3-4e05-a75d-be2ae325b5fc" path="/var/lib/kubelet/pods/b01dd3cf-55b3-4e05-a75d-be2ae325b5fc/volumes" Jan 20 18:51:49 crc kubenswrapper[4773]: I0120 18:51:49.817330 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Jan 20 18:51:49 crc kubenswrapper[4773]: I0120 18:51:49.964177 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-b4hvr"] Jan 20 18:51:49 crc kubenswrapper[4773]: E0120 18:51:49.964666 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b01dd3cf-55b3-4e05-a75d-be2ae325b5fc" containerName="dnsmasq-dns" Jan 20 18:51:49 crc kubenswrapper[4773]: I0120 18:51:49.964689 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="b01dd3cf-55b3-4e05-a75d-be2ae325b5fc" containerName="dnsmasq-dns" Jan 20 18:51:49 crc kubenswrapper[4773]: E0120 18:51:49.964714 4773 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="b01dd3cf-55b3-4e05-a75d-be2ae325b5fc" containerName="init" Jan 20 18:51:49 crc kubenswrapper[4773]: I0120 18:51:49.964723 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="b01dd3cf-55b3-4e05-a75d-be2ae325b5fc" containerName="init" Jan 20 18:51:49 crc kubenswrapper[4773]: I0120 18:51:49.964960 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="b01dd3cf-55b3-4e05-a75d-be2ae325b5fc" containerName="dnsmasq-dns" Jan 20 18:51:49 crc kubenswrapper[4773]: I0120 18:51:49.965771 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-b4hvr" Jan 20 18:51:49 crc kubenswrapper[4773]: I0120 18:51:49.971331 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Jan 20 18:51:49 crc kubenswrapper[4773]: I0120 18:51:49.973224 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Jan 20 18:51:49 crc kubenswrapper[4773]: I0120 18:51:49.977959 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-b4hvr"] Jan 20 18:51:50 crc kubenswrapper[4773]: I0120 18:51:50.107884 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24c1bc90-8fe0-41b4-a7ba-7e15bc787386-config-data\") pod \"nova-cell1-cell-mapping-b4hvr\" (UID: \"24c1bc90-8fe0-41b4-a7ba-7e15bc787386\") " pod="openstack/nova-cell1-cell-mapping-b4hvr" Jan 20 18:51:50 crc kubenswrapper[4773]: I0120 18:51:50.108470 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2krp\" (UniqueName: \"kubernetes.io/projected/24c1bc90-8fe0-41b4-a7ba-7e15bc787386-kube-api-access-s2krp\") pod \"nova-cell1-cell-mapping-b4hvr\" (UID: \"24c1bc90-8fe0-41b4-a7ba-7e15bc787386\") " pod="openstack/nova-cell1-cell-mapping-b4hvr" Jan 20 18:51:50 crc 
kubenswrapper[4773]: I0120 18:51:50.108570 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24c1bc90-8fe0-41b4-a7ba-7e15bc787386-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-b4hvr\" (UID: \"24c1bc90-8fe0-41b4-a7ba-7e15bc787386\") " pod="openstack/nova-cell1-cell-mapping-b4hvr" Jan 20 18:51:50 crc kubenswrapper[4773]: I0120 18:51:50.108633 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24c1bc90-8fe0-41b4-a7ba-7e15bc787386-scripts\") pod \"nova-cell1-cell-mapping-b4hvr\" (UID: \"24c1bc90-8fe0-41b4-a7ba-7e15bc787386\") " pod="openstack/nova-cell1-cell-mapping-b4hvr" Jan 20 18:51:50 crc kubenswrapper[4773]: I0120 18:51:50.210277 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24c1bc90-8fe0-41b4-a7ba-7e15bc787386-config-data\") pod \"nova-cell1-cell-mapping-b4hvr\" (UID: \"24c1bc90-8fe0-41b4-a7ba-7e15bc787386\") " pod="openstack/nova-cell1-cell-mapping-b4hvr" Jan 20 18:51:50 crc kubenswrapper[4773]: I0120 18:51:50.210412 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2krp\" (UniqueName: \"kubernetes.io/projected/24c1bc90-8fe0-41b4-a7ba-7e15bc787386-kube-api-access-s2krp\") pod \"nova-cell1-cell-mapping-b4hvr\" (UID: \"24c1bc90-8fe0-41b4-a7ba-7e15bc787386\") " pod="openstack/nova-cell1-cell-mapping-b4hvr" Jan 20 18:51:50 crc kubenswrapper[4773]: I0120 18:51:50.210453 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24c1bc90-8fe0-41b4-a7ba-7e15bc787386-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-b4hvr\" (UID: \"24c1bc90-8fe0-41b4-a7ba-7e15bc787386\") " pod="openstack/nova-cell1-cell-mapping-b4hvr" Jan 20 18:51:50 crc 
kubenswrapper[4773]: I0120 18:51:50.210483 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24c1bc90-8fe0-41b4-a7ba-7e15bc787386-scripts\") pod \"nova-cell1-cell-mapping-b4hvr\" (UID: \"24c1bc90-8fe0-41b4-a7ba-7e15bc787386\") " pod="openstack/nova-cell1-cell-mapping-b4hvr" Jan 20 18:51:50 crc kubenswrapper[4773]: I0120 18:51:50.217526 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24c1bc90-8fe0-41b4-a7ba-7e15bc787386-config-data\") pod \"nova-cell1-cell-mapping-b4hvr\" (UID: \"24c1bc90-8fe0-41b4-a7ba-7e15bc787386\") " pod="openstack/nova-cell1-cell-mapping-b4hvr" Jan 20 18:51:50 crc kubenswrapper[4773]: I0120 18:51:50.219528 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24c1bc90-8fe0-41b4-a7ba-7e15bc787386-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-b4hvr\" (UID: \"24c1bc90-8fe0-41b4-a7ba-7e15bc787386\") " pod="openstack/nova-cell1-cell-mapping-b4hvr" Jan 20 18:51:50 crc kubenswrapper[4773]: I0120 18:51:50.225683 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24c1bc90-8fe0-41b4-a7ba-7e15bc787386-scripts\") pod \"nova-cell1-cell-mapping-b4hvr\" (UID: \"24c1bc90-8fe0-41b4-a7ba-7e15bc787386\") " pod="openstack/nova-cell1-cell-mapping-b4hvr" Jan 20 18:51:50 crc kubenswrapper[4773]: I0120 18:51:50.238658 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2krp\" (UniqueName: \"kubernetes.io/projected/24c1bc90-8fe0-41b4-a7ba-7e15bc787386-kube-api-access-s2krp\") pod \"nova-cell1-cell-mapping-b4hvr\" (UID: \"24c1bc90-8fe0-41b4-a7ba-7e15bc787386\") " pod="openstack/nova-cell1-cell-mapping-b4hvr" Jan 20 18:51:50 crc kubenswrapper[4773]: I0120 18:51:50.295675 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-b4hvr" Jan 20 18:51:50 crc kubenswrapper[4773]: I0120 18:51:50.709409 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-b4hvr"] Jan 20 18:51:50 crc kubenswrapper[4773]: W0120 18:51:50.711221 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod24c1bc90_8fe0_41b4_a7ba_7e15bc787386.slice/crio-9a25eec4366d91b90b65b99e42e325114274745c1c71f7a67903dc1d6c3a9bad WatchSource:0}: Error finding container 9a25eec4366d91b90b65b99e42e325114274745c1c71f7a67903dc1d6c3a9bad: Status 404 returned error can't find the container with id 9a25eec4366d91b90b65b99e42e325114274745c1c71f7a67903dc1d6c3a9bad Jan 20 18:51:50 crc kubenswrapper[4773]: I0120 18:51:50.806475 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-b4hvr" event={"ID":"24c1bc90-8fe0-41b4-a7ba-7e15bc787386","Type":"ContainerStarted","Data":"9a25eec4366d91b90b65b99e42e325114274745c1c71f7a67903dc1d6c3a9bad"} Jan 20 18:51:51 crc kubenswrapper[4773]: I0120 18:51:51.815137 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-b4hvr" event={"ID":"24c1bc90-8fe0-41b4-a7ba-7e15bc787386","Type":"ContainerStarted","Data":"7292aa755477db4cf6213001ef3f22ae8bf8250a6e1509481020c0348c2c81e0"} Jan 20 18:51:51 crc kubenswrapper[4773]: I0120 18:51:51.852401 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-b4hvr" podStartSLOduration=2.85238389 podStartE2EDuration="2.85238389s" podCreationTimestamp="2026-01-20 18:51:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:51:51.844575374 +0000 UTC m=+1304.766388418" watchObservedRunningTime="2026-01-20 18:51:51.85238389 +0000 UTC m=+1304.774196914" Jan 20 18:51:54 crc 
kubenswrapper[4773]: I0120 18:51:54.091075 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 20 18:51:54 crc kubenswrapper[4773]: I0120 18:51:54.091563 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Jan 20 18:51:55 crc kubenswrapper[4773]: I0120 18:51:55.106158 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="2dac51db-1574-4ccc-bb9a-7c42548d90d3" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.191:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 18:51:55 crc kubenswrapper[4773]: I0120 18:51:55.106177 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="2dac51db-1574-4ccc-bb9a-7c42548d90d3" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.191:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Jan 20 18:51:56 crc kubenswrapper[4773]: I0120 18:51:56.864258 4773 generic.go:334] "Generic (PLEG): container finished" podID="24c1bc90-8fe0-41b4-a7ba-7e15bc787386" containerID="7292aa755477db4cf6213001ef3f22ae8bf8250a6e1509481020c0348c2c81e0" exitCode=0 Jan 20 18:51:56 crc kubenswrapper[4773]: I0120 18:51:56.864308 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-b4hvr" event={"ID":"24c1bc90-8fe0-41b4-a7ba-7e15bc787386","Type":"ContainerDied","Data":"7292aa755477db4cf6213001ef3f22ae8bf8250a6e1509481020c0348c2c81e0"} Jan 20 18:51:58 crc kubenswrapper[4773]: I0120 18:51:58.169712 4773 patch_prober.go:28] interesting pod/machine-config-daemon-sq4x7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 18:51:58 crc 
kubenswrapper[4773]: I0120 18:51:58.170135 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 18:51:58 crc kubenswrapper[4773]: I0120 18:51:58.220547 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-b4hvr" Jan 20 18:51:58 crc kubenswrapper[4773]: I0120 18:51:58.358135 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s2krp\" (UniqueName: \"kubernetes.io/projected/24c1bc90-8fe0-41b4-a7ba-7e15bc787386-kube-api-access-s2krp\") pod \"24c1bc90-8fe0-41b4-a7ba-7e15bc787386\" (UID: \"24c1bc90-8fe0-41b4-a7ba-7e15bc787386\") " Jan 20 18:51:58 crc kubenswrapper[4773]: I0120 18:51:58.358484 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24c1bc90-8fe0-41b4-a7ba-7e15bc787386-config-data\") pod \"24c1bc90-8fe0-41b4-a7ba-7e15bc787386\" (UID: \"24c1bc90-8fe0-41b4-a7ba-7e15bc787386\") " Jan 20 18:51:58 crc kubenswrapper[4773]: I0120 18:51:58.358694 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24c1bc90-8fe0-41b4-a7ba-7e15bc787386-combined-ca-bundle\") pod \"24c1bc90-8fe0-41b4-a7ba-7e15bc787386\" (UID: \"24c1bc90-8fe0-41b4-a7ba-7e15bc787386\") " Jan 20 18:51:58 crc kubenswrapper[4773]: I0120 18:51:58.359411 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24c1bc90-8fe0-41b4-a7ba-7e15bc787386-scripts\") pod \"24c1bc90-8fe0-41b4-a7ba-7e15bc787386\" (UID: \"24c1bc90-8fe0-41b4-a7ba-7e15bc787386\") " Jan 20 18:51:58 crc 
kubenswrapper[4773]: I0120 18:51:58.364847 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24c1bc90-8fe0-41b4-a7ba-7e15bc787386-scripts" (OuterVolumeSpecName: "scripts") pod "24c1bc90-8fe0-41b4-a7ba-7e15bc787386" (UID: "24c1bc90-8fe0-41b4-a7ba-7e15bc787386"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:51:58 crc kubenswrapper[4773]: I0120 18:51:58.365159 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24c1bc90-8fe0-41b4-a7ba-7e15bc787386-kube-api-access-s2krp" (OuterVolumeSpecName: "kube-api-access-s2krp") pod "24c1bc90-8fe0-41b4-a7ba-7e15bc787386" (UID: "24c1bc90-8fe0-41b4-a7ba-7e15bc787386"). InnerVolumeSpecName "kube-api-access-s2krp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:51:58 crc kubenswrapper[4773]: I0120 18:51:58.389136 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24c1bc90-8fe0-41b4-a7ba-7e15bc787386-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "24c1bc90-8fe0-41b4-a7ba-7e15bc787386" (UID: "24c1bc90-8fe0-41b4-a7ba-7e15bc787386"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:51:58 crc kubenswrapper[4773]: I0120 18:51:58.393472 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24c1bc90-8fe0-41b4-a7ba-7e15bc787386-config-data" (OuterVolumeSpecName: "config-data") pod "24c1bc90-8fe0-41b4-a7ba-7e15bc787386" (UID: "24c1bc90-8fe0-41b4-a7ba-7e15bc787386"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:51:58 crc kubenswrapper[4773]: I0120 18:51:58.462089 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s2krp\" (UniqueName: \"kubernetes.io/projected/24c1bc90-8fe0-41b4-a7ba-7e15bc787386-kube-api-access-s2krp\") on node \"crc\" DevicePath \"\"" Jan 20 18:51:58 crc kubenswrapper[4773]: I0120 18:51:58.462380 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24c1bc90-8fe0-41b4-a7ba-7e15bc787386-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 18:51:58 crc kubenswrapper[4773]: I0120 18:51:58.462448 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/24c1bc90-8fe0-41b4-a7ba-7e15bc787386-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 18:51:58 crc kubenswrapper[4773]: I0120 18:51:58.462501 4773 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/24c1bc90-8fe0-41b4-a7ba-7e15bc787386-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 18:51:58 crc kubenswrapper[4773]: I0120 18:51:58.880261 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-b4hvr" event={"ID":"24c1bc90-8fe0-41b4-a7ba-7e15bc787386","Type":"ContainerDied","Data":"9a25eec4366d91b90b65b99e42e325114274745c1c71f7a67903dc1d6c3a9bad"} Jan 20 18:51:58 crc kubenswrapper[4773]: I0120 18:51:58.880304 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9a25eec4366d91b90b65b99e42e325114274745c1c71f7a67903dc1d6c3a9bad" Jan 20 18:51:58 crc kubenswrapper[4773]: I0120 18:51:58.880581 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-b4hvr" Jan 20 18:51:59 crc kubenswrapper[4773]: I0120 18:51:59.070564 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 20 18:51:59 crc kubenswrapper[4773]: I0120 18:51:59.071162 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="2dac51db-1574-4ccc-bb9a-7c42548d90d3" containerName="nova-api-log" containerID="cri-o://5a9203dd4f0d1f79bea5f9b8f067d743dd5e44cee0af5f4f652de389aba6a660" gracePeriod=30 Jan 20 18:51:59 crc kubenswrapper[4773]: I0120 18:51:59.071247 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="2dac51db-1574-4ccc-bb9a-7c42548d90d3" containerName="nova-api-api" containerID="cri-o://344e5ae008f07185d952898ce38b939b918d5bd6e65f28e4bdd85c43dd0ace0a" gracePeriod=30 Jan 20 18:51:59 crc kubenswrapper[4773]: I0120 18:51:59.083992 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 20 18:51:59 crc kubenswrapper[4773]: I0120 18:51:59.084261 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="8a5e1be7-c022-49b6-aa10-d23451918579" containerName="nova-scheduler-scheduler" containerID="cri-o://52d885f9b8206758ecad4b33ca14687b40c94ac98c6714194249d8db0d6df79b" gracePeriod=30 Jan 20 18:51:59 crc kubenswrapper[4773]: I0120 18:51:59.122044 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 20 18:51:59 crc kubenswrapper[4773]: I0120 18:51:59.122309 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="fb9c6096-2ce8-4b43-a638-50374d21d621" containerName="nova-metadata-log" containerID="cri-o://e7162f306d650a9c27fede8c9f54e40ee9d7eb098e20b1324e1902c7c106c43d" gracePeriod=30 Jan 20 18:51:59 crc kubenswrapper[4773]: I0120 18:51:59.122457 4773 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="fb9c6096-2ce8-4b43-a638-50374d21d621" containerName="nova-metadata-metadata" containerID="cri-o://f9cd4e24c3b3532622a9b2a6bbadad7f6004a3ed29c6f5eb40b55e5b14709d14" gracePeriod=30 Jan 20 18:51:59 crc kubenswrapper[4773]: E0120 18:51:59.886754 4773 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 52d885f9b8206758ecad4b33ca14687b40c94ac98c6714194249d8db0d6df79b is running failed: container process not found" containerID="52d885f9b8206758ecad4b33ca14687b40c94ac98c6714194249d8db0d6df79b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 20 18:51:59 crc kubenswrapper[4773]: E0120 18:51:59.887431 4773 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 52d885f9b8206758ecad4b33ca14687b40c94ac98c6714194249d8db0d6df79b is running failed: container process not found" containerID="52d885f9b8206758ecad4b33ca14687b40c94ac98c6714194249d8db0d6df79b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 20 18:51:59 crc kubenswrapper[4773]: E0120 18:51:59.887911 4773 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 52d885f9b8206758ecad4b33ca14687b40c94ac98c6714194249d8db0d6df79b is running failed: container process not found" containerID="52d885f9b8206758ecad4b33ca14687b40c94ac98c6714194249d8db0d6df79b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Jan 20 18:51:59 crc kubenswrapper[4773]: E0120 18:51:59.888033 4773 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 52d885f9b8206758ecad4b33ca14687b40c94ac98c6714194249d8db0d6df79b is running failed: container process not found" 
probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="8a5e1be7-c022-49b6-aa10-d23451918579" containerName="nova-scheduler-scheduler" Jan 20 18:51:59 crc kubenswrapper[4773]: I0120 18:51:59.892087 4773 generic.go:334] "Generic (PLEG): container finished" podID="fb9c6096-2ce8-4b43-a638-50374d21d621" containerID="e7162f306d650a9c27fede8c9f54e40ee9d7eb098e20b1324e1902c7c106c43d" exitCode=143 Jan 20 18:51:59 crc kubenswrapper[4773]: I0120 18:51:59.892159 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fb9c6096-2ce8-4b43-a638-50374d21d621","Type":"ContainerDied","Data":"e7162f306d650a9c27fede8c9f54e40ee9d7eb098e20b1324e1902c7c106c43d"} Jan 20 18:51:59 crc kubenswrapper[4773]: I0120 18:51:59.895676 4773 generic.go:334] "Generic (PLEG): container finished" podID="2dac51db-1574-4ccc-bb9a-7c42548d90d3" containerID="5a9203dd4f0d1f79bea5f9b8f067d743dd5e44cee0af5f4f652de389aba6a660" exitCode=143 Jan 20 18:51:59 crc kubenswrapper[4773]: I0120 18:51:59.895765 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2dac51db-1574-4ccc-bb9a-7c42548d90d3","Type":"ContainerDied","Data":"5a9203dd4f0d1f79bea5f9b8f067d743dd5e44cee0af5f4f652de389aba6a660"} Jan 20 18:51:59 crc kubenswrapper[4773]: I0120 18:51:59.898323 4773 generic.go:334] "Generic (PLEG): container finished" podID="8a5e1be7-c022-49b6-aa10-d23451918579" containerID="52d885f9b8206758ecad4b33ca14687b40c94ac98c6714194249d8db0d6df79b" exitCode=0 Jan 20 18:51:59 crc kubenswrapper[4773]: I0120 18:51:59.898373 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"8a5e1be7-c022-49b6-aa10-d23451918579","Type":"ContainerDied","Data":"52d885f9b8206758ecad4b33ca14687b40c94ac98c6714194249d8db0d6df79b"} Jan 20 18:52:00 crc kubenswrapper[4773]: I0120 18:52:00.091595 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 20 18:52:00 crc kubenswrapper[4773]: I0120 18:52:00.193835 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n4rnl\" (UniqueName: \"kubernetes.io/projected/8a5e1be7-c022-49b6-aa10-d23451918579-kube-api-access-n4rnl\") pod \"8a5e1be7-c022-49b6-aa10-d23451918579\" (UID: \"8a5e1be7-c022-49b6-aa10-d23451918579\") " Jan 20 18:52:00 crc kubenswrapper[4773]: I0120 18:52:00.193917 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a5e1be7-c022-49b6-aa10-d23451918579-config-data\") pod \"8a5e1be7-c022-49b6-aa10-d23451918579\" (UID: \"8a5e1be7-c022-49b6-aa10-d23451918579\") " Jan 20 18:52:00 crc kubenswrapper[4773]: I0120 18:52:00.194034 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a5e1be7-c022-49b6-aa10-d23451918579-combined-ca-bundle\") pod \"8a5e1be7-c022-49b6-aa10-d23451918579\" (UID: \"8a5e1be7-c022-49b6-aa10-d23451918579\") " Jan 20 18:52:00 crc kubenswrapper[4773]: I0120 18:52:00.203607 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a5e1be7-c022-49b6-aa10-d23451918579-kube-api-access-n4rnl" (OuterVolumeSpecName: "kube-api-access-n4rnl") pod "8a5e1be7-c022-49b6-aa10-d23451918579" (UID: "8a5e1be7-c022-49b6-aa10-d23451918579"). InnerVolumeSpecName "kube-api-access-n4rnl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:52:00 crc kubenswrapper[4773]: I0120 18:52:00.222683 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a5e1be7-c022-49b6-aa10-d23451918579-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8a5e1be7-c022-49b6-aa10-d23451918579" (UID: "8a5e1be7-c022-49b6-aa10-d23451918579"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:52:00 crc kubenswrapper[4773]: I0120 18:52:00.227153 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a5e1be7-c022-49b6-aa10-d23451918579-config-data" (OuterVolumeSpecName: "config-data") pod "8a5e1be7-c022-49b6-aa10-d23451918579" (UID: "8a5e1be7-c022-49b6-aa10-d23451918579"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:52:00 crc kubenswrapper[4773]: I0120 18:52:00.296293 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n4rnl\" (UniqueName: \"kubernetes.io/projected/8a5e1be7-c022-49b6-aa10-d23451918579-kube-api-access-n4rnl\") on node \"crc\" DevicePath \"\"" Jan 20 18:52:00 crc kubenswrapper[4773]: I0120 18:52:00.296331 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8a5e1be7-c022-49b6-aa10-d23451918579-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 18:52:00 crc kubenswrapper[4773]: I0120 18:52:00.296341 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8a5e1be7-c022-49b6-aa10-d23451918579-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 18:52:00 crc kubenswrapper[4773]: I0120 18:52:00.908830 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"8a5e1be7-c022-49b6-aa10-d23451918579","Type":"ContainerDied","Data":"220ce0272e6c403af39eafd69462c79884cab6d380e200929ec876ec12a03a06"} Jan 20 18:52:00 crc kubenswrapper[4773]: I0120 18:52:00.908905 4773 scope.go:117] "RemoveContainer" containerID="52d885f9b8206758ecad4b33ca14687b40c94ac98c6714194249d8db0d6df79b" Jan 20 18:52:00 crc kubenswrapper[4773]: I0120 18:52:00.908908 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 20 18:52:00 crc kubenswrapper[4773]: I0120 18:52:00.940278 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Jan 20 18:52:00 crc kubenswrapper[4773]: I0120 18:52:00.948197 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Jan 20 18:52:00 crc kubenswrapper[4773]: I0120 18:52:00.959473 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Jan 20 18:52:00 crc kubenswrapper[4773]: E0120 18:52:00.959863 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8a5e1be7-c022-49b6-aa10-d23451918579" containerName="nova-scheduler-scheduler" Jan 20 18:52:00 crc kubenswrapper[4773]: I0120 18:52:00.959884 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a5e1be7-c022-49b6-aa10-d23451918579" containerName="nova-scheduler-scheduler" Jan 20 18:52:00 crc kubenswrapper[4773]: E0120 18:52:00.959896 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24c1bc90-8fe0-41b4-a7ba-7e15bc787386" containerName="nova-manage" Jan 20 18:52:00 crc kubenswrapper[4773]: I0120 18:52:00.959903 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="24c1bc90-8fe0-41b4-a7ba-7e15bc787386" containerName="nova-manage" Jan 20 18:52:00 crc kubenswrapper[4773]: I0120 18:52:00.960098 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="8a5e1be7-c022-49b6-aa10-d23451918579" containerName="nova-scheduler-scheduler" Jan 20 18:52:00 crc kubenswrapper[4773]: I0120 18:52:00.960130 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="24c1bc90-8fe0-41b4-a7ba-7e15bc787386" containerName="nova-manage" Jan 20 18:52:00 crc kubenswrapper[4773]: I0120 18:52:00.960732 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 20 18:52:00 crc kubenswrapper[4773]: I0120 18:52:00.967361 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Jan 20 18:52:00 crc kubenswrapper[4773]: I0120 18:52:00.977671 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 20 18:52:01 crc kubenswrapper[4773]: I0120 18:52:01.111915 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qx7w9\" (UniqueName: \"kubernetes.io/projected/8413ef33-749f-4413-9965-fd19ad70ebfc-kube-api-access-qx7w9\") pod \"nova-scheduler-0\" (UID: \"8413ef33-749f-4413-9965-fd19ad70ebfc\") " pod="openstack/nova-scheduler-0" Jan 20 18:52:01 crc kubenswrapper[4773]: I0120 18:52:01.111997 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8413ef33-749f-4413-9965-fd19ad70ebfc-config-data\") pod \"nova-scheduler-0\" (UID: \"8413ef33-749f-4413-9965-fd19ad70ebfc\") " pod="openstack/nova-scheduler-0" Jan 20 18:52:01 crc kubenswrapper[4773]: I0120 18:52:01.112059 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8413ef33-749f-4413-9965-fd19ad70ebfc-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"8413ef33-749f-4413-9965-fd19ad70ebfc\") " pod="openstack/nova-scheduler-0" Jan 20 18:52:01 crc kubenswrapper[4773]: I0120 18:52:01.213752 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qx7w9\" (UniqueName: \"kubernetes.io/projected/8413ef33-749f-4413-9965-fd19ad70ebfc-kube-api-access-qx7w9\") pod \"nova-scheduler-0\" (UID: \"8413ef33-749f-4413-9965-fd19ad70ebfc\") " pod="openstack/nova-scheduler-0" Jan 20 18:52:01 crc kubenswrapper[4773]: I0120 18:52:01.216701 4773 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8413ef33-749f-4413-9965-fd19ad70ebfc-config-data\") pod \"nova-scheduler-0\" (UID: \"8413ef33-749f-4413-9965-fd19ad70ebfc\") " pod="openstack/nova-scheduler-0" Jan 20 18:52:01 crc kubenswrapper[4773]: I0120 18:52:01.216815 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8413ef33-749f-4413-9965-fd19ad70ebfc-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"8413ef33-749f-4413-9965-fd19ad70ebfc\") " pod="openstack/nova-scheduler-0" Jan 20 18:52:01 crc kubenswrapper[4773]: I0120 18:52:01.228086 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8413ef33-749f-4413-9965-fd19ad70ebfc-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"8413ef33-749f-4413-9965-fd19ad70ebfc\") " pod="openstack/nova-scheduler-0" Jan 20 18:52:01 crc kubenswrapper[4773]: I0120 18:52:01.229263 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8413ef33-749f-4413-9965-fd19ad70ebfc-config-data\") pod \"nova-scheduler-0\" (UID: \"8413ef33-749f-4413-9965-fd19ad70ebfc\") " pod="openstack/nova-scheduler-0" Jan 20 18:52:01 crc kubenswrapper[4773]: I0120 18:52:01.230996 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qx7w9\" (UniqueName: \"kubernetes.io/projected/8413ef33-749f-4413-9965-fd19ad70ebfc-kube-api-access-qx7w9\") pod \"nova-scheduler-0\" (UID: \"8413ef33-749f-4413-9965-fd19ad70ebfc\") " pod="openstack/nova-scheduler-0" Jan 20 18:52:01 crc kubenswrapper[4773]: I0120 18:52:01.283191 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Jan 20 18:52:01 crc kubenswrapper[4773]: I0120 18:52:01.459454 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a5e1be7-c022-49b6-aa10-d23451918579" path="/var/lib/kubelet/pods/8a5e1be7-c022-49b6-aa10-d23451918579/volumes" Jan 20 18:52:01 crc kubenswrapper[4773]: I0120 18:52:01.731135 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Jan 20 18:52:01 crc kubenswrapper[4773]: I0120 18:52:01.922500 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"8413ef33-749f-4413-9965-fd19ad70ebfc","Type":"ContainerStarted","Data":"56d14f0a4aa486f66422e52f4881e665155c3c059013618e9770c8ffd9037759"} Jan 20 18:52:01 crc kubenswrapper[4773]: I0120 18:52:01.922546 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"8413ef33-749f-4413-9965-fd19ad70ebfc","Type":"ContainerStarted","Data":"ff098fe1dd4f3d2186d0173403cbe18aab23828ba5d4ac096d781bffd65b8b9b"} Jan 20 18:52:01 crc kubenswrapper[4773]: I0120 18:52:01.938008 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.937993439 podStartE2EDuration="1.937993439s" podCreationTimestamp="2026-01-20 18:52:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:52:01.936807331 +0000 UTC m=+1314.858620355" watchObservedRunningTime="2026-01-20 18:52:01.937993439 +0000 UTC m=+1314.859806463" Jan 20 18:52:02 crc kubenswrapper[4773]: I0120 18:52:02.259017 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="fb9c6096-2ce8-4b43-a638-50374d21d621" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.184:8775/\": read tcp 10.217.0.2:35564->10.217.0.184:8775: read: connection reset by 
peer" Jan 20 18:52:02 crc kubenswrapper[4773]: I0120 18:52:02.259061 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="fb9c6096-2ce8-4b43-a638-50374d21d621" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.184:8775/\": read tcp 10.217.0.2:35576->10.217.0.184:8775: read: connection reset by peer" Jan 20 18:52:02 crc kubenswrapper[4773]: I0120 18:52:02.640721 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 20 18:52:02 crc kubenswrapper[4773]: I0120 18:52:02.646779 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 20 18:52:02 crc kubenswrapper[4773]: I0120 18:52:02.747020 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb9c6096-2ce8-4b43-a638-50374d21d621-logs\") pod \"fb9c6096-2ce8-4b43-a638-50374d21d621\" (UID: \"fb9c6096-2ce8-4b43-a638-50374d21d621\") " Jan 20 18:52:02 crc kubenswrapper[4773]: I0120 18:52:02.747076 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2dac51db-1574-4ccc-bb9a-7c42548d90d3-config-data\") pod \"2dac51db-1574-4ccc-bb9a-7c42548d90d3\" (UID: \"2dac51db-1574-4ccc-bb9a-7c42548d90d3\") " Jan 20 18:52:02 crc kubenswrapper[4773]: I0120 18:52:02.747095 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2dac51db-1574-4ccc-bb9a-7c42548d90d3-combined-ca-bundle\") pod \"2dac51db-1574-4ccc-bb9a-7c42548d90d3\" (UID: \"2dac51db-1574-4ccc-bb9a-7c42548d90d3\") " Jan 20 18:52:02 crc kubenswrapper[4773]: I0120 18:52:02.747124 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cmtxc\" (UniqueName: 
\"kubernetes.io/projected/2dac51db-1574-4ccc-bb9a-7c42548d90d3-kube-api-access-cmtxc\") pod \"2dac51db-1574-4ccc-bb9a-7c42548d90d3\" (UID: \"2dac51db-1574-4ccc-bb9a-7c42548d90d3\") " Jan 20 18:52:02 crc kubenswrapper[4773]: I0120 18:52:02.747166 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2dac51db-1574-4ccc-bb9a-7c42548d90d3-logs\") pod \"2dac51db-1574-4ccc-bb9a-7c42548d90d3\" (UID: \"2dac51db-1574-4ccc-bb9a-7c42548d90d3\") " Jan 20 18:52:02 crc kubenswrapper[4773]: I0120 18:52:02.747209 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2dac51db-1574-4ccc-bb9a-7c42548d90d3-internal-tls-certs\") pod \"2dac51db-1574-4ccc-bb9a-7c42548d90d3\" (UID: \"2dac51db-1574-4ccc-bb9a-7c42548d90d3\") " Jan 20 18:52:02 crc kubenswrapper[4773]: I0120 18:52:02.747252 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb9c6096-2ce8-4b43-a638-50374d21d621-combined-ca-bundle\") pod \"fb9c6096-2ce8-4b43-a638-50374d21d621\" (UID: \"fb9c6096-2ce8-4b43-a638-50374d21d621\") " Jan 20 18:52:02 crc kubenswrapper[4773]: I0120 18:52:02.747313 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb9c6096-2ce8-4b43-a638-50374d21d621-nova-metadata-tls-certs\") pod \"fb9c6096-2ce8-4b43-a638-50374d21d621\" (UID: \"fb9c6096-2ce8-4b43-a638-50374d21d621\") " Jan 20 18:52:02 crc kubenswrapper[4773]: I0120 18:52:02.747375 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tgw2z\" (UniqueName: \"kubernetes.io/projected/fb9c6096-2ce8-4b43-a638-50374d21d621-kube-api-access-tgw2z\") pod \"fb9c6096-2ce8-4b43-a638-50374d21d621\" (UID: \"fb9c6096-2ce8-4b43-a638-50374d21d621\") " Jan 20 18:52:02 crc 
kubenswrapper[4773]: I0120 18:52:02.747416 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb9c6096-2ce8-4b43-a638-50374d21d621-config-data\") pod \"fb9c6096-2ce8-4b43-a638-50374d21d621\" (UID: \"fb9c6096-2ce8-4b43-a638-50374d21d621\") " Jan 20 18:52:02 crc kubenswrapper[4773]: I0120 18:52:02.747509 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2dac51db-1574-4ccc-bb9a-7c42548d90d3-public-tls-certs\") pod \"2dac51db-1574-4ccc-bb9a-7c42548d90d3\" (UID: \"2dac51db-1574-4ccc-bb9a-7c42548d90d3\") " Jan 20 18:52:02 crc kubenswrapper[4773]: I0120 18:52:02.747910 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2dac51db-1574-4ccc-bb9a-7c42548d90d3-logs" (OuterVolumeSpecName: "logs") pod "2dac51db-1574-4ccc-bb9a-7c42548d90d3" (UID: "2dac51db-1574-4ccc-bb9a-7c42548d90d3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:52:02 crc kubenswrapper[4773]: I0120 18:52:02.748234 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb9c6096-2ce8-4b43-a638-50374d21d621-logs" (OuterVolumeSpecName: "logs") pod "fb9c6096-2ce8-4b43-a638-50374d21d621" (UID: "fb9c6096-2ce8-4b43-a638-50374d21d621"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:52:02 crc kubenswrapper[4773]: I0120 18:52:02.753811 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb9c6096-2ce8-4b43-a638-50374d21d621-kube-api-access-tgw2z" (OuterVolumeSpecName: "kube-api-access-tgw2z") pod "fb9c6096-2ce8-4b43-a638-50374d21d621" (UID: "fb9c6096-2ce8-4b43-a638-50374d21d621"). InnerVolumeSpecName "kube-api-access-tgw2z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:52:02 crc kubenswrapper[4773]: I0120 18:52:02.755357 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2dac51db-1574-4ccc-bb9a-7c42548d90d3-kube-api-access-cmtxc" (OuterVolumeSpecName: "kube-api-access-cmtxc") pod "2dac51db-1574-4ccc-bb9a-7c42548d90d3" (UID: "2dac51db-1574-4ccc-bb9a-7c42548d90d3"). InnerVolumeSpecName "kube-api-access-cmtxc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:52:02 crc kubenswrapper[4773]: I0120 18:52:02.779670 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2dac51db-1574-4ccc-bb9a-7c42548d90d3-config-data" (OuterVolumeSpecName: "config-data") pod "2dac51db-1574-4ccc-bb9a-7c42548d90d3" (UID: "2dac51db-1574-4ccc-bb9a-7c42548d90d3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:52:02 crc kubenswrapper[4773]: I0120 18:52:02.784122 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb9c6096-2ce8-4b43-a638-50374d21d621-config-data" (OuterVolumeSpecName: "config-data") pod "fb9c6096-2ce8-4b43-a638-50374d21d621" (UID: "fb9c6096-2ce8-4b43-a638-50374d21d621"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:52:02 crc kubenswrapper[4773]: I0120 18:52:02.795149 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb9c6096-2ce8-4b43-a638-50374d21d621-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fb9c6096-2ce8-4b43-a638-50374d21d621" (UID: "fb9c6096-2ce8-4b43-a638-50374d21d621"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:52:02 crc kubenswrapper[4773]: I0120 18:52:02.798896 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2dac51db-1574-4ccc-bb9a-7c42548d90d3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2dac51db-1574-4ccc-bb9a-7c42548d90d3" (UID: "2dac51db-1574-4ccc-bb9a-7c42548d90d3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:52:02 crc kubenswrapper[4773]: I0120 18:52:02.809875 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2dac51db-1574-4ccc-bb9a-7c42548d90d3-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "2dac51db-1574-4ccc-bb9a-7c42548d90d3" (UID: "2dac51db-1574-4ccc-bb9a-7c42548d90d3"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:52:02 crc kubenswrapper[4773]: I0120 18:52:02.810318 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb9c6096-2ce8-4b43-a638-50374d21d621-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "fb9c6096-2ce8-4b43-a638-50374d21d621" (UID: "fb9c6096-2ce8-4b43-a638-50374d21d621"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:52:02 crc kubenswrapper[4773]: I0120 18:52:02.813021 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2dac51db-1574-4ccc-bb9a-7c42548d90d3-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "2dac51db-1574-4ccc-bb9a-7c42548d90d3" (UID: "2dac51db-1574-4ccc-bb9a-7c42548d90d3"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:52:02 crc kubenswrapper[4773]: I0120 18:52:02.850872 4773 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2dac51db-1574-4ccc-bb9a-7c42548d90d3-logs\") on node \"crc\" DevicePath \"\"" Jan 20 18:52:02 crc kubenswrapper[4773]: I0120 18:52:02.850908 4773 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/2dac51db-1574-4ccc-bb9a-7c42548d90d3-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 18:52:02 crc kubenswrapper[4773]: I0120 18:52:02.850923 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb9c6096-2ce8-4b43-a638-50374d21d621-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 18:52:02 crc kubenswrapper[4773]: I0120 18:52:02.850955 4773 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/fb9c6096-2ce8-4b43-a638-50374d21d621-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 18:52:02 crc kubenswrapper[4773]: I0120 18:52:02.850965 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tgw2z\" (UniqueName: \"kubernetes.io/projected/fb9c6096-2ce8-4b43-a638-50374d21d621-kube-api-access-tgw2z\") on node \"crc\" DevicePath \"\"" Jan 20 18:52:02 crc kubenswrapper[4773]: I0120 18:52:02.850973 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fb9c6096-2ce8-4b43-a638-50374d21d621-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 18:52:02 crc kubenswrapper[4773]: I0120 18:52:02.850984 4773 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/2dac51db-1574-4ccc-bb9a-7c42548d90d3-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 18:52:02 crc kubenswrapper[4773]: I0120 
18:52:02.850995 4773 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fb9c6096-2ce8-4b43-a638-50374d21d621-logs\") on node \"crc\" DevicePath \"\"" Jan 20 18:52:02 crc kubenswrapper[4773]: I0120 18:52:02.851004 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2dac51db-1574-4ccc-bb9a-7c42548d90d3-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 18:52:02 crc kubenswrapper[4773]: I0120 18:52:02.851012 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2dac51db-1574-4ccc-bb9a-7c42548d90d3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 18:52:02 crc kubenswrapper[4773]: I0120 18:52:02.851021 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cmtxc\" (UniqueName: \"kubernetes.io/projected/2dac51db-1574-4ccc-bb9a-7c42548d90d3-kube-api-access-cmtxc\") on node \"crc\" DevicePath \"\"" Jan 20 18:52:02 crc kubenswrapper[4773]: I0120 18:52:02.934056 4773 generic.go:334] "Generic (PLEG): container finished" podID="2dac51db-1574-4ccc-bb9a-7c42548d90d3" containerID="344e5ae008f07185d952898ce38b939b918d5bd6e65f28e4bdd85c43dd0ace0a" exitCode=0 Jan 20 18:52:02 crc kubenswrapper[4773]: I0120 18:52:02.934124 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2dac51db-1574-4ccc-bb9a-7c42548d90d3","Type":"ContainerDied","Data":"344e5ae008f07185d952898ce38b939b918d5bd6e65f28e4bdd85c43dd0ace0a"} Jan 20 18:52:02 crc kubenswrapper[4773]: I0120 18:52:02.934157 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2dac51db-1574-4ccc-bb9a-7c42548d90d3","Type":"ContainerDied","Data":"be5b8861809388d82da03a233f610c1c0601dcd3da676a63a2eea6fbb34fe0ac"} Jan 20 18:52:02 crc kubenswrapper[4773]: I0120 18:52:02.934176 4773 scope.go:117] "RemoveContainer" 
containerID="344e5ae008f07185d952898ce38b939b918d5bd6e65f28e4bdd85c43dd0ace0a" Jan 20 18:52:02 crc kubenswrapper[4773]: I0120 18:52:02.934370 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 20 18:52:02 crc kubenswrapper[4773]: I0120 18:52:02.939777 4773 generic.go:334] "Generic (PLEG): container finished" podID="fb9c6096-2ce8-4b43-a638-50374d21d621" containerID="f9cd4e24c3b3532622a9b2a6bbadad7f6004a3ed29c6f5eb40b55e5b14709d14" exitCode=0 Jan 20 18:52:02 crc kubenswrapper[4773]: I0120 18:52:02.940681 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 20 18:52:02 crc kubenswrapper[4773]: I0120 18:52:02.942218 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fb9c6096-2ce8-4b43-a638-50374d21d621","Type":"ContainerDied","Data":"f9cd4e24c3b3532622a9b2a6bbadad7f6004a3ed29c6f5eb40b55e5b14709d14"} Jan 20 18:52:02 crc kubenswrapper[4773]: I0120 18:52:02.942293 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"fb9c6096-2ce8-4b43-a638-50374d21d621","Type":"ContainerDied","Data":"1fbd29fc0590b0cd2b67866d481ca79cacd0b7e614ff050ba774f3244fed5e4a"} Jan 20 18:52:02 crc kubenswrapper[4773]: I0120 18:52:02.969466 4773 scope.go:117] "RemoveContainer" containerID="5a9203dd4f0d1f79bea5f9b8f067d743dd5e44cee0af5f4f652de389aba6a660" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.001452 4773 scope.go:117] "RemoveContainer" containerID="344e5ae008f07185d952898ce38b939b918d5bd6e65f28e4bdd85c43dd0ace0a" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.004181 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Jan 20 18:52:03 crc kubenswrapper[4773]: E0120 18:52:03.018443 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"344e5ae008f07185d952898ce38b939b918d5bd6e65f28e4bdd85c43dd0ace0a\": container with ID starting with 344e5ae008f07185d952898ce38b939b918d5bd6e65f28e4bdd85c43dd0ace0a not found: ID does not exist" containerID="344e5ae008f07185d952898ce38b939b918d5bd6e65f28e4bdd85c43dd0ace0a" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.018516 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"344e5ae008f07185d952898ce38b939b918d5bd6e65f28e4bdd85c43dd0ace0a"} err="failed to get container status \"344e5ae008f07185d952898ce38b939b918d5bd6e65f28e4bdd85c43dd0ace0a\": rpc error: code = NotFound desc = could not find container \"344e5ae008f07185d952898ce38b939b918d5bd6e65f28e4bdd85c43dd0ace0a\": container with ID starting with 344e5ae008f07185d952898ce38b939b918d5bd6e65f28e4bdd85c43dd0ace0a not found: ID does not exist" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.018572 4773 scope.go:117] "RemoveContainer" containerID="5a9203dd4f0d1f79bea5f9b8f067d743dd5e44cee0af5f4f652de389aba6a660" Jan 20 18:52:03 crc kubenswrapper[4773]: E0120 18:52:03.019581 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a9203dd4f0d1f79bea5f9b8f067d743dd5e44cee0af5f4f652de389aba6a660\": container with ID starting with 5a9203dd4f0d1f79bea5f9b8f067d743dd5e44cee0af5f4f652de389aba6a660 not found: ID does not exist" containerID="5a9203dd4f0d1f79bea5f9b8f067d743dd5e44cee0af5f4f652de389aba6a660" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.019638 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a9203dd4f0d1f79bea5f9b8f067d743dd5e44cee0af5f4f652de389aba6a660"} err="failed to get container status \"5a9203dd4f0d1f79bea5f9b8f067d743dd5e44cee0af5f4f652de389aba6a660\": rpc error: code = NotFound desc = could not find container \"5a9203dd4f0d1f79bea5f9b8f067d743dd5e44cee0af5f4f652de389aba6a660\": container with ID 
starting with 5a9203dd4f0d1f79bea5f9b8f067d743dd5e44cee0af5f4f652de389aba6a660 not found: ID does not exist" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.019656 4773 scope.go:117] "RemoveContainer" containerID="f9cd4e24c3b3532622a9b2a6bbadad7f6004a3ed29c6f5eb40b55e5b14709d14" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.035029 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.052707 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.055761 4773 scope.go:117] "RemoveContainer" containerID="e7162f306d650a9c27fede8c9f54e40ee9d7eb098e20b1324e1902c7c106c43d" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.061899 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Jan 20 18:52:03 crc kubenswrapper[4773]: E0120 18:52:03.062356 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2dac51db-1574-4ccc-bb9a-7c42548d90d3" containerName="nova-api-api" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.062375 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="2dac51db-1574-4ccc-bb9a-7c42548d90d3" containerName="nova-api-api" Jan 20 18:52:03 crc kubenswrapper[4773]: E0120 18:52:03.062385 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2dac51db-1574-4ccc-bb9a-7c42548d90d3" containerName="nova-api-log" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.062391 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="2dac51db-1574-4ccc-bb9a-7c42548d90d3" containerName="nova-api-log" Jan 20 18:52:03 crc kubenswrapper[4773]: E0120 18:52:03.062404 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb9c6096-2ce8-4b43-a638-50374d21d621" containerName="nova-metadata-metadata" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.062410 4773 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="fb9c6096-2ce8-4b43-a638-50374d21d621" containerName="nova-metadata-metadata" Jan 20 18:52:03 crc kubenswrapper[4773]: E0120 18:52:03.062437 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb9c6096-2ce8-4b43-a638-50374d21d621" containerName="nova-metadata-log" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.062443 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb9c6096-2ce8-4b43-a638-50374d21d621" containerName="nova-metadata-log" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.062596 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="2dac51db-1574-4ccc-bb9a-7c42548d90d3" containerName="nova-api-log" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.062609 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb9c6096-2ce8-4b43-a638-50374d21d621" containerName="nova-metadata-log" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.062615 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb9c6096-2ce8-4b43-a638-50374d21d621" containerName="nova-metadata-metadata" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.062632 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="2dac51db-1574-4ccc-bb9a-7c42548d90d3" containerName="nova-api-api" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.063709 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.066136 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.066151 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.066558 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.077448 4773 scope.go:117] "RemoveContainer" containerID="f9cd4e24c3b3532622a9b2a6bbadad7f6004a3ed29c6f5eb40b55e5b14709d14" Jan 20 18:52:03 crc kubenswrapper[4773]: E0120 18:52:03.080694 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9cd4e24c3b3532622a9b2a6bbadad7f6004a3ed29c6f5eb40b55e5b14709d14\": container with ID starting with f9cd4e24c3b3532622a9b2a6bbadad7f6004a3ed29c6f5eb40b55e5b14709d14 not found: ID does not exist" containerID="f9cd4e24c3b3532622a9b2a6bbadad7f6004a3ed29c6f5eb40b55e5b14709d14" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.080748 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9cd4e24c3b3532622a9b2a6bbadad7f6004a3ed29c6f5eb40b55e5b14709d14"} err="failed to get container status \"f9cd4e24c3b3532622a9b2a6bbadad7f6004a3ed29c6f5eb40b55e5b14709d14\": rpc error: code = NotFound desc = could not find container \"f9cd4e24c3b3532622a9b2a6bbadad7f6004a3ed29c6f5eb40b55e5b14709d14\": container with ID starting with f9cd4e24c3b3532622a9b2a6bbadad7f6004a3ed29c6f5eb40b55e5b14709d14 not found: ID does not exist" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.080782 4773 scope.go:117] "RemoveContainer" containerID="e7162f306d650a9c27fede8c9f54e40ee9d7eb098e20b1324e1902c7c106c43d" Jan 20 18:52:03 crc 
kubenswrapper[4773]: E0120 18:52:03.081181 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7162f306d650a9c27fede8c9f54e40ee9d7eb098e20b1324e1902c7c106c43d\": container with ID starting with e7162f306d650a9c27fede8c9f54e40ee9d7eb098e20b1324e1902c7c106c43d not found: ID does not exist" containerID="e7162f306d650a9c27fede8c9f54e40ee9d7eb098e20b1324e1902c7c106c43d" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.081206 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7162f306d650a9c27fede8c9f54e40ee9d7eb098e20b1324e1902c7c106c43d"} err="failed to get container status \"e7162f306d650a9c27fede8c9f54e40ee9d7eb098e20b1324e1902c7c106c43d\": rpc error: code = NotFound desc = could not find container \"e7162f306d650a9c27fede8c9f54e40ee9d7eb098e20b1324e1902c7c106c43d\": container with ID starting with e7162f306d650a9c27fede8c9f54e40ee9d7eb098e20b1324e1902c7c106c43d not found: ID does not exist" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.082239 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.090658 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.098146 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.100055 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.102161 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.102196 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.107637 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.155418 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f890481e-0c9f-4194-8af3-d808bb105995-config-data\") pod \"nova-api-0\" (UID: \"f890481e-0c9f-4194-8af3-d808bb105995\") " pod="openstack/nova-api-0" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.155462 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f890481e-0c9f-4194-8af3-d808bb105995-public-tls-certs\") pod \"nova-api-0\" (UID: \"f890481e-0c9f-4194-8af3-d808bb105995\") " pod="openstack/nova-api-0" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.155487 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f890481e-0c9f-4194-8af3-d808bb105995-logs\") pod \"nova-api-0\" (UID: \"f890481e-0c9f-4194-8af3-d808bb105995\") " pod="openstack/nova-api-0" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.155557 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ceeec9e1-d0f5-497c-b262-2ef81be261ee-config-data\") pod \"nova-metadata-0\" (UID: \"ceeec9e1-d0f5-497c-b262-2ef81be261ee\") " 
pod="openstack/nova-metadata-0" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.155604 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ceeec9e1-d0f5-497c-b262-2ef81be261ee-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ceeec9e1-d0f5-497c-b262-2ef81be261ee\") " pod="openstack/nova-metadata-0" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.155623 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ceeec9e1-d0f5-497c-b262-2ef81be261ee-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ceeec9e1-d0f5-497c-b262-2ef81be261ee\") " pod="openstack/nova-metadata-0" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.155708 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cm22d\" (UniqueName: \"kubernetes.io/projected/f890481e-0c9f-4194-8af3-d808bb105995-kube-api-access-cm22d\") pod \"nova-api-0\" (UID: \"f890481e-0c9f-4194-8af3-d808bb105995\") " pod="openstack/nova-api-0" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.155799 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ceeec9e1-d0f5-497c-b262-2ef81be261ee-logs\") pod \"nova-metadata-0\" (UID: \"ceeec9e1-d0f5-497c-b262-2ef81be261ee\") " pod="openstack/nova-metadata-0" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.155940 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f890481e-0c9f-4194-8af3-d808bb105995-internal-tls-certs\") pod \"nova-api-0\" (UID: \"f890481e-0c9f-4194-8af3-d808bb105995\") " pod="openstack/nova-api-0" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 
18:52:03.156060 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f890481e-0c9f-4194-8af3-d808bb105995-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f890481e-0c9f-4194-8af3-d808bb105995\") " pod="openstack/nova-api-0" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.156115 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kg7kr\" (UniqueName: \"kubernetes.io/projected/ceeec9e1-d0f5-497c-b262-2ef81be261ee-kube-api-access-kg7kr\") pod \"nova-metadata-0\" (UID: \"ceeec9e1-d0f5-497c-b262-2ef81be261ee\") " pod="openstack/nova-metadata-0" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.258081 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kg7kr\" (UniqueName: \"kubernetes.io/projected/ceeec9e1-d0f5-497c-b262-2ef81be261ee-kube-api-access-kg7kr\") pod \"nova-metadata-0\" (UID: \"ceeec9e1-d0f5-497c-b262-2ef81be261ee\") " pod="openstack/nova-metadata-0" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.258141 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f890481e-0c9f-4194-8af3-d808bb105995-config-data\") pod \"nova-api-0\" (UID: \"f890481e-0c9f-4194-8af3-d808bb105995\") " pod="openstack/nova-api-0" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.258179 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f890481e-0c9f-4194-8af3-d808bb105995-public-tls-certs\") pod \"nova-api-0\" (UID: \"f890481e-0c9f-4194-8af3-d808bb105995\") " pod="openstack/nova-api-0" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.258205 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/f890481e-0c9f-4194-8af3-d808bb105995-logs\") pod \"nova-api-0\" (UID: \"f890481e-0c9f-4194-8af3-d808bb105995\") " pod="openstack/nova-api-0" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.258243 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ceeec9e1-d0f5-497c-b262-2ef81be261ee-config-data\") pod \"nova-metadata-0\" (UID: \"ceeec9e1-d0f5-497c-b262-2ef81be261ee\") " pod="openstack/nova-metadata-0" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.258304 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ceeec9e1-d0f5-497c-b262-2ef81be261ee-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ceeec9e1-d0f5-497c-b262-2ef81be261ee\") " pod="openstack/nova-metadata-0" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.258324 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ceeec9e1-d0f5-497c-b262-2ef81be261ee-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ceeec9e1-d0f5-497c-b262-2ef81be261ee\") " pod="openstack/nova-metadata-0" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.258354 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cm22d\" (UniqueName: \"kubernetes.io/projected/f890481e-0c9f-4194-8af3-d808bb105995-kube-api-access-cm22d\") pod \"nova-api-0\" (UID: \"f890481e-0c9f-4194-8af3-d808bb105995\") " pod="openstack/nova-api-0" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.258381 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ceeec9e1-d0f5-497c-b262-2ef81be261ee-logs\") pod \"nova-metadata-0\" (UID: \"ceeec9e1-d0f5-497c-b262-2ef81be261ee\") " pod="openstack/nova-metadata-0" Jan 20 18:52:03 crc 
kubenswrapper[4773]: I0120 18:52:03.258432 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f890481e-0c9f-4194-8af3-d808bb105995-internal-tls-certs\") pod \"nova-api-0\" (UID: \"f890481e-0c9f-4194-8af3-d808bb105995\") " pod="openstack/nova-api-0" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.258486 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f890481e-0c9f-4194-8af3-d808bb105995-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f890481e-0c9f-4194-8af3-d808bb105995\") " pod="openstack/nova-api-0" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.259230 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ceeec9e1-d0f5-497c-b262-2ef81be261ee-logs\") pod \"nova-metadata-0\" (UID: \"ceeec9e1-d0f5-497c-b262-2ef81be261ee\") " pod="openstack/nova-metadata-0" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.259412 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f890481e-0c9f-4194-8af3-d808bb105995-logs\") pod \"nova-api-0\" (UID: \"f890481e-0c9f-4194-8af3-d808bb105995\") " pod="openstack/nova-api-0" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.262106 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f890481e-0c9f-4194-8af3-d808bb105995-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"f890481e-0c9f-4194-8af3-d808bb105995\") " pod="openstack/nova-api-0" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.262280 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ceeec9e1-d0f5-497c-b262-2ef81be261ee-config-data\") pod \"nova-metadata-0\" (UID: 
\"ceeec9e1-d0f5-497c-b262-2ef81be261ee\") " pod="openstack/nova-metadata-0" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.262393 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f890481e-0c9f-4194-8af3-d808bb105995-public-tls-certs\") pod \"nova-api-0\" (UID: \"f890481e-0c9f-4194-8af3-d808bb105995\") " pod="openstack/nova-api-0" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.262588 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f890481e-0c9f-4194-8af3-d808bb105995-internal-tls-certs\") pod \"nova-api-0\" (UID: \"f890481e-0c9f-4194-8af3-d808bb105995\") " pod="openstack/nova-api-0" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.263126 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ceeec9e1-d0f5-497c-b262-2ef81be261ee-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ceeec9e1-d0f5-497c-b262-2ef81be261ee\") " pod="openstack/nova-metadata-0" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.264064 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ceeec9e1-d0f5-497c-b262-2ef81be261ee-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ceeec9e1-d0f5-497c-b262-2ef81be261ee\") " pod="openstack/nova-metadata-0" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.266125 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f890481e-0c9f-4194-8af3-d808bb105995-config-data\") pod \"nova-api-0\" (UID: \"f890481e-0c9f-4194-8af3-d808bb105995\") " pod="openstack/nova-api-0" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.277516 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cm22d\" 
(UniqueName: \"kubernetes.io/projected/f890481e-0c9f-4194-8af3-d808bb105995-kube-api-access-cm22d\") pod \"nova-api-0\" (UID: \"f890481e-0c9f-4194-8af3-d808bb105995\") " pod="openstack/nova-api-0" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.278465 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kg7kr\" (UniqueName: \"kubernetes.io/projected/ceeec9e1-d0f5-497c-b262-2ef81be261ee-kube-api-access-kg7kr\") pod \"nova-metadata-0\" (UID: \"ceeec9e1-d0f5-497c-b262-2ef81be261ee\") " pod="openstack/nova-metadata-0" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.384233 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.419643 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.458657 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2dac51db-1574-4ccc-bb9a-7c42548d90d3" path="/var/lib/kubelet/pods/2dac51db-1574-4ccc-bb9a-7c42548d90d3/volumes" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.459569 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb9c6096-2ce8-4b43-a638-50374d21d621" path="/var/lib/kubelet/pods/fb9c6096-2ce8-4b43-a638-50374d21d621/volumes" Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.846321 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.943668 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Jan 20 18:52:03 crc kubenswrapper[4773]: I0120 18:52:03.949252 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f890481e-0c9f-4194-8af3-d808bb105995","Type":"ContainerStarted","Data":"bda513a35dc2e02b11f1d8251a21deb3f98a74a2b0e988be3d86043c3f03398e"} 
Jan 20 18:52:04 crc kubenswrapper[4773]: I0120 18:52:04.964233 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ceeec9e1-d0f5-497c-b262-2ef81be261ee","Type":"ContainerStarted","Data":"74c4f6032a4690397bd9e01e0934cf33c43f2651856a5a89dc0042e79dcf3f7e"}
Jan 20 18:52:04 crc kubenswrapper[4773]: I0120 18:52:04.965263 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ceeec9e1-d0f5-497c-b262-2ef81be261ee","Type":"ContainerStarted","Data":"f59b265793b3a2f6a3a4f1b9725ac040e95c74c1b9e0810ecddb76d61e7117d0"}
Jan 20 18:52:04 crc kubenswrapper[4773]: I0120 18:52:04.965404 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ceeec9e1-d0f5-497c-b262-2ef81be261ee","Type":"ContainerStarted","Data":"2fe6538899a038842344be0b96d2caca61ad6c599b1e50e1b6fce7db44a67138"}
Jan 20 18:52:04 crc kubenswrapper[4773]: I0120 18:52:04.966903 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f890481e-0c9f-4194-8af3-d808bb105995","Type":"ContainerStarted","Data":"8405b160f421f8567b8f3aded91e1c635e019292c890f51f0d8d7517dfbfa5cb"}
Jan 20 18:52:04 crc kubenswrapper[4773]: I0120 18:52:04.966969 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"f890481e-0c9f-4194-8af3-d808bb105995","Type":"ContainerStarted","Data":"708c8111c2da7df8820e75d69d2feb4a18de77ed285fba274aa43273801e0fa5"}
Jan 20 18:52:04 crc kubenswrapper[4773]: I0120 18:52:04.988400 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=1.988378042 podStartE2EDuration="1.988378042s" podCreationTimestamp="2026-01-20 18:52:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:52:04.982094912 +0000 UTC m=+1317.903907976" watchObservedRunningTime="2026-01-20 18:52:04.988378042 +0000 UTC m=+1317.910191066"
Jan 20 18:52:05 crc kubenswrapper[4773]: I0120 18:52:05.008311 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.008293729 podStartE2EDuration="3.008293729s" podCreationTimestamp="2026-01-20 18:52:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:52:05.003419453 +0000 UTC m=+1317.925232477" watchObservedRunningTime="2026-01-20 18:52:05.008293729 +0000 UTC m=+1317.930106753"
Jan 20 18:52:06 crc kubenswrapper[4773]: I0120 18:52:06.284083 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Jan 20 18:52:08 crc kubenswrapper[4773]: I0120 18:52:08.420788 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Jan 20 18:52:08 crc kubenswrapper[4773]: I0120 18:52:08.421865 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Jan 20 18:52:11 crc kubenswrapper[4773]: I0120 18:52:11.284659 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Jan 20 18:52:11 crc kubenswrapper[4773]: I0120 18:52:11.309053 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Jan 20 18:52:12 crc kubenswrapper[4773]: I0120 18:52:12.044614 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Jan 20 18:52:12 crc kubenswrapper[4773]: I0120 18:52:12.366582 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0"
Jan 20 18:52:13 crc kubenswrapper[4773]: I0120 18:52:13.385080 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Jan 20 18:52:13 crc kubenswrapper[4773]: I0120 18:52:13.385417 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Jan 20 18:52:13 crc kubenswrapper[4773]: I0120 18:52:13.420514 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Jan 20 18:52:13 crc kubenswrapper[4773]: I0120 18:52:13.420576 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0"
Jan 20 18:52:14 crc kubenswrapper[4773]: I0120 18:52:14.399156 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f890481e-0c9f-4194-8af3-d808bb105995" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.194:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Jan 20 18:52:14 crc kubenswrapper[4773]: I0120 18:52:14.399198 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="f890481e-0c9f-4194-8af3-d808bb105995" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.194:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Jan 20 18:52:14 crc kubenswrapper[4773]: I0120 18:52:14.443735 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="ceeec9e1-d0f5-497c-b262-2ef81be261ee" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.195:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Jan 20 18:52:14 crc kubenswrapper[4773]: I0120 18:52:14.443810 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="ceeec9e1-d0f5-497c-b262-2ef81be261ee" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.195:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Jan 20 18:52:23 crc kubenswrapper[4773]: I0120 18:52:23.395530 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Jan 20 18:52:23 crc kubenswrapper[4773]: I0120 18:52:23.397696 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Jan 20 18:52:23 crc kubenswrapper[4773]: I0120 18:52:23.399839 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0"
Jan 20 18:52:23 crc kubenswrapper[4773]: I0120 18:52:23.405511 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Jan 20 18:52:23 crc kubenswrapper[4773]: I0120 18:52:23.432426 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Jan 20 18:52:23 crc kubenswrapper[4773]: I0120 18:52:23.438095 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0"
Jan 20 18:52:23 crc kubenswrapper[4773]: I0120 18:52:23.441352 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Jan 20 18:52:24 crc kubenswrapper[4773]: I0120 18:52:24.144052 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0"
Jan 20 18:52:24 crc kubenswrapper[4773]: I0120 18:52:24.150496 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0"
Jan 20 18:52:24 crc kubenswrapper[4773]: I0120 18:52:24.151305 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0"
Jan 20 18:52:28 crc kubenswrapper[4773]: I0120 18:52:28.170179 4773 patch_prober.go:28] interesting pod/machine-config-daemon-sq4x7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 20 18:52:28 crc kubenswrapper[4773]: I0120 18:52:28.170494 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 20 18:52:32 crc kubenswrapper[4773]: I0120 18:52:32.485491 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"]
Jan 20 18:52:33 crc kubenswrapper[4773]: I0120 18:52:33.441953 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Jan 20 18:52:36 crc kubenswrapper[4773]: I0120 18:52:36.707206 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="d4dfff97-df7d-498f-9203-9c2cb0d84667" containerName="rabbitmq" containerID="cri-o://c766f55dc2323fb20d95a48fb3c2f0f4589d63fc9f828e80df4feaa4aee53647" gracePeriod=604796
Jan 20 18:52:37 crc kubenswrapper[4773]: I0120 18:52:37.964861 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="b357137a-6e30-4ed9-a440-c9f3e90f75d8" containerName="rabbitmq" containerID="cri-o://7ff480cee767ffce07ac3630dc103e93b87e75d3f6036d288b00e3da958c2897" gracePeriod=604796
Jan 20 18:52:42 crc kubenswrapper[4773]: I0120 18:52:42.616561 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="b357137a-6e30-4ed9-a440-c9f3e90f75d8" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.99:5671: connect: connection refused"
Jan 20 18:52:42 crc kubenswrapper[4773]: I0120 18:52:42.627397 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" podUID="d4dfff97-df7d-498f-9203-9c2cb0d84667" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.100:5671: connect: connection refused"
Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.307416 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.344854 4773 generic.go:334] "Generic (PLEG): container finished" podID="d4dfff97-df7d-498f-9203-9c2cb0d84667" containerID="c766f55dc2323fb20d95a48fb3c2f0f4589d63fc9f828e80df4feaa4aee53647" exitCode=0
Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.344899 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d4dfff97-df7d-498f-9203-9c2cb0d84667","Type":"ContainerDied","Data":"c766f55dc2323fb20d95a48fb3c2f0f4589d63fc9f828e80df4feaa4aee53647"}
Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.344944 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"d4dfff97-df7d-498f-9203-9c2cb0d84667","Type":"ContainerDied","Data":"9437201a24daa22de36ef5e4cb32d33d9216523028488aa287392d8e49c9e78c"}
Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.344969 4773 scope.go:117] "RemoveContainer" containerID="c766f55dc2323fb20d95a48fb3c2f0f4589d63fc9f828e80df4feaa4aee53647"
Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.345109 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.369629 4773 scope.go:117] "RemoveContainer" containerID="1248d781571a617de802bfa819cacf5f6c074177291583d629258a80d6ae6c5f"
Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.401676 4773 scope.go:117] "RemoveContainer" containerID="c766f55dc2323fb20d95a48fb3c2f0f4589d63fc9f828e80df4feaa4aee53647"
Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.401951 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d4dfff97-df7d-498f-9203-9c2cb0d84667-config-data\") pod \"d4dfff97-df7d-498f-9203-9c2cb0d84667\" (UID: \"d4dfff97-df7d-498f-9203-9c2cb0d84667\") "
Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.401978 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d4dfff97-df7d-498f-9203-9c2cb0d84667-rabbitmq-plugins\") pod \"d4dfff97-df7d-498f-9203-9c2cb0d84667\" (UID: \"d4dfff97-df7d-498f-9203-9c2cb0d84667\") "
Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.402002 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d4dfff97-df7d-498f-9203-9c2cb0d84667-plugins-conf\") pod \"d4dfff97-df7d-498f-9203-9c2cb0d84667\" (UID: \"d4dfff97-df7d-498f-9203-9c2cb0d84667\") "
Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.402033 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ltrtl\" (UniqueName: \"kubernetes.io/projected/d4dfff97-df7d-498f-9203-9c2cb0d84667-kube-api-access-ltrtl\") pod \"d4dfff97-df7d-498f-9203-9c2cb0d84667\" (UID: \"d4dfff97-df7d-498f-9203-9c2cb0d84667\") "
Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.402080 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d4dfff97-df7d-498f-9203-9c2cb0d84667-server-conf\") pod \"d4dfff97-df7d-498f-9203-9c2cb0d84667\" (UID: \"d4dfff97-df7d-498f-9203-9c2cb0d84667\") "
Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.402107 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d4dfff97-df7d-498f-9203-9c2cb0d84667-pod-info\") pod \"d4dfff97-df7d-498f-9203-9c2cb0d84667\" (UID: \"d4dfff97-df7d-498f-9203-9c2cb0d84667\") "
Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.402141 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d4dfff97-df7d-498f-9203-9c2cb0d84667-erlang-cookie-secret\") pod \"d4dfff97-df7d-498f-9203-9c2cb0d84667\" (UID: \"d4dfff97-df7d-498f-9203-9c2cb0d84667\") "
Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.402187 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"d4dfff97-df7d-498f-9203-9c2cb0d84667\" (UID: \"d4dfff97-df7d-498f-9203-9c2cb0d84667\") "
Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.402207 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d4dfff97-df7d-498f-9203-9c2cb0d84667-rabbitmq-tls\") pod \"d4dfff97-df7d-498f-9203-9c2cb0d84667\" (UID: \"d4dfff97-df7d-498f-9203-9c2cb0d84667\") "
Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.402224 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d4dfff97-df7d-498f-9203-9c2cb0d84667-rabbitmq-confd\") pod \"d4dfff97-df7d-498f-9203-9c2cb0d84667\" (UID: \"d4dfff97-df7d-498f-9203-9c2cb0d84667\") "
Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.402252 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d4dfff97-df7d-498f-9203-9c2cb0d84667-rabbitmq-erlang-cookie\") pod \"d4dfff97-df7d-498f-9203-9c2cb0d84667\" (UID: \"d4dfff97-df7d-498f-9203-9c2cb0d84667\") "
Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.402851 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4dfff97-df7d-498f-9203-9c2cb0d84667-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "d4dfff97-df7d-498f-9203-9c2cb0d84667" (UID: "d4dfff97-df7d-498f-9203-9c2cb0d84667"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 20 18:52:43 crc kubenswrapper[4773]: E0120 18:52:43.404376 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c766f55dc2323fb20d95a48fb3c2f0f4589d63fc9f828e80df4feaa4aee53647\": container with ID starting with c766f55dc2323fb20d95a48fb3c2f0f4589d63fc9f828e80df4feaa4aee53647 not found: ID does not exist" containerID="c766f55dc2323fb20d95a48fb3c2f0f4589d63fc9f828e80df4feaa4aee53647"
Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.404436 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c766f55dc2323fb20d95a48fb3c2f0f4589d63fc9f828e80df4feaa4aee53647"} err="failed to get container status \"c766f55dc2323fb20d95a48fb3c2f0f4589d63fc9f828e80df4feaa4aee53647\": rpc error: code = NotFound desc = could not find container \"c766f55dc2323fb20d95a48fb3c2f0f4589d63fc9f828e80df4feaa4aee53647\": container with ID starting with c766f55dc2323fb20d95a48fb3c2f0f4589d63fc9f828e80df4feaa4aee53647 not found: ID does not exist"
Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.404470 4773 scope.go:117] "RemoveContainer" containerID="1248d781571a617de802bfa819cacf5f6c074177291583d629258a80d6ae6c5f"
Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.405771 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4dfff97-df7d-498f-9203-9c2cb0d84667-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "d4dfff97-df7d-498f-9203-9c2cb0d84667" (UID: "d4dfff97-df7d-498f-9203-9c2cb0d84667"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 20 18:52:43 crc kubenswrapper[4773]: E0120 18:52:43.407776 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1248d781571a617de802bfa819cacf5f6c074177291583d629258a80d6ae6c5f\": container with ID starting with 1248d781571a617de802bfa819cacf5f6c074177291583d629258a80d6ae6c5f not found: ID does not exist" containerID="1248d781571a617de802bfa819cacf5f6c074177291583d629258a80d6ae6c5f"
Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.409873 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1248d781571a617de802bfa819cacf5f6c074177291583d629258a80d6ae6c5f"} err="failed to get container status \"1248d781571a617de802bfa819cacf5f6c074177291583d629258a80d6ae6c5f\": rpc error: code = NotFound desc = could not find container \"1248d781571a617de802bfa819cacf5f6c074177291583d629258a80d6ae6c5f\": container with ID starting with 1248d781571a617de802bfa819cacf5f6c074177291583d629258a80d6ae6c5f not found: ID does not exist"
Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.408538 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d4dfff97-df7d-498f-9203-9c2cb0d84667-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "d4dfff97-df7d-498f-9203-9c2cb0d84667" (UID: "d4dfff97-df7d-498f-9203-9c2cb0d84667"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.410185 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4dfff97-df7d-498f-9203-9c2cb0d84667-kube-api-access-ltrtl" (OuterVolumeSpecName: "kube-api-access-ltrtl") pod "d4dfff97-df7d-498f-9203-9c2cb0d84667" (UID: "d4dfff97-df7d-498f-9203-9c2cb0d84667"). InnerVolumeSpecName "kube-api-access-ltrtl". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.411052 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/d4dfff97-df7d-498f-9203-9c2cb0d84667-pod-info" (OuterVolumeSpecName: "pod-info") pod "d4dfff97-df7d-498f-9203-9c2cb0d84667" (UID: "d4dfff97-df7d-498f-9203-9c2cb0d84667"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue ""
Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.417072 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d4dfff97-df7d-498f-9203-9c2cb0d84667-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "d4dfff97-df7d-498f-9203-9c2cb0d84667" (UID: "d4dfff97-df7d-498f-9203-9c2cb0d84667"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.431715 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4dfff97-df7d-498f-9203-9c2cb0d84667-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "d4dfff97-df7d-498f-9203-9c2cb0d84667" (UID: "d4dfff97-df7d-498f-9203-9c2cb0d84667"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.433239 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "persistence") pod "d4dfff97-df7d-498f-9203-9c2cb0d84667" (UID: "d4dfff97-df7d-498f-9203-9c2cb0d84667"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.483498 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4dfff97-df7d-498f-9203-9c2cb0d84667-config-data" (OuterVolumeSpecName: "config-data") pod "d4dfff97-df7d-498f-9203-9c2cb0d84667" (UID: "d4dfff97-df7d-498f-9203-9c2cb0d84667"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.501445 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d4dfff97-df7d-498f-9203-9c2cb0d84667-server-conf" (OuterVolumeSpecName: "server-conf") pod "d4dfff97-df7d-498f-9203-9c2cb0d84667" (UID: "d4dfff97-df7d-498f-9203-9c2cb0d84667"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.503316 4773 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/d4dfff97-df7d-498f-9203-9c2cb0d84667-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\""
Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.503345 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/d4dfff97-df7d-498f-9203-9c2cb0d84667-config-data\") on node \"crc\" DevicePath \"\""
Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.503356 4773 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/d4dfff97-df7d-498f-9203-9c2cb0d84667-rabbitmq-plugins\") on node \"crc\" DevicePath \"\""
Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.503364 4773 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/d4dfff97-df7d-498f-9203-9c2cb0d84667-plugins-conf\") on node \"crc\" DevicePath \"\""
Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.503373 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ltrtl\" (UniqueName: \"kubernetes.io/projected/d4dfff97-df7d-498f-9203-9c2cb0d84667-kube-api-access-ltrtl\") on node \"crc\" DevicePath \"\""
Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.503381 4773 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/d4dfff97-df7d-498f-9203-9c2cb0d84667-server-conf\") on node \"crc\" DevicePath \"\""
Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.503388 4773 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/d4dfff97-df7d-498f-9203-9c2cb0d84667-pod-info\") on node \"crc\" DevicePath \"\""
Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.503398 4773 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/d4dfff97-df7d-498f-9203-9c2cb0d84667-erlang-cookie-secret\") on node \"crc\" DevicePath \"\""
Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.503416 4773 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" "
Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.503424 4773 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/d4dfff97-df7d-498f-9203-9c2cb0d84667-rabbitmq-tls\") on node \"crc\" DevicePath \"\""
Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.522679 4773 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc"
Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.571344 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4dfff97-df7d-498f-9203-9c2cb0d84667-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "d4dfff97-df7d-498f-9203-9c2cb0d84667" (UID: "d4dfff97-df7d-498f-9203-9c2cb0d84667"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.605011 4773 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\""
Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.605043 4773 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/d4dfff97-df7d-498f-9203-9c2cb0d84667-rabbitmq-confd\") on node \"crc\" DevicePath \"\""
Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.683155 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"]
Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.690335 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"]
Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.707433 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"]
Jan 20 18:52:43 crc kubenswrapper[4773]: E0120 18:52:43.709245 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4dfff97-df7d-498f-9203-9c2cb0d84667" containerName="rabbitmq"
Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.709376 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4dfff97-df7d-498f-9203-9c2cb0d84667" containerName="rabbitmq"
Jan 20 18:52:43 crc kubenswrapper[4773]: E0120 18:52:43.709481 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d4dfff97-df7d-498f-9203-9c2cb0d84667" containerName="setup-container"
Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.709567 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4dfff97-df7d-498f-9203-9c2cb0d84667" containerName="setup-container"
Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.709885 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="d4dfff97-df7d-498f-9203-9c2cb0d84667" containerName="rabbitmq"
Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.711194 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0"
Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.717439 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc"
Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.720272 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-6z6h4"
Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.720284 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf"
Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.720274 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user"
Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.720290 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie"
Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.720564 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data"
Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.720600 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf"
Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.733834 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.910810 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"375735e1-5d2a-4cc8-892b-4bdcdf9f1e42\") " pod="openstack/rabbitmq-server-0"
Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.910863 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhx8m\" (UniqueName: \"kubernetes.io/projected/375735e1-5d2a-4cc8-892b-4bdcdf9f1e42-kube-api-access-fhx8m\") pod \"rabbitmq-server-0\" (UID: \"375735e1-5d2a-4cc8-892b-4bdcdf9f1e42\") " pod="openstack/rabbitmq-server-0"
Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.910896 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/375735e1-5d2a-4cc8-892b-4bdcdf9f1e42-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"375735e1-5d2a-4cc8-892b-4bdcdf9f1e42\") " pod="openstack/rabbitmq-server-0"
Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.910953 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/375735e1-5d2a-4cc8-892b-4bdcdf9f1e42-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"375735e1-5d2a-4cc8-892b-4bdcdf9f1e42\") " pod="openstack/rabbitmq-server-0"
Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.911001 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/375735e1-5d2a-4cc8-892b-4bdcdf9f1e42-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"375735e1-5d2a-4cc8-892b-4bdcdf9f1e42\") " pod="openstack/rabbitmq-server-0"
Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.911022 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/375735e1-5d2a-4cc8-892b-4bdcdf9f1e42-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"375735e1-5d2a-4cc8-892b-4bdcdf9f1e42\") " pod="openstack/rabbitmq-server-0"
Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.911038 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/375735e1-5d2a-4cc8-892b-4bdcdf9f1e42-pod-info\") pod \"rabbitmq-server-0\" (UID: \"375735e1-5d2a-4cc8-892b-4bdcdf9f1e42\") " pod="openstack/rabbitmq-server-0"
Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.911060 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/375735e1-5d2a-4cc8-892b-4bdcdf9f1e42-config-data\") pod \"rabbitmq-server-0\" (UID: \"375735e1-5d2a-4cc8-892b-4bdcdf9f1e42\") " pod="openstack/rabbitmq-server-0"
Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.911091 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/375735e1-5d2a-4cc8-892b-4bdcdf9f1e42-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"375735e1-5d2a-4cc8-892b-4bdcdf9f1e42\") " pod="openstack/rabbitmq-server-0"
Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.911117 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/375735e1-5d2a-4cc8-892b-4bdcdf9f1e42-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"375735e1-5d2a-4cc8-892b-4bdcdf9f1e42\") " pod="openstack/rabbitmq-server-0"
Jan 20 18:52:43 crc kubenswrapper[4773]: I0120 18:52:43.911140 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/375735e1-5d2a-4cc8-892b-4bdcdf9f1e42-server-conf\") pod \"rabbitmq-server-0\" (UID: \"375735e1-5d2a-4cc8-892b-4bdcdf9f1e42\") " pod="openstack/rabbitmq-server-0"
Jan 20 18:52:44 crc kubenswrapper[4773]: I0120 18:52:44.012581 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/375735e1-5d2a-4cc8-892b-4bdcdf9f1e42-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"375735e1-5d2a-4cc8-892b-4bdcdf9f1e42\") " pod="openstack/rabbitmq-server-0"
Jan 20 18:52:44 crc kubenswrapper[4773]: I0120 18:52:44.012637 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/375735e1-5d2a-4cc8-892b-4bdcdf9f1e42-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"375735e1-5d2a-4cc8-892b-4bdcdf9f1e42\") " pod="openstack/rabbitmq-server-0"
Jan 20 18:52:44 crc kubenswrapper[4773]: I0120 18:52:44.012669 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/375735e1-5d2a-4cc8-892b-4bdcdf9f1e42-server-conf\") pod \"rabbitmq-server-0\" (UID: \"375735e1-5d2a-4cc8-892b-4bdcdf9f1e42\") " pod="openstack/rabbitmq-server-0"
Jan 20 18:52:44 crc kubenswrapper[4773]: I0120 18:52:44.012712 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"375735e1-5d2a-4cc8-892b-4bdcdf9f1e42\") " pod="openstack/rabbitmq-server-0"
Jan 20 18:52:44 crc kubenswrapper[4773]: I0120 18:52:44.012730 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhx8m\" (UniqueName: \"kubernetes.io/projected/375735e1-5d2a-4cc8-892b-4bdcdf9f1e42-kube-api-access-fhx8m\") pod \"rabbitmq-server-0\" (UID: \"375735e1-5d2a-4cc8-892b-4bdcdf9f1e42\") " pod="openstack/rabbitmq-server-0"
Jan 20 18:52:44 crc kubenswrapper[4773]: I0120 18:52:44.012760 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/375735e1-5d2a-4cc8-892b-4bdcdf9f1e42-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"375735e1-5d2a-4cc8-892b-4bdcdf9f1e42\") " pod="openstack/rabbitmq-server-0"
Jan 20 18:52:44 crc kubenswrapper[4773]: I0120 18:52:44.012787 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/375735e1-5d2a-4cc8-892b-4bdcdf9f1e42-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"375735e1-5d2a-4cc8-892b-4bdcdf9f1e42\") " pod="openstack/rabbitmq-server-0"
Jan 20 18:52:44 crc kubenswrapper[4773]: I0120 18:52:44.012835 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/375735e1-5d2a-4cc8-892b-4bdcdf9f1e42-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"375735e1-5d2a-4cc8-892b-4bdcdf9f1e42\") " pod="openstack/rabbitmq-server-0"
Jan 20 18:52:44 crc kubenswrapper[4773]: I0120 18:52:44.012856 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/375735e1-5d2a-4cc8-892b-4bdcdf9f1e42-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"375735e1-5d2a-4cc8-892b-4bdcdf9f1e42\") " pod="openstack/rabbitmq-server-0"
Jan 20 18:52:44 crc kubenswrapper[4773]: I0120 18:52:44.012874 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/375735e1-5d2a-4cc8-892b-4bdcdf9f1e42-pod-info\") pod \"rabbitmq-server-0\" (UID: \"375735e1-5d2a-4cc8-892b-4bdcdf9f1e42\") " pod="openstack/rabbitmq-server-0"
Jan 20 18:52:44 crc kubenswrapper[4773]: I0120 18:52:44.012892 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/375735e1-5d2a-4cc8-892b-4bdcdf9f1e42-config-data\") pod \"rabbitmq-server-0\" (UID: \"375735e1-5d2a-4cc8-892b-4bdcdf9f1e42\") " pod="openstack/rabbitmq-server-0"
Jan 20 18:52:44 crc kubenswrapper[4773]: I0120 18:52:44.013863 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/375735e1-5d2a-4cc8-892b-4bdcdf9f1e42-config-data\") pod \"rabbitmq-server-0\" (UID: \"375735e1-5d2a-4cc8-892b-4bdcdf9f1e42\") " pod="openstack/rabbitmq-server-0"
Jan 20 18:52:44 crc kubenswrapper[4773]: I0120 18:52:44.014502 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/375735e1-5d2a-4cc8-892b-4bdcdf9f1e42-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"375735e1-5d2a-4cc8-892b-4bdcdf9f1e42\") " pod="openstack/rabbitmq-server-0"
Jan 20 18:52:44 crc kubenswrapper[4773]: I0120 18:52:44.014899 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/375735e1-5d2a-4cc8-892b-4bdcdf9f1e42-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"375735e1-5d2a-4cc8-892b-4bdcdf9f1e42\") " pod="openstack/rabbitmq-server-0"
Jan 20 18:52:44 crc kubenswrapper[4773]: I0120 18:52:44.015945 4773 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"375735e1-5d2a-4cc8-892b-4bdcdf9f1e42\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/rabbitmq-server-0"
Jan 20 18:52:44 crc kubenswrapper[4773]: I0120 18:52:44.016641 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/375735e1-5d2a-4cc8-892b-4bdcdf9f1e42-server-conf\") pod \"rabbitmq-server-0\" (UID: \"375735e1-5d2a-4cc8-892b-4bdcdf9f1e42\") " pod="openstack/rabbitmq-server-0"
Jan 20 18:52:44 crc kubenswrapper[4773]: I0120 18:52:44.016840 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/375735e1-5d2a-4cc8-892b-4bdcdf9f1e42-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"375735e1-5d2a-4cc8-892b-4bdcdf9f1e42\") " pod="openstack/rabbitmq-server-0"
Jan 20 18:52:44 crc kubenswrapper[4773]: I0120 18:52:44.017825 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/375735e1-5d2a-4cc8-892b-4bdcdf9f1e42-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"375735e1-5d2a-4cc8-892b-4bdcdf9f1e42\") " pod="openstack/rabbitmq-server-0"
Jan 20 18:52:44 crc kubenswrapper[4773]: I0120 18:52:44.018281 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/375735e1-5d2a-4cc8-892b-4bdcdf9f1e42-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"375735e1-5d2a-4cc8-892b-4bdcdf9f1e42\") " pod="openstack/rabbitmq-server-0"
Jan 20 18:52:44 crc kubenswrapper[4773]: I0120 18:52:44.022653 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/375735e1-5d2a-4cc8-892b-4bdcdf9f1e42-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"375735e1-5d2a-4cc8-892b-4bdcdf9f1e42\") " pod="openstack/rabbitmq-server-0"
Jan 20 18:52:44 crc kubenswrapper[4773]: I0120 18:52:44.025814 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/375735e1-5d2a-4cc8-892b-4bdcdf9f1e42-pod-info\") pod \"rabbitmq-server-0\" (UID: \"375735e1-5d2a-4cc8-892b-4bdcdf9f1e42\") " pod="openstack/rabbitmq-server-0"
Jan 20 18:52:44 crc kubenswrapper[4773]: I0120 18:52:44.029611 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhx8m\" (UniqueName: \"kubernetes.io/projected/375735e1-5d2a-4cc8-892b-4bdcdf9f1e42-kube-api-access-fhx8m\") pod \"rabbitmq-server-0\" (UID: \"375735e1-5d2a-4cc8-892b-4bdcdf9f1e42\") " pod="openstack/rabbitmq-server-0"
Jan 20 18:52:44 crc kubenswrapper[4773]: I0120 18:52:44.063450 4773 operation_generator.go:637] "MountVolume.SetUp
succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"rabbitmq-server-0\" (UID: \"375735e1-5d2a-4cc8-892b-4bdcdf9f1e42\") " pod="openstack/rabbitmq-server-0" Jan 20 18:52:44 crc kubenswrapper[4773]: I0120 18:52:44.332772 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Jan 20 18:52:44 crc kubenswrapper[4773]: I0120 18:52:44.784185 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Jan 20 18:52:45 crc kubenswrapper[4773]: I0120 18:52:45.381453 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"375735e1-5d2a-4cc8-892b-4bdcdf9f1e42","Type":"ContainerStarted","Data":"1edf1fc14b34e2adc460b1f25254486fa6076733fa3f6b1dec6d234588f8c56f"} Jan 20 18:52:45 crc kubenswrapper[4773]: I0120 18:52:45.460558 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4dfff97-df7d-498f-9203-9c2cb0d84667" path="/var/lib/kubelet/pods/d4dfff97-df7d-498f-9203-9c2cb0d84667/volumes" Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.121688 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6447ccbd8f-gmkcw"] Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.134445 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6447ccbd8f-gmkcw" Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.136947 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.140037 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6447ccbd8f-gmkcw"] Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.302461 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dab58995-9c7e-426a-af86-8c1493d3c8d3-dns-svc\") pod \"dnsmasq-dns-6447ccbd8f-gmkcw\" (UID: \"dab58995-9c7e-426a-af86-8c1493d3c8d3\") " pod="openstack/dnsmasq-dns-6447ccbd8f-gmkcw" Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.302533 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dab58995-9c7e-426a-af86-8c1493d3c8d3-ovsdbserver-sb\") pod \"dnsmasq-dns-6447ccbd8f-gmkcw\" (UID: \"dab58995-9c7e-426a-af86-8c1493d3c8d3\") " pod="openstack/dnsmasq-dns-6447ccbd8f-gmkcw" Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.302626 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzxdh\" (UniqueName: \"kubernetes.io/projected/dab58995-9c7e-426a-af86-8c1493d3c8d3-kube-api-access-tzxdh\") pod \"dnsmasq-dns-6447ccbd8f-gmkcw\" (UID: \"dab58995-9c7e-426a-af86-8c1493d3c8d3\") " pod="openstack/dnsmasq-dns-6447ccbd8f-gmkcw" Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.302670 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dab58995-9c7e-426a-af86-8c1493d3c8d3-config\") pod \"dnsmasq-dns-6447ccbd8f-gmkcw\" (UID: \"dab58995-9c7e-426a-af86-8c1493d3c8d3\") " 
pod="openstack/dnsmasq-dns-6447ccbd8f-gmkcw" Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.302735 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dab58995-9c7e-426a-af86-8c1493d3c8d3-ovsdbserver-nb\") pod \"dnsmasq-dns-6447ccbd8f-gmkcw\" (UID: \"dab58995-9c7e-426a-af86-8c1493d3c8d3\") " pod="openstack/dnsmasq-dns-6447ccbd8f-gmkcw" Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.302917 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/dab58995-9c7e-426a-af86-8c1493d3c8d3-openstack-edpm-ipam\") pod \"dnsmasq-dns-6447ccbd8f-gmkcw\" (UID: \"dab58995-9c7e-426a-af86-8c1493d3c8d3\") " pod="openstack/dnsmasq-dns-6447ccbd8f-gmkcw" Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.404380 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dab58995-9c7e-426a-af86-8c1493d3c8d3-ovsdbserver-nb\") pod \"dnsmasq-dns-6447ccbd8f-gmkcw\" (UID: \"dab58995-9c7e-426a-af86-8c1493d3c8d3\") " pod="openstack/dnsmasq-dns-6447ccbd8f-gmkcw" Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.404447 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/dab58995-9c7e-426a-af86-8c1493d3c8d3-openstack-edpm-ipam\") pod \"dnsmasq-dns-6447ccbd8f-gmkcw\" (UID: \"dab58995-9c7e-426a-af86-8c1493d3c8d3\") " pod="openstack/dnsmasq-dns-6447ccbd8f-gmkcw" Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.404479 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dab58995-9c7e-426a-af86-8c1493d3c8d3-dns-svc\") pod \"dnsmasq-dns-6447ccbd8f-gmkcw\" (UID: \"dab58995-9c7e-426a-af86-8c1493d3c8d3\") " 
pod="openstack/dnsmasq-dns-6447ccbd8f-gmkcw" Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.404512 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dab58995-9c7e-426a-af86-8c1493d3c8d3-ovsdbserver-sb\") pod \"dnsmasq-dns-6447ccbd8f-gmkcw\" (UID: \"dab58995-9c7e-426a-af86-8c1493d3c8d3\") " pod="openstack/dnsmasq-dns-6447ccbd8f-gmkcw" Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.404553 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzxdh\" (UniqueName: \"kubernetes.io/projected/dab58995-9c7e-426a-af86-8c1493d3c8d3-kube-api-access-tzxdh\") pod \"dnsmasq-dns-6447ccbd8f-gmkcw\" (UID: \"dab58995-9c7e-426a-af86-8c1493d3c8d3\") " pod="openstack/dnsmasq-dns-6447ccbd8f-gmkcw" Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.404581 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dab58995-9c7e-426a-af86-8c1493d3c8d3-config\") pod \"dnsmasq-dns-6447ccbd8f-gmkcw\" (UID: \"dab58995-9c7e-426a-af86-8c1493d3c8d3\") " pod="openstack/dnsmasq-dns-6447ccbd8f-gmkcw" Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.405584 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dab58995-9c7e-426a-af86-8c1493d3c8d3-config\") pod \"dnsmasq-dns-6447ccbd8f-gmkcw\" (UID: \"dab58995-9c7e-426a-af86-8c1493d3c8d3\") " pod="openstack/dnsmasq-dns-6447ccbd8f-gmkcw" Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.406663 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dab58995-9c7e-426a-af86-8c1493d3c8d3-ovsdbserver-nb\") pod \"dnsmasq-dns-6447ccbd8f-gmkcw\" (UID: \"dab58995-9c7e-426a-af86-8c1493d3c8d3\") " pod="openstack/dnsmasq-dns-6447ccbd8f-gmkcw" Jan 20 18:52:47 crc kubenswrapper[4773]: 
I0120 18:52:47.407214 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/dab58995-9c7e-426a-af86-8c1493d3c8d3-openstack-edpm-ipam\") pod \"dnsmasq-dns-6447ccbd8f-gmkcw\" (UID: \"dab58995-9c7e-426a-af86-8c1493d3c8d3\") " pod="openstack/dnsmasq-dns-6447ccbd8f-gmkcw" Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.407842 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dab58995-9c7e-426a-af86-8c1493d3c8d3-dns-svc\") pod \"dnsmasq-dns-6447ccbd8f-gmkcw\" (UID: \"dab58995-9c7e-426a-af86-8c1493d3c8d3\") " pod="openstack/dnsmasq-dns-6447ccbd8f-gmkcw" Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.408397 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dab58995-9c7e-426a-af86-8c1493d3c8d3-ovsdbserver-sb\") pod \"dnsmasq-dns-6447ccbd8f-gmkcw\" (UID: \"dab58995-9c7e-426a-af86-8c1493d3c8d3\") " pod="openstack/dnsmasq-dns-6447ccbd8f-gmkcw" Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.415774 4773 generic.go:334] "Generic (PLEG): container finished" podID="b357137a-6e30-4ed9-a440-c9f3e90f75d8" containerID="7ff480cee767ffce07ac3630dc103e93b87e75d3f6036d288b00e3da958c2897" exitCode=0 Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.415828 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b357137a-6e30-4ed9-a440-c9f3e90f75d8","Type":"ContainerDied","Data":"7ff480cee767ffce07ac3630dc103e93b87e75d3f6036d288b00e3da958c2897"} Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.417197 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"375735e1-5d2a-4cc8-892b-4bdcdf9f1e42","Type":"ContainerStarted","Data":"418d0281e7537f13466deec1b8302c4b39134fd0b052576e8914257e428b8ed2"} Jan 20 18:52:47 crc 
kubenswrapper[4773]: I0120 18:52:47.430226 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzxdh\" (UniqueName: \"kubernetes.io/projected/dab58995-9c7e-426a-af86-8c1493d3c8d3-kube-api-access-tzxdh\") pod \"dnsmasq-dns-6447ccbd8f-gmkcw\" (UID: \"dab58995-9c7e-426a-af86-8c1493d3c8d3\") " pod="openstack/dnsmasq-dns-6447ccbd8f-gmkcw" Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.456645 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6447ccbd8f-gmkcw" Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.536274 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.709523 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b357137a-6e30-4ed9-a440-c9f3e90f75d8-plugins-conf\") pod \"b357137a-6e30-4ed9-a440-c9f3e90f75d8\" (UID: \"b357137a-6e30-4ed9-a440-c9f3e90f75d8\") " Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.709584 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b357137a-6e30-4ed9-a440-c9f3e90f75d8-erlang-cookie-secret\") pod \"b357137a-6e30-4ed9-a440-c9f3e90f75d8\" (UID: \"b357137a-6e30-4ed9-a440-c9f3e90f75d8\") " Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.709616 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b357137a-6e30-4ed9-a440-c9f3e90f75d8-server-conf\") pod \"b357137a-6e30-4ed9-a440-c9f3e90f75d8\" (UID: \"b357137a-6e30-4ed9-a440-c9f3e90f75d8\") " Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.709643 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/b357137a-6e30-4ed9-a440-c9f3e90f75d8-pod-info\") pod \"b357137a-6e30-4ed9-a440-c9f3e90f75d8\" (UID: \"b357137a-6e30-4ed9-a440-c9f3e90f75d8\") " Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.709694 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b357137a-6e30-4ed9-a440-c9f3e90f75d8-rabbitmq-erlang-cookie\") pod \"b357137a-6e30-4ed9-a440-c9f3e90f75d8\" (UID: \"b357137a-6e30-4ed9-a440-c9f3e90f75d8\") " Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.709762 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wj6nw\" (UniqueName: \"kubernetes.io/projected/b357137a-6e30-4ed9-a440-c9f3e90f75d8-kube-api-access-wj6nw\") pod \"b357137a-6e30-4ed9-a440-c9f3e90f75d8\" (UID: \"b357137a-6e30-4ed9-a440-c9f3e90f75d8\") " Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.709795 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b357137a-6e30-4ed9-a440-c9f3e90f75d8-rabbitmq-tls\") pod \"b357137a-6e30-4ed9-a440-c9f3e90f75d8\" (UID: \"b357137a-6e30-4ed9-a440-c9f3e90f75d8\") " Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.709861 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b357137a-6e30-4ed9-a440-c9f3e90f75d8-rabbitmq-confd\") pod \"b357137a-6e30-4ed9-a440-c9f3e90f75d8\" (UID: \"b357137a-6e30-4ed9-a440-c9f3e90f75d8\") " Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.709895 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b357137a-6e30-4ed9-a440-c9f3e90f75d8-rabbitmq-plugins\") pod \"b357137a-6e30-4ed9-a440-c9f3e90f75d8\" (UID: \"b357137a-6e30-4ed9-a440-c9f3e90f75d8\") " Jan 20 18:52:47 crc 
kubenswrapper[4773]: I0120 18:52:47.709924 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b357137a-6e30-4ed9-a440-c9f3e90f75d8-config-data\") pod \"b357137a-6e30-4ed9-a440-c9f3e90f75d8\" (UID: \"b357137a-6e30-4ed9-a440-c9f3e90f75d8\") " Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.709975 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"b357137a-6e30-4ed9-a440-c9f3e90f75d8\" (UID: \"b357137a-6e30-4ed9-a440-c9f3e90f75d8\") " Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.711219 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b357137a-6e30-4ed9-a440-c9f3e90f75d8-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "b357137a-6e30-4ed9-a440-c9f3e90f75d8" (UID: "b357137a-6e30-4ed9-a440-c9f3e90f75d8"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.714073 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b357137a-6e30-4ed9-a440-c9f3e90f75d8-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "b357137a-6e30-4ed9-a440-c9f3e90f75d8" (UID: "b357137a-6e30-4ed9-a440-c9f3e90f75d8"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.715283 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b357137a-6e30-4ed9-a440-c9f3e90f75d8-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "b357137a-6e30-4ed9-a440-c9f3e90f75d8" (UID: "b357137a-6e30-4ed9-a440-c9f3e90f75d8"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.716107 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b357137a-6e30-4ed9-a440-c9f3e90f75d8-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "b357137a-6e30-4ed9-a440-c9f3e90f75d8" (UID: "b357137a-6e30-4ed9-a440-c9f3e90f75d8"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.716212 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "persistence") pod "b357137a-6e30-4ed9-a440-c9f3e90f75d8" (UID: "b357137a-6e30-4ed9-a440-c9f3e90f75d8"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.730438 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b357137a-6e30-4ed9-a440-c9f3e90f75d8-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "b357137a-6e30-4ed9-a440-c9f3e90f75d8" (UID: "b357137a-6e30-4ed9-a440-c9f3e90f75d8"). InnerVolumeSpecName "rabbitmq-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.731210 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/b357137a-6e30-4ed9-a440-c9f3e90f75d8-pod-info" (OuterVolumeSpecName: "pod-info") pod "b357137a-6e30-4ed9-a440-c9f3e90f75d8" (UID: "b357137a-6e30-4ed9-a440-c9f3e90f75d8"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.734150 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b357137a-6e30-4ed9-a440-c9f3e90f75d8-kube-api-access-wj6nw" (OuterVolumeSpecName: "kube-api-access-wj6nw") pod "b357137a-6e30-4ed9-a440-c9f3e90f75d8" (UID: "b357137a-6e30-4ed9-a440-c9f3e90f75d8"). InnerVolumeSpecName "kube-api-access-wj6nw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.746585 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b357137a-6e30-4ed9-a440-c9f3e90f75d8-config-data" (OuterVolumeSpecName: "config-data") pod "b357137a-6e30-4ed9-a440-c9f3e90f75d8" (UID: "b357137a-6e30-4ed9-a440-c9f3e90f75d8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.788536 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b357137a-6e30-4ed9-a440-c9f3e90f75d8-server-conf" (OuterVolumeSpecName: "server-conf") pod "b357137a-6e30-4ed9-a440-c9f3e90f75d8" (UID: "b357137a-6e30-4ed9-a440-c9f3e90f75d8"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.812464 4773 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/b357137a-6e30-4ed9-a440-c9f3e90f75d8-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.812501 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b357137a-6e30-4ed9-a440-c9f3e90f75d8-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.812524 4773 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.812533 4773 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/b357137a-6e30-4ed9-a440-c9f3e90f75d8-plugins-conf\") on node \"crc\" DevicePath \"\"" Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.812541 4773 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/b357137a-6e30-4ed9-a440-c9f3e90f75d8-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.812551 4773 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/b357137a-6e30-4ed9-a440-c9f3e90f75d8-server-conf\") on node \"crc\" DevicePath \"\"" Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.812561 4773 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/b357137a-6e30-4ed9-a440-c9f3e90f75d8-pod-info\") on node \"crc\" DevicePath \"\"" Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.812570 4773 reconciler_common.go:293] 
"Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/b357137a-6e30-4ed9-a440-c9f3e90f75d8-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.812580 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wj6nw\" (UniqueName: \"kubernetes.io/projected/b357137a-6e30-4ed9-a440-c9f3e90f75d8-kube-api-access-wj6nw\") on node \"crc\" DevicePath \"\"" Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.812588 4773 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/b357137a-6e30-4ed9-a440-c9f3e90f75d8-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.834199 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b357137a-6e30-4ed9-a440-c9f3e90f75d8-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "b357137a-6e30-4ed9-a440-c9f3e90f75d8" (UID: "b357137a-6e30-4ed9-a440-c9f3e90f75d8"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.839011 4773 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.909416 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6447ccbd8f-gmkcw"] Jan 20 18:52:47 crc kubenswrapper[4773]: W0120 18:52:47.911471 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddab58995_9c7e_426a_af86_8c1493d3c8d3.slice/crio-0c67ddb7929fcd701c5dbeae79f91dbcac0a54ecd8de24b8c2fb3ad1c9a4e17a WatchSource:0}: Error finding container 0c67ddb7929fcd701c5dbeae79f91dbcac0a54ecd8de24b8c2fb3ad1c9a4e17a: Status 404 returned error can't find the container with id 0c67ddb7929fcd701c5dbeae79f91dbcac0a54ecd8de24b8c2fb3ad1c9a4e17a Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.914682 4773 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/b357137a-6e30-4ed9-a440-c9f3e90f75d8-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jan 20 18:52:47 crc kubenswrapper[4773]: I0120 18:52:47.914708 4773 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Jan 20 18:52:48 crc kubenswrapper[4773]: I0120 18:52:48.427279 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"b357137a-6e30-4ed9-a440-c9f3e90f75d8","Type":"ContainerDied","Data":"81b5a2b92f1105f5c420453ca19111fe1ca35ac9507a3ac978f1c848d16b5b05"} Jan 20 18:52:48 crc kubenswrapper[4773]: I0120 18:52:48.427344 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:52:48 crc kubenswrapper[4773]: I0120 18:52:48.427350 4773 scope.go:117] "RemoveContainer" containerID="7ff480cee767ffce07ac3630dc103e93b87e75d3f6036d288b00e3da958c2897" Jan 20 18:52:48 crc kubenswrapper[4773]: I0120 18:52:48.430733 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6447ccbd8f-gmkcw" event={"ID":"dab58995-9c7e-426a-af86-8c1493d3c8d3","Type":"ContainerStarted","Data":"0c67ddb7929fcd701c5dbeae79f91dbcac0a54ecd8de24b8c2fb3ad1c9a4e17a"} Jan 20 18:52:48 crc kubenswrapper[4773]: I0120 18:52:48.468262 4773 scope.go:117] "RemoveContainer" containerID="582308e76cf42821e4ae7402e4d5fe864dee8caf26b6d7f6b99985263eaa82fb" Jan 20 18:52:48 crc kubenswrapper[4773]: I0120 18:52:48.487942 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 20 18:52:48 crc kubenswrapper[4773]: I0120 18:52:48.500686 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 20 18:52:48 crc kubenswrapper[4773]: I0120 18:52:48.510663 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 20 18:52:48 crc kubenswrapper[4773]: E0120 18:52:48.511107 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b357137a-6e30-4ed9-a440-c9f3e90f75d8" containerName="rabbitmq" Jan 20 18:52:48 crc kubenswrapper[4773]: I0120 18:52:48.511126 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="b357137a-6e30-4ed9-a440-c9f3e90f75d8" containerName="rabbitmq" Jan 20 18:52:48 crc kubenswrapper[4773]: E0120 18:52:48.511142 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b357137a-6e30-4ed9-a440-c9f3e90f75d8" containerName="setup-container" Jan 20 18:52:48 crc kubenswrapper[4773]: I0120 18:52:48.511148 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="b357137a-6e30-4ed9-a440-c9f3e90f75d8" containerName="setup-container" Jan 20 
18:52:48 crc kubenswrapper[4773]: I0120 18:52:48.511324 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="b357137a-6e30-4ed9-a440-c9f3e90f75d8" containerName="rabbitmq" Jan 20 18:52:48 crc kubenswrapper[4773]: I0120 18:52:48.512353 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:52:48 crc kubenswrapper[4773]: I0120 18:52:48.514752 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Jan 20 18:52:48 crc kubenswrapper[4773]: I0120 18:52:48.515085 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Jan 20 18:52:48 crc kubenswrapper[4773]: I0120 18:52:48.515218 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Jan 20 18:52:48 crc kubenswrapper[4773]: I0120 18:52:48.515416 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Jan 20 18:52:48 crc kubenswrapper[4773]: I0120 18:52:48.515571 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-pbqbk" Jan 20 18:52:48 crc kubenswrapper[4773]: I0120 18:52:48.517247 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Jan 20 18:52:48 crc kubenswrapper[4773]: I0120 18:52:48.517494 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Jan 20 18:52:48 crc kubenswrapper[4773]: I0120 18:52:48.573761 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 20 18:52:48 crc kubenswrapper[4773]: I0120 18:52:48.624215 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/35926f65-848d-4db5-b50a-deef510ce4be-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"35926f65-848d-4db5-b50a-deef510ce4be\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:52:48 crc kubenswrapper[4773]: I0120 18:52:48.624260 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/35926f65-848d-4db5-b50a-deef510ce4be-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"35926f65-848d-4db5-b50a-deef510ce4be\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:52:48 crc kubenswrapper[4773]: I0120 18:52:48.624288 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/35926f65-848d-4db5-b50a-deef510ce4be-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"35926f65-848d-4db5-b50a-deef510ce4be\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:52:48 crc kubenswrapper[4773]: I0120 18:52:48.624308 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/35926f65-848d-4db5-b50a-deef510ce4be-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"35926f65-848d-4db5-b50a-deef510ce4be\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:52:48 crc kubenswrapper[4773]: I0120 18:52:48.624324 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/35926f65-848d-4db5-b50a-deef510ce4be-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"35926f65-848d-4db5-b50a-deef510ce4be\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:52:48 crc kubenswrapper[4773]: I0120 18:52:48.624342 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/35926f65-848d-4db5-b50a-deef510ce4be-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"35926f65-848d-4db5-b50a-deef510ce4be\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:52:48 crc kubenswrapper[4773]: I0120 18:52:48.624361 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbddj\" (UniqueName: \"kubernetes.io/projected/35926f65-848d-4db5-b50a-deef510ce4be-kube-api-access-mbddj\") pod \"rabbitmq-cell1-server-0\" (UID: \"35926f65-848d-4db5-b50a-deef510ce4be\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:52:48 crc kubenswrapper[4773]: I0120 18:52:48.624387 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/35926f65-848d-4db5-b50a-deef510ce4be-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"35926f65-848d-4db5-b50a-deef510ce4be\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:52:48 crc kubenswrapper[4773]: I0120 18:52:48.624416 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/35926f65-848d-4db5-b50a-deef510ce4be-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"35926f65-848d-4db5-b50a-deef510ce4be\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:52:48 crc kubenswrapper[4773]: I0120 18:52:48.624429 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/35926f65-848d-4db5-b50a-deef510ce4be-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"35926f65-848d-4db5-b50a-deef510ce4be\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:52:48 crc kubenswrapper[4773]: I0120 18:52:48.624495 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"35926f65-848d-4db5-b50a-deef510ce4be\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:52:48 crc kubenswrapper[4773]: I0120 18:52:48.726164 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/35926f65-848d-4db5-b50a-deef510ce4be-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"35926f65-848d-4db5-b50a-deef510ce4be\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:52:48 crc kubenswrapper[4773]: I0120 18:52:48.726226 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/35926f65-848d-4db5-b50a-deef510ce4be-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"35926f65-848d-4db5-b50a-deef510ce4be\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:52:48 crc kubenswrapper[4773]: I0120 18:52:48.726260 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/35926f65-848d-4db5-b50a-deef510ce4be-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"35926f65-848d-4db5-b50a-deef510ce4be\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:52:48 crc kubenswrapper[4773]: I0120 18:52:48.726283 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/35926f65-848d-4db5-b50a-deef510ce4be-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"35926f65-848d-4db5-b50a-deef510ce4be\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:52:48 crc kubenswrapper[4773]: I0120 18:52:48.726307 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/35926f65-848d-4db5-b50a-deef510ce4be-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" 
(UID: \"35926f65-848d-4db5-b50a-deef510ce4be\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:52:48 crc kubenswrapper[4773]: I0120 18:52:48.726334 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mbddj\" (UniqueName: \"kubernetes.io/projected/35926f65-848d-4db5-b50a-deef510ce4be-kube-api-access-mbddj\") pod \"rabbitmq-cell1-server-0\" (UID: \"35926f65-848d-4db5-b50a-deef510ce4be\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:52:48 crc kubenswrapper[4773]: I0120 18:52:48.726372 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/35926f65-848d-4db5-b50a-deef510ce4be-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"35926f65-848d-4db5-b50a-deef510ce4be\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:52:48 crc kubenswrapper[4773]: I0120 18:52:48.726414 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/35926f65-848d-4db5-b50a-deef510ce4be-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"35926f65-848d-4db5-b50a-deef510ce4be\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:52:48 crc kubenswrapper[4773]: I0120 18:52:48.726438 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/35926f65-848d-4db5-b50a-deef510ce4be-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"35926f65-848d-4db5-b50a-deef510ce4be\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:52:48 crc kubenswrapper[4773]: I0120 18:52:48.726516 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"35926f65-848d-4db5-b50a-deef510ce4be\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:52:48 crc 
kubenswrapper[4773]: I0120 18:52:48.726591 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/35926f65-848d-4db5-b50a-deef510ce4be-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"35926f65-848d-4db5-b50a-deef510ce4be\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:52:48 crc kubenswrapper[4773]: I0120 18:52:48.727189 4773 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"35926f65-848d-4db5-b50a-deef510ce4be\") device mount path \"/mnt/openstack/pv09\"" pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:52:48 crc kubenswrapper[4773]: I0120 18:52:48.727307 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/35926f65-848d-4db5-b50a-deef510ce4be-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"35926f65-848d-4db5-b50a-deef510ce4be\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:52:48 crc kubenswrapper[4773]: I0120 18:52:48.727431 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/35926f65-848d-4db5-b50a-deef510ce4be-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"35926f65-848d-4db5-b50a-deef510ce4be\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:52:48 crc kubenswrapper[4773]: I0120 18:52:48.727463 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/35926f65-848d-4db5-b50a-deef510ce4be-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"35926f65-848d-4db5-b50a-deef510ce4be\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:52:48 crc kubenswrapper[4773]: I0120 18:52:48.728026 4773 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/35926f65-848d-4db5-b50a-deef510ce4be-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"35926f65-848d-4db5-b50a-deef510ce4be\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:52:48 crc kubenswrapper[4773]: I0120 18:52:48.728148 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/35926f65-848d-4db5-b50a-deef510ce4be-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"35926f65-848d-4db5-b50a-deef510ce4be\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:52:48 crc kubenswrapper[4773]: I0120 18:52:48.732432 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/35926f65-848d-4db5-b50a-deef510ce4be-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"35926f65-848d-4db5-b50a-deef510ce4be\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:52:48 crc kubenswrapper[4773]: I0120 18:52:48.734250 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/35926f65-848d-4db5-b50a-deef510ce4be-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"35926f65-848d-4db5-b50a-deef510ce4be\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:52:48 crc kubenswrapper[4773]: I0120 18:52:48.734692 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/35926f65-848d-4db5-b50a-deef510ce4be-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"35926f65-848d-4db5-b50a-deef510ce4be\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:52:48 crc kubenswrapper[4773]: I0120 18:52:48.739085 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/35926f65-848d-4db5-b50a-deef510ce4be-rabbitmq-tls\") pod 
\"rabbitmq-cell1-server-0\" (UID: \"35926f65-848d-4db5-b50a-deef510ce4be\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:52:48 crc kubenswrapper[4773]: I0120 18:52:48.745809 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbddj\" (UniqueName: \"kubernetes.io/projected/35926f65-848d-4db5-b50a-deef510ce4be-kube-api-access-mbddj\") pod \"rabbitmq-cell1-server-0\" (UID: \"35926f65-848d-4db5-b50a-deef510ce4be\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:52:48 crc kubenswrapper[4773]: I0120 18:52:48.761582 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"35926f65-848d-4db5-b50a-deef510ce4be\") " pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:52:48 crc kubenswrapper[4773]: I0120 18:52:48.907430 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:52:49 crc kubenswrapper[4773]: I0120 18:52:49.457654 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b357137a-6e30-4ed9-a440-c9f3e90f75d8" path="/var/lib/kubelet/pods/b357137a-6e30-4ed9-a440-c9f3e90f75d8/volumes" Jan 20 18:52:50 crc kubenswrapper[4773]: I0120 18:52:50.233003 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Jan 20 18:52:50 crc kubenswrapper[4773]: W0120 18:52:50.233674 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod35926f65_848d_4db5_b50a_deef510ce4be.slice/crio-d840dc4a39867d21e70a35ebc03384578e159d52d3e5d5c6c6b32a11f062cb63 WatchSource:0}: Error finding container d840dc4a39867d21e70a35ebc03384578e159d52d3e5d5c6c6b32a11f062cb63: Status 404 returned error can't find the container with id d840dc4a39867d21e70a35ebc03384578e159d52d3e5d5c6c6b32a11f062cb63 Jan 20 18:52:50 crc 
kubenswrapper[4773]: I0120 18:52:50.471372 4773 generic.go:334] "Generic (PLEG): container finished" podID="dab58995-9c7e-426a-af86-8c1493d3c8d3" containerID="07da0829964a4fd5318fef4d1177e8c728a3eaf09e5d727c478bed39b7f42bbb" exitCode=0 Jan 20 18:52:50 crc kubenswrapper[4773]: I0120 18:52:50.473347 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6447ccbd8f-gmkcw" event={"ID":"dab58995-9c7e-426a-af86-8c1493d3c8d3","Type":"ContainerDied","Data":"07da0829964a4fd5318fef4d1177e8c728a3eaf09e5d727c478bed39b7f42bbb"} Jan 20 18:52:50 crc kubenswrapper[4773]: I0120 18:52:50.476471 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"35926f65-848d-4db5-b50a-deef510ce4be","Type":"ContainerStarted","Data":"d840dc4a39867d21e70a35ebc03384578e159d52d3e5d5c6c6b32a11f062cb63"} Jan 20 18:52:51 crc kubenswrapper[4773]: I0120 18:52:51.485860 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6447ccbd8f-gmkcw" event={"ID":"dab58995-9c7e-426a-af86-8c1493d3c8d3","Type":"ContainerStarted","Data":"978849f04e8c12b94a4bd7fb3a46f891ccab994e5c9933dcc5e76f7eb85b5fc0"} Jan 20 18:52:51 crc kubenswrapper[4773]: I0120 18:52:51.486313 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6447ccbd8f-gmkcw" Jan 20 18:52:51 crc kubenswrapper[4773]: I0120 18:52:51.504179 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6447ccbd8f-gmkcw" podStartSLOduration=4.504162007 podStartE2EDuration="4.504162007s" podCreationTimestamp="2026-01-20 18:52:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:52:51.502653971 +0000 UTC m=+1364.424466995" watchObservedRunningTime="2026-01-20 18:52:51.504162007 +0000 UTC m=+1364.425975031" Jan 20 18:52:52 crc kubenswrapper[4773]: I0120 18:52:52.494900 4773 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"35926f65-848d-4db5-b50a-deef510ce4be","Type":"ContainerStarted","Data":"87f37741dad884ad3f582962088de19df0c82ee8e7e843bdb0ffb8ddabb1883f"} Jan 20 18:52:57 crc kubenswrapper[4773]: I0120 18:52:57.457729 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6447ccbd8f-gmkcw" Jan 20 18:52:57 crc kubenswrapper[4773]: I0120 18:52:57.519795 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b856c5697-n5s7s"] Jan 20 18:52:57 crc kubenswrapper[4773]: I0120 18:52:57.520104 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5b856c5697-n5s7s" podUID="cb02b4c0-80ac-4860-8877-f507f8bc2028" containerName="dnsmasq-dns" containerID="cri-o://ac3896caf2212fe902aa7a23816384e518b7b696af2e446e0c73bfd22421a6c7" gracePeriod=10 Jan 20 18:52:57 crc kubenswrapper[4773]: I0120 18:52:57.637115 4773 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5b856c5697-n5s7s" podUID="cb02b4c0-80ac-4860-8877-f507f8bc2028" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.188:5353: connect: connection refused" Jan 20 18:52:57 crc kubenswrapper[4773]: I0120 18:52:57.661294 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-864d5fc68c-42g4p"] Jan 20 18:52:57 crc kubenswrapper[4773]: I0120 18:52:57.663543 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-864d5fc68c-42g4p" Jan 20 18:52:57 crc kubenswrapper[4773]: I0120 18:52:57.672370 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-864d5fc68c-42g4p"] Jan 20 18:52:57 crc kubenswrapper[4773]: I0120 18:52:57.697970 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/ba4ab073-f712-41fb-9b44-d83a19b72973-openstack-edpm-ipam\") pod \"dnsmasq-dns-864d5fc68c-42g4p\" (UID: \"ba4ab073-f712-41fb-9b44-d83a19b72973\") " pod="openstack/dnsmasq-dns-864d5fc68c-42g4p" Jan 20 18:52:57 crc kubenswrapper[4773]: I0120 18:52:57.698018 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ba4ab073-f712-41fb-9b44-d83a19b72973-ovsdbserver-nb\") pod \"dnsmasq-dns-864d5fc68c-42g4p\" (UID: \"ba4ab073-f712-41fb-9b44-d83a19b72973\") " pod="openstack/dnsmasq-dns-864d5fc68c-42g4p" Jan 20 18:52:57 crc kubenswrapper[4773]: I0120 18:52:57.698249 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba4ab073-f712-41fb-9b44-d83a19b72973-config\") pod \"dnsmasq-dns-864d5fc68c-42g4p\" (UID: \"ba4ab073-f712-41fb-9b44-d83a19b72973\") " pod="openstack/dnsmasq-dns-864d5fc68c-42g4p" Jan 20 18:52:57 crc kubenswrapper[4773]: I0120 18:52:57.698349 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ba4ab073-f712-41fb-9b44-d83a19b72973-dns-svc\") pod \"dnsmasq-dns-864d5fc68c-42g4p\" (UID: \"ba4ab073-f712-41fb-9b44-d83a19b72973\") " pod="openstack/dnsmasq-dns-864d5fc68c-42g4p" Jan 20 18:52:57 crc kubenswrapper[4773]: I0120 18:52:57.698570 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"kube-api-access-m92nd\" (UniqueName: \"kubernetes.io/projected/ba4ab073-f712-41fb-9b44-d83a19b72973-kube-api-access-m92nd\") pod \"dnsmasq-dns-864d5fc68c-42g4p\" (UID: \"ba4ab073-f712-41fb-9b44-d83a19b72973\") " pod="openstack/dnsmasq-dns-864d5fc68c-42g4p" Jan 20 18:52:57 crc kubenswrapper[4773]: I0120 18:52:57.698626 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ba4ab073-f712-41fb-9b44-d83a19b72973-ovsdbserver-sb\") pod \"dnsmasq-dns-864d5fc68c-42g4p\" (UID: \"ba4ab073-f712-41fb-9b44-d83a19b72973\") " pod="openstack/dnsmasq-dns-864d5fc68c-42g4p" Jan 20 18:52:57 crc kubenswrapper[4773]: I0120 18:52:57.801803 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba4ab073-f712-41fb-9b44-d83a19b72973-config\") pod \"dnsmasq-dns-864d5fc68c-42g4p\" (UID: \"ba4ab073-f712-41fb-9b44-d83a19b72973\") " pod="openstack/dnsmasq-dns-864d5fc68c-42g4p" Jan 20 18:52:57 crc kubenswrapper[4773]: I0120 18:52:57.802385 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ba4ab073-f712-41fb-9b44-d83a19b72973-dns-svc\") pod \"dnsmasq-dns-864d5fc68c-42g4p\" (UID: \"ba4ab073-f712-41fb-9b44-d83a19b72973\") " pod="openstack/dnsmasq-dns-864d5fc68c-42g4p" Jan 20 18:52:57 crc kubenswrapper[4773]: I0120 18:52:57.802433 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m92nd\" (UniqueName: \"kubernetes.io/projected/ba4ab073-f712-41fb-9b44-d83a19b72973-kube-api-access-m92nd\") pod \"dnsmasq-dns-864d5fc68c-42g4p\" (UID: \"ba4ab073-f712-41fb-9b44-d83a19b72973\") " pod="openstack/dnsmasq-dns-864d5fc68c-42g4p" Jan 20 18:52:57 crc kubenswrapper[4773]: I0120 18:52:57.802456 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/ba4ab073-f712-41fb-9b44-d83a19b72973-ovsdbserver-sb\") pod \"dnsmasq-dns-864d5fc68c-42g4p\" (UID: \"ba4ab073-f712-41fb-9b44-d83a19b72973\") " pod="openstack/dnsmasq-dns-864d5fc68c-42g4p" Jan 20 18:52:57 crc kubenswrapper[4773]: I0120 18:52:57.802571 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/ba4ab073-f712-41fb-9b44-d83a19b72973-openstack-edpm-ipam\") pod \"dnsmasq-dns-864d5fc68c-42g4p\" (UID: \"ba4ab073-f712-41fb-9b44-d83a19b72973\") " pod="openstack/dnsmasq-dns-864d5fc68c-42g4p" Jan 20 18:52:57 crc kubenswrapper[4773]: I0120 18:52:57.802601 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ba4ab073-f712-41fb-9b44-d83a19b72973-ovsdbserver-nb\") pod \"dnsmasq-dns-864d5fc68c-42g4p\" (UID: \"ba4ab073-f712-41fb-9b44-d83a19b72973\") " pod="openstack/dnsmasq-dns-864d5fc68c-42g4p" Jan 20 18:52:57 crc kubenswrapper[4773]: I0120 18:52:57.802665 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba4ab073-f712-41fb-9b44-d83a19b72973-config\") pod \"dnsmasq-dns-864d5fc68c-42g4p\" (UID: \"ba4ab073-f712-41fb-9b44-d83a19b72973\") " pod="openstack/dnsmasq-dns-864d5fc68c-42g4p" Jan 20 18:52:57 crc kubenswrapper[4773]: I0120 18:52:57.803343 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ba4ab073-f712-41fb-9b44-d83a19b72973-dns-svc\") pod \"dnsmasq-dns-864d5fc68c-42g4p\" (UID: \"ba4ab073-f712-41fb-9b44-d83a19b72973\") " pod="openstack/dnsmasq-dns-864d5fc68c-42g4p" Jan 20 18:52:57 crc kubenswrapper[4773]: I0120 18:52:57.803497 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ba4ab073-f712-41fb-9b44-d83a19b72973-ovsdbserver-sb\") pod 
\"dnsmasq-dns-864d5fc68c-42g4p\" (UID: \"ba4ab073-f712-41fb-9b44-d83a19b72973\") " pod="openstack/dnsmasq-dns-864d5fc68c-42g4p" Jan 20 18:52:57 crc kubenswrapper[4773]: I0120 18:52:57.803554 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/ba4ab073-f712-41fb-9b44-d83a19b72973-openstack-edpm-ipam\") pod \"dnsmasq-dns-864d5fc68c-42g4p\" (UID: \"ba4ab073-f712-41fb-9b44-d83a19b72973\") " pod="openstack/dnsmasq-dns-864d5fc68c-42g4p" Jan 20 18:52:57 crc kubenswrapper[4773]: I0120 18:52:57.803649 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ba4ab073-f712-41fb-9b44-d83a19b72973-ovsdbserver-nb\") pod \"dnsmasq-dns-864d5fc68c-42g4p\" (UID: \"ba4ab073-f712-41fb-9b44-d83a19b72973\") " pod="openstack/dnsmasq-dns-864d5fc68c-42g4p" Jan 20 18:52:57 crc kubenswrapper[4773]: I0120 18:52:57.823907 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m92nd\" (UniqueName: \"kubernetes.io/projected/ba4ab073-f712-41fb-9b44-d83a19b72973-kube-api-access-m92nd\") pod \"dnsmasq-dns-864d5fc68c-42g4p\" (UID: \"ba4ab073-f712-41fb-9b44-d83a19b72973\") " pod="openstack/dnsmasq-dns-864d5fc68c-42g4p" Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.038557 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-864d5fc68c-42g4p" Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.170656 4773 patch_prober.go:28] interesting pod/machine-config-daemon-sq4x7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.171042 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.171093 4773 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.171801 4773 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"14437c4854d46fdc109569c8299e656b31c4aa4992133183b5fa2d3fd5cee7bb"} pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.171861 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" containerID="cri-o://14437c4854d46fdc109569c8299e656b31c4aa4992133183b5fa2d3fd5cee7bb" gracePeriod=600 Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.453894 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b856c5697-n5s7s" Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.511094 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-864d5fc68c-42g4p"] Jan 20 18:52:58 crc kubenswrapper[4773]: W0120 18:52:58.514753 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podba4ab073_f712_41fb_9b44_d83a19b72973.slice/crio-22400a6c79ad5ab061f9a63ebc6202a7ab7e2454b3e31cdcb85e885543c42eb5 WatchSource:0}: Error finding container 22400a6c79ad5ab061f9a63ebc6202a7ab7e2454b3e31cdcb85e885543c42eb5: Status 404 returned error can't find the container with id 22400a6c79ad5ab061f9a63ebc6202a7ab7e2454b3e31cdcb85e885543c42eb5 Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.523034 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cb02b4c0-80ac-4860-8877-f507f8bc2028-ovsdbserver-nb\") pod \"cb02b4c0-80ac-4860-8877-f507f8bc2028\" (UID: \"cb02b4c0-80ac-4860-8877-f507f8bc2028\") " Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.523118 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb02b4c0-80ac-4860-8877-f507f8bc2028-config\") pod \"cb02b4c0-80ac-4860-8877-f507f8bc2028\" (UID: \"cb02b4c0-80ac-4860-8877-f507f8bc2028\") " Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.523327 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l6sxl\" (UniqueName: \"kubernetes.io/projected/cb02b4c0-80ac-4860-8877-f507f8bc2028-kube-api-access-l6sxl\") pod \"cb02b4c0-80ac-4860-8877-f507f8bc2028\" (UID: \"cb02b4c0-80ac-4860-8877-f507f8bc2028\") " Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.523366 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/cb02b4c0-80ac-4860-8877-f507f8bc2028-ovsdbserver-sb\") pod \"cb02b4c0-80ac-4860-8877-f507f8bc2028\" (UID: \"cb02b4c0-80ac-4860-8877-f507f8bc2028\") " Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.523420 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cb02b4c0-80ac-4860-8877-f507f8bc2028-dns-svc\") pod \"cb02b4c0-80ac-4860-8877-f507f8bc2028\" (UID: \"cb02b4c0-80ac-4860-8877-f507f8bc2028\") " Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.531090 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb02b4c0-80ac-4860-8877-f507f8bc2028-kube-api-access-l6sxl" (OuterVolumeSpecName: "kube-api-access-l6sxl") pod "cb02b4c0-80ac-4860-8877-f507f8bc2028" (UID: "cb02b4c0-80ac-4860-8877-f507f8bc2028"). InnerVolumeSpecName "kube-api-access-l6sxl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.563349 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-864d5fc68c-42g4p" event={"ID":"ba4ab073-f712-41fb-9b44-d83a19b72973","Type":"ContainerStarted","Data":"22400a6c79ad5ab061f9a63ebc6202a7ab7e2454b3e31cdcb85e885543c42eb5"} Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.565122 4773 generic.go:334] "Generic (PLEG): container finished" podID="cb02b4c0-80ac-4860-8877-f507f8bc2028" containerID="ac3896caf2212fe902aa7a23816384e518b7b696af2e446e0c73bfd22421a6c7" exitCode=0 Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.565279 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5b856c5697-n5s7s" Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.566245 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b856c5697-n5s7s" event={"ID":"cb02b4c0-80ac-4860-8877-f507f8bc2028","Type":"ContainerDied","Data":"ac3896caf2212fe902aa7a23816384e518b7b696af2e446e0c73bfd22421a6c7"} Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.566282 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5b856c5697-n5s7s" event={"ID":"cb02b4c0-80ac-4860-8877-f507f8bc2028","Type":"ContainerDied","Data":"d35e4b0ceb4787bad4a95ece01b001352569b9bbc51780aec8a8c24c4fa207e2"} Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.566312 4773 scope.go:117] "RemoveContainer" containerID="ac3896caf2212fe902aa7a23816384e518b7b696af2e446e0c73bfd22421a6c7" Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.573508 4773 generic.go:334] "Generic (PLEG): container finished" podID="1ddd934f-f012-4083-b5e6-b99711071621" containerID="14437c4854d46fdc109569c8299e656b31c4aa4992133183b5fa2d3fd5cee7bb" exitCode=0 Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.573553 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" event={"ID":"1ddd934f-f012-4083-b5e6-b99711071621","Type":"ContainerDied","Data":"14437c4854d46fdc109569c8299e656b31c4aa4992133183b5fa2d3fd5cee7bb"} Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.573585 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" event={"ID":"1ddd934f-f012-4083-b5e6-b99711071621","Type":"ContainerStarted","Data":"ce9aed1d163b3887a1d1032f9c03b611a4ecaff0809c53905de6852740ee0630"} Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.581810 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/cb02b4c0-80ac-4860-8877-f507f8bc2028-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "cb02b4c0-80ac-4860-8877-f507f8bc2028" (UID: "cb02b4c0-80ac-4860-8877-f507f8bc2028"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.590352 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb02b4c0-80ac-4860-8877-f507f8bc2028-config" (OuterVolumeSpecName: "config") pod "cb02b4c0-80ac-4860-8877-f507f8bc2028" (UID: "cb02b4c0-80ac-4860-8877-f507f8bc2028"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.590613 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb02b4c0-80ac-4860-8877-f507f8bc2028-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "cb02b4c0-80ac-4860-8877-f507f8bc2028" (UID: "cb02b4c0-80ac-4860-8877-f507f8bc2028"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.595872 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb02b4c0-80ac-4860-8877-f507f8bc2028-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "cb02b4c0-80ac-4860-8877-f507f8bc2028" (UID: "cb02b4c0-80ac-4860-8877-f507f8bc2028"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.623761 4773 scope.go:117] "RemoveContainer" containerID="69bef84f465e5a23bb9b67d0db87c263ac661afd9f941c9238b37b4e986cc9dd" Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.628381 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb02b4c0-80ac-4860-8877-f507f8bc2028-config\") on node \"crc\" DevicePath \"\"" Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.628415 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l6sxl\" (UniqueName: \"kubernetes.io/projected/cb02b4c0-80ac-4860-8877-f507f8bc2028-kube-api-access-l6sxl\") on node \"crc\" DevicePath \"\"" Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.628436 4773 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cb02b4c0-80ac-4860-8877-f507f8bc2028-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.628450 4773 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cb02b4c0-80ac-4860-8877-f507f8bc2028-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.628461 4773 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cb02b4c0-80ac-4860-8877-f507f8bc2028-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.709554 4773 scope.go:117] "RemoveContainer" containerID="ac3896caf2212fe902aa7a23816384e518b7b696af2e446e0c73bfd22421a6c7" Jan 20 18:52:58 crc kubenswrapper[4773]: E0120 18:52:58.710267 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac3896caf2212fe902aa7a23816384e518b7b696af2e446e0c73bfd22421a6c7\": 
container with ID starting with ac3896caf2212fe902aa7a23816384e518b7b696af2e446e0c73bfd22421a6c7 not found: ID does not exist" containerID="ac3896caf2212fe902aa7a23816384e518b7b696af2e446e0c73bfd22421a6c7" Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.710317 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac3896caf2212fe902aa7a23816384e518b7b696af2e446e0c73bfd22421a6c7"} err="failed to get container status \"ac3896caf2212fe902aa7a23816384e518b7b696af2e446e0c73bfd22421a6c7\": rpc error: code = NotFound desc = could not find container \"ac3896caf2212fe902aa7a23816384e518b7b696af2e446e0c73bfd22421a6c7\": container with ID starting with ac3896caf2212fe902aa7a23816384e518b7b696af2e446e0c73bfd22421a6c7 not found: ID does not exist" Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.710345 4773 scope.go:117] "RemoveContainer" containerID="69bef84f465e5a23bb9b67d0db87c263ac661afd9f941c9238b37b4e986cc9dd" Jan 20 18:52:58 crc kubenswrapper[4773]: E0120 18:52:58.711176 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69bef84f465e5a23bb9b67d0db87c263ac661afd9f941c9238b37b4e986cc9dd\": container with ID starting with 69bef84f465e5a23bb9b67d0db87c263ac661afd9f941c9238b37b4e986cc9dd not found: ID does not exist" containerID="69bef84f465e5a23bb9b67d0db87c263ac661afd9f941c9238b37b4e986cc9dd" Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.711237 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69bef84f465e5a23bb9b67d0db87c263ac661afd9f941c9238b37b4e986cc9dd"} err="failed to get container status \"69bef84f465e5a23bb9b67d0db87c263ac661afd9f941c9238b37b4e986cc9dd\": rpc error: code = NotFound desc = could not find container \"69bef84f465e5a23bb9b67d0db87c263ac661afd9f941c9238b37b4e986cc9dd\": container with ID starting with 
69bef84f465e5a23bb9b67d0db87c263ac661afd9f941c9238b37b4e986cc9dd not found: ID does not exist" Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.711269 4773 scope.go:117] "RemoveContainer" containerID="f170dc7a6e6cd01e0186e6b45c72b1bd89b3220f96cf1e35088901106c87b344" Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.771893 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fqth8"] Jan 20 18:52:58 crc kubenswrapper[4773]: E0120 18:52:58.772293 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb02b4c0-80ac-4860-8877-f507f8bc2028" containerName="init" Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.772309 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb02b4c0-80ac-4860-8877-f507f8bc2028" containerName="init" Jan 20 18:52:58 crc kubenswrapper[4773]: E0120 18:52:58.772318 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb02b4c0-80ac-4860-8877-f507f8bc2028" containerName="dnsmasq-dns" Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.772325 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb02b4c0-80ac-4860-8877-f507f8bc2028" containerName="dnsmasq-dns" Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.772482 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb02b4c0-80ac-4860-8877-f507f8bc2028" containerName="dnsmasq-dns" Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.773238 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fqth8" Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.777317 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.777578 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rzxsv" Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.779127 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.779162 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.786890 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fqth8"] Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.837913 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02289c77-b6e5-4419-8dc4-597648db0e01-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-fqth8\" (UID: \"02289c77-b6e5-4419-8dc4-597648db0e01\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fqth8" Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.838202 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/02289c77-b6e5-4419-8dc4-597648db0e01-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-fqth8\" (UID: \"02289c77-b6e5-4419-8dc4-597648db0e01\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fqth8" Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.838278 4773 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/02289c77-b6e5-4419-8dc4-597648db0e01-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-fqth8\" (UID: \"02289c77-b6e5-4419-8dc4-597648db0e01\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fqth8" Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.838410 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pvb59\" (UniqueName: \"kubernetes.io/projected/02289c77-b6e5-4419-8dc4-597648db0e01-kube-api-access-pvb59\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-fqth8\" (UID: \"02289c77-b6e5-4419-8dc4-597648db0e01\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fqth8" Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.940368 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pvb59\" (UniqueName: \"kubernetes.io/projected/02289c77-b6e5-4419-8dc4-597648db0e01-kube-api-access-pvb59\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-fqth8\" (UID: \"02289c77-b6e5-4419-8dc4-597648db0e01\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fqth8" Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.940476 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02289c77-b6e5-4419-8dc4-597648db0e01-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-fqth8\" (UID: \"02289c77-b6e5-4419-8dc4-597648db0e01\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fqth8" Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.940543 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/02289c77-b6e5-4419-8dc4-597648db0e01-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-fqth8\" (UID: \"02289c77-b6e5-4419-8dc4-597648db0e01\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fqth8" Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.940572 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/02289c77-b6e5-4419-8dc4-597648db0e01-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-fqth8\" (UID: \"02289c77-b6e5-4419-8dc4-597648db0e01\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fqth8" Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.951413 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/02289c77-b6e5-4419-8dc4-597648db0e01-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-fqth8\" (UID: \"02289c77-b6e5-4419-8dc4-597648db0e01\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fqth8" Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.951644 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5b856c5697-n5s7s"] Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.959596 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02289c77-b6e5-4419-8dc4-597648db0e01-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-fqth8\" (UID: \"02289c77-b6e5-4419-8dc4-597648db0e01\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fqth8" Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.960516 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/02289c77-b6e5-4419-8dc4-597648db0e01-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-fqth8\" (UID: \"02289c77-b6e5-4419-8dc4-597648db0e01\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fqth8" Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.961413 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5b856c5697-n5s7s"] Jan 20 18:52:58 crc kubenswrapper[4773]: I0120 18:52:58.967641 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pvb59\" (UniqueName: \"kubernetes.io/projected/02289c77-b6e5-4419-8dc4-597648db0e01-kube-api-access-pvb59\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-fqth8\" (UID: \"02289c77-b6e5-4419-8dc4-597648db0e01\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fqth8" Jan 20 18:52:59 crc kubenswrapper[4773]: I0120 18:52:59.119308 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fqth8" Jan 20 18:52:59 crc kubenswrapper[4773]: I0120 18:52:59.458629 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb02b4c0-80ac-4860-8877-f507f8bc2028" path="/var/lib/kubelet/pods/cb02b4c0-80ac-4860-8877-f507f8bc2028/volumes" Jan 20 18:52:59 crc kubenswrapper[4773]: I0120 18:52:59.588457 4773 generic.go:334] "Generic (PLEG): container finished" podID="ba4ab073-f712-41fb-9b44-d83a19b72973" containerID="f7aec563576030ac1c13e7cfb223eea2c4098b2ad34114c7a7b21eb120d4d273" exitCode=0 Jan 20 18:52:59 crc kubenswrapper[4773]: I0120 18:52:59.588527 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-864d5fc68c-42g4p" event={"ID":"ba4ab073-f712-41fb-9b44-d83a19b72973","Type":"ContainerDied","Data":"f7aec563576030ac1c13e7cfb223eea2c4098b2ad34114c7a7b21eb120d4d273"} Jan 20 18:52:59 crc kubenswrapper[4773]: I0120 18:52:59.683296 4773 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fqth8"] Jan 20 18:52:59 crc kubenswrapper[4773]: W0120 18:52:59.691216 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod02289c77_b6e5_4419_8dc4_597648db0e01.slice/crio-eb4dfa5a9aefc0d318d78899613e93d3a3d04049c2d7b326336614864fa414ec WatchSource:0}: Error finding container eb4dfa5a9aefc0d318d78899613e93d3a3d04049c2d7b326336614864fa414ec: Status 404 returned error can't find the container with id eb4dfa5a9aefc0d318d78899613e93d3a3d04049c2d7b326336614864fa414ec Jan 20 18:52:59 crc kubenswrapper[4773]: I0120 18:52:59.693942 4773 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 20 18:53:00 crc kubenswrapper[4773]: I0120 18:53:00.604262 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fqth8" event={"ID":"02289c77-b6e5-4419-8dc4-597648db0e01","Type":"ContainerStarted","Data":"eb4dfa5a9aefc0d318d78899613e93d3a3d04049c2d7b326336614864fa414ec"} Jan 20 18:53:00 crc kubenswrapper[4773]: I0120 18:53:00.607300 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-864d5fc68c-42g4p" event={"ID":"ba4ab073-f712-41fb-9b44-d83a19b72973","Type":"ContainerStarted","Data":"0db8dcf4f0eb56d128fff23942f9569af545eaf303471ed6ae63a9dec8023480"} Jan 20 18:53:00 crc kubenswrapper[4773]: I0120 18:53:00.607849 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-864d5fc68c-42g4p" Jan 20 18:53:00 crc kubenswrapper[4773]: I0120 18:53:00.630103 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-864d5fc68c-42g4p" podStartSLOduration=3.630079628 podStartE2EDuration="3.630079628s" podCreationTimestamp="2026-01-20 18:52:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:53:00.627376344 +0000 UTC m=+1373.549189378" watchObservedRunningTime="2026-01-20 18:53:00.630079628 +0000 UTC m=+1373.551892652" Jan 20 18:53:08 crc kubenswrapper[4773]: I0120 18:53:08.040227 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-864d5fc68c-42g4p" Jan 20 18:53:08 crc kubenswrapper[4773]: I0120 18:53:08.145016 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6447ccbd8f-gmkcw"] Jan 20 18:53:08 crc kubenswrapper[4773]: I0120 18:53:08.145333 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6447ccbd8f-gmkcw" podUID="dab58995-9c7e-426a-af86-8c1493d3c8d3" containerName="dnsmasq-dns" containerID="cri-o://978849f04e8c12b94a4bd7fb3a46f891ccab994e5c9933dcc5e76f7eb85b5fc0" gracePeriod=10 Jan 20 18:53:09 crc kubenswrapper[4773]: I0120 18:53:09.485697 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 20 18:53:09 crc kubenswrapper[4773]: I0120 18:53:09.679148 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6447ccbd8f-gmkcw" Jan 20 18:53:09 crc kubenswrapper[4773]: I0120 18:53:09.695974 4773 generic.go:334] "Generic (PLEG): container finished" podID="dab58995-9c7e-426a-af86-8c1493d3c8d3" containerID="978849f04e8c12b94a4bd7fb3a46f891ccab994e5c9933dcc5e76f7eb85b5fc0" exitCode=0 Jan 20 18:53:09 crc kubenswrapper[4773]: I0120 18:53:09.696038 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6447ccbd8f-gmkcw" Jan 20 18:53:09 crc kubenswrapper[4773]: I0120 18:53:09.696053 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6447ccbd8f-gmkcw" event={"ID":"dab58995-9c7e-426a-af86-8c1493d3c8d3","Type":"ContainerDied","Data":"978849f04e8c12b94a4bd7fb3a46f891ccab994e5c9933dcc5e76f7eb85b5fc0"} Jan 20 18:53:09 crc kubenswrapper[4773]: I0120 18:53:09.696082 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6447ccbd8f-gmkcw" event={"ID":"dab58995-9c7e-426a-af86-8c1493d3c8d3","Type":"ContainerDied","Data":"0c67ddb7929fcd701c5dbeae79f91dbcac0a54ecd8de24b8c2fb3ad1c9a4e17a"} Jan 20 18:53:09 crc kubenswrapper[4773]: I0120 18:53:09.696096 4773 scope.go:117] "RemoveContainer" containerID="978849f04e8c12b94a4bd7fb3a46f891ccab994e5c9933dcc5e76f7eb85b5fc0" Jan 20 18:53:09 crc kubenswrapper[4773]: I0120 18:53:09.743898 4773 scope.go:117] "RemoveContainer" containerID="07da0829964a4fd5318fef4d1177e8c728a3eaf09e5d727c478bed39b7f42bbb" Jan 20 18:53:09 crc kubenswrapper[4773]: I0120 18:53:09.764352 4773 scope.go:117] "RemoveContainer" containerID="978849f04e8c12b94a4bd7fb3a46f891ccab994e5c9933dcc5e76f7eb85b5fc0" Jan 20 18:53:09 crc kubenswrapper[4773]: E0120 18:53:09.764950 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"978849f04e8c12b94a4bd7fb3a46f891ccab994e5c9933dcc5e76f7eb85b5fc0\": container with ID starting with 978849f04e8c12b94a4bd7fb3a46f891ccab994e5c9933dcc5e76f7eb85b5fc0 not found: ID does not exist" containerID="978849f04e8c12b94a4bd7fb3a46f891ccab994e5c9933dcc5e76f7eb85b5fc0" Jan 20 18:53:09 crc kubenswrapper[4773]: I0120 18:53:09.764994 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"978849f04e8c12b94a4bd7fb3a46f891ccab994e5c9933dcc5e76f7eb85b5fc0"} err="failed to get container status 
\"978849f04e8c12b94a4bd7fb3a46f891ccab994e5c9933dcc5e76f7eb85b5fc0\": rpc error: code = NotFound desc = could not find container \"978849f04e8c12b94a4bd7fb3a46f891ccab994e5c9933dcc5e76f7eb85b5fc0\": container with ID starting with 978849f04e8c12b94a4bd7fb3a46f891ccab994e5c9933dcc5e76f7eb85b5fc0 not found: ID does not exist" Jan 20 18:53:09 crc kubenswrapper[4773]: I0120 18:53:09.765021 4773 scope.go:117] "RemoveContainer" containerID="07da0829964a4fd5318fef4d1177e8c728a3eaf09e5d727c478bed39b7f42bbb" Jan 20 18:53:09 crc kubenswrapper[4773]: E0120 18:53:09.765343 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07da0829964a4fd5318fef4d1177e8c728a3eaf09e5d727c478bed39b7f42bbb\": container with ID starting with 07da0829964a4fd5318fef4d1177e8c728a3eaf09e5d727c478bed39b7f42bbb not found: ID does not exist" containerID="07da0829964a4fd5318fef4d1177e8c728a3eaf09e5d727c478bed39b7f42bbb" Jan 20 18:53:09 crc kubenswrapper[4773]: I0120 18:53:09.765376 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07da0829964a4fd5318fef4d1177e8c728a3eaf09e5d727c478bed39b7f42bbb"} err="failed to get container status \"07da0829964a4fd5318fef4d1177e8c728a3eaf09e5d727c478bed39b7f42bbb\": rpc error: code = NotFound desc = could not find container \"07da0829964a4fd5318fef4d1177e8c728a3eaf09e5d727c478bed39b7f42bbb\": container with ID starting with 07da0829964a4fd5318fef4d1177e8c728a3eaf09e5d727c478bed39b7f42bbb not found: ID does not exist" Jan 20 18:53:09 crc kubenswrapper[4773]: I0120 18:53:09.862522 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/dab58995-9c7e-426a-af86-8c1493d3c8d3-openstack-edpm-ipam\") pod \"dab58995-9c7e-426a-af86-8c1493d3c8d3\" (UID: \"dab58995-9c7e-426a-af86-8c1493d3c8d3\") " Jan 20 18:53:09 crc kubenswrapper[4773]: I0120 18:53:09.862637 4773 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/dab58995-9c7e-426a-af86-8c1493d3c8d3-ovsdbserver-sb\") pod \"dab58995-9c7e-426a-af86-8c1493d3c8d3\" (UID: \"dab58995-9c7e-426a-af86-8c1493d3c8d3\") " Jan 20 18:53:09 crc kubenswrapper[4773]: I0120 18:53:09.862783 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tzxdh\" (UniqueName: \"kubernetes.io/projected/dab58995-9c7e-426a-af86-8c1493d3c8d3-kube-api-access-tzxdh\") pod \"dab58995-9c7e-426a-af86-8c1493d3c8d3\" (UID: \"dab58995-9c7e-426a-af86-8c1493d3c8d3\") " Jan 20 18:53:09 crc kubenswrapper[4773]: I0120 18:53:09.862841 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dab58995-9c7e-426a-af86-8c1493d3c8d3-config\") pod \"dab58995-9c7e-426a-af86-8c1493d3c8d3\" (UID: \"dab58995-9c7e-426a-af86-8c1493d3c8d3\") " Jan 20 18:53:09 crc kubenswrapper[4773]: I0120 18:53:09.862862 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dab58995-9c7e-426a-af86-8c1493d3c8d3-ovsdbserver-nb\") pod \"dab58995-9c7e-426a-af86-8c1493d3c8d3\" (UID: \"dab58995-9c7e-426a-af86-8c1493d3c8d3\") " Jan 20 18:53:09 crc kubenswrapper[4773]: I0120 18:53:09.862881 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dab58995-9c7e-426a-af86-8c1493d3c8d3-dns-svc\") pod \"dab58995-9c7e-426a-af86-8c1493d3c8d3\" (UID: \"dab58995-9c7e-426a-af86-8c1493d3c8d3\") " Jan 20 18:53:09 crc kubenswrapper[4773]: I0120 18:53:09.868997 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dab58995-9c7e-426a-af86-8c1493d3c8d3-kube-api-access-tzxdh" (OuterVolumeSpecName: "kube-api-access-tzxdh") pod 
"dab58995-9c7e-426a-af86-8c1493d3c8d3" (UID: "dab58995-9c7e-426a-af86-8c1493d3c8d3"). InnerVolumeSpecName "kube-api-access-tzxdh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:53:09 crc kubenswrapper[4773]: I0120 18:53:09.904259 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dab58995-9c7e-426a-af86-8c1493d3c8d3-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "dab58995-9c7e-426a-af86-8c1493d3c8d3" (UID: "dab58995-9c7e-426a-af86-8c1493d3c8d3"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:53:09 crc kubenswrapper[4773]: I0120 18:53:09.906812 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dab58995-9c7e-426a-af86-8c1493d3c8d3-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "dab58995-9c7e-426a-af86-8c1493d3c8d3" (UID: "dab58995-9c7e-426a-af86-8c1493d3c8d3"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:53:09 crc kubenswrapper[4773]: I0120 18:53:09.911515 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dab58995-9c7e-426a-af86-8c1493d3c8d3-config" (OuterVolumeSpecName: "config") pod "dab58995-9c7e-426a-af86-8c1493d3c8d3" (UID: "dab58995-9c7e-426a-af86-8c1493d3c8d3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:53:09 crc kubenswrapper[4773]: I0120 18:53:09.920094 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dab58995-9c7e-426a-af86-8c1493d3c8d3-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "dab58995-9c7e-426a-af86-8c1493d3c8d3" (UID: "dab58995-9c7e-426a-af86-8c1493d3c8d3"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:53:09 crc kubenswrapper[4773]: I0120 18:53:09.939322 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dab58995-9c7e-426a-af86-8c1493d3c8d3-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "dab58995-9c7e-426a-af86-8c1493d3c8d3" (UID: "dab58995-9c7e-426a-af86-8c1493d3c8d3"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 18:53:09 crc kubenswrapper[4773]: I0120 18:53:09.968832 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tzxdh\" (UniqueName: \"kubernetes.io/projected/dab58995-9c7e-426a-af86-8c1493d3c8d3-kube-api-access-tzxdh\") on node \"crc\" DevicePath \"\"" Jan 20 18:53:09 crc kubenswrapper[4773]: I0120 18:53:09.969083 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dab58995-9c7e-426a-af86-8c1493d3c8d3-config\") on node \"crc\" DevicePath \"\"" Jan 20 18:53:09 crc kubenswrapper[4773]: I0120 18:53:09.969190 4773 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/dab58995-9c7e-426a-af86-8c1493d3c8d3-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 20 18:53:09 crc kubenswrapper[4773]: I0120 18:53:09.969254 4773 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/dab58995-9c7e-426a-af86-8c1493d3c8d3-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 20 18:53:09 crc kubenswrapper[4773]: I0120 18:53:09.969335 4773 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/dab58995-9c7e-426a-af86-8c1493d3c8d3-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 20 18:53:09 crc kubenswrapper[4773]: I0120 18:53:09.969739 4773 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/dab58995-9c7e-426a-af86-8c1493d3c8d3-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 20 18:53:10 crc kubenswrapper[4773]: I0120 18:53:10.047361 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6447ccbd8f-gmkcw"] Jan 20 18:53:10 crc kubenswrapper[4773]: I0120 18:53:10.055567 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6447ccbd8f-gmkcw"] Jan 20 18:53:10 crc kubenswrapper[4773]: I0120 18:53:10.707436 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fqth8" event={"ID":"02289c77-b6e5-4419-8dc4-597648db0e01","Type":"ContainerStarted","Data":"55661dbd2f275de065e0d3e995f3bc1225e2c652531b5ee1235ad9050f245384"} Jan 20 18:53:10 crc kubenswrapper[4773]: I0120 18:53:10.722598 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fqth8" podStartSLOduration=2.933150977 podStartE2EDuration="12.722580042s" podCreationTimestamp="2026-01-20 18:52:58 +0000 UTC" firstStartedPulling="2026-01-20 18:52:59.693697444 +0000 UTC m=+1372.615510468" lastFinishedPulling="2026-01-20 18:53:09.483126509 +0000 UTC m=+1382.404939533" observedRunningTime="2026-01-20 18:53:10.718790862 +0000 UTC m=+1383.640603906" watchObservedRunningTime="2026-01-20 18:53:10.722580042 +0000 UTC m=+1383.644393066" Jan 20 18:53:11 crc kubenswrapper[4773]: I0120 18:53:11.458456 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dab58995-9c7e-426a-af86-8c1493d3c8d3" path="/var/lib/kubelet/pods/dab58995-9c7e-426a-af86-8c1493d3c8d3/volumes" Jan 20 18:53:18 crc kubenswrapper[4773]: I0120 18:53:18.778372 4773 generic.go:334] "Generic (PLEG): container finished" podID="375735e1-5d2a-4cc8-892b-4bdcdf9f1e42" containerID="418d0281e7537f13466deec1b8302c4b39134fd0b052576e8914257e428b8ed2" exitCode=0 Jan 20 18:53:18 crc kubenswrapper[4773]: I0120 
18:53:18.778508 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"375735e1-5d2a-4cc8-892b-4bdcdf9f1e42","Type":"ContainerDied","Data":"418d0281e7537f13466deec1b8302c4b39134fd0b052576e8914257e428b8ed2"} Jan 20 18:53:19 crc kubenswrapper[4773]: I0120 18:53:19.792437 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"375735e1-5d2a-4cc8-892b-4bdcdf9f1e42","Type":"ContainerStarted","Data":"e2d8b223c68401e4f892191eb30679316dc5a26d5f2beba30b7c02008ae6195a"} Jan 20 18:53:19 crc kubenswrapper[4773]: I0120 18:53:19.793298 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Jan 20 18:53:19 crc kubenswrapper[4773]: I0120 18:53:19.817259 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.8172348 podStartE2EDuration="36.8172348s" podCreationTimestamp="2026-01-20 18:52:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:53:19.813696326 +0000 UTC m=+1392.735509360" watchObservedRunningTime="2026-01-20 18:53:19.8172348 +0000 UTC m=+1392.739047824" Jan 20 18:53:20 crc kubenswrapper[4773]: I0120 18:53:20.801175 4773 generic.go:334] "Generic (PLEG): container finished" podID="02289c77-b6e5-4419-8dc4-597648db0e01" containerID="55661dbd2f275de065e0d3e995f3bc1225e2c652531b5ee1235ad9050f245384" exitCode=0 Jan 20 18:53:20 crc kubenswrapper[4773]: I0120 18:53:20.801715 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fqth8" event={"ID":"02289c77-b6e5-4419-8dc4-597648db0e01","Type":"ContainerDied","Data":"55661dbd2f275de065e0d3e995f3bc1225e2c652531b5ee1235ad9050f245384"} Jan 20 18:53:22 crc kubenswrapper[4773]: I0120 18:53:22.196617 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fqth8" Jan 20 18:53:22 crc kubenswrapper[4773]: I0120 18:53:22.294731 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/02289c77-b6e5-4419-8dc4-597648db0e01-ssh-key-openstack-edpm-ipam\") pod \"02289c77-b6e5-4419-8dc4-597648db0e01\" (UID: \"02289c77-b6e5-4419-8dc4-597648db0e01\") " Jan 20 18:53:22 crc kubenswrapper[4773]: I0120 18:53:22.294910 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pvb59\" (UniqueName: \"kubernetes.io/projected/02289c77-b6e5-4419-8dc4-597648db0e01-kube-api-access-pvb59\") pod \"02289c77-b6e5-4419-8dc4-597648db0e01\" (UID: \"02289c77-b6e5-4419-8dc4-597648db0e01\") " Jan 20 18:53:22 crc kubenswrapper[4773]: I0120 18:53:22.295103 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/02289c77-b6e5-4419-8dc4-597648db0e01-inventory\") pod \"02289c77-b6e5-4419-8dc4-597648db0e01\" (UID: \"02289c77-b6e5-4419-8dc4-597648db0e01\") " Jan 20 18:53:22 crc kubenswrapper[4773]: I0120 18:53:22.295133 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02289c77-b6e5-4419-8dc4-597648db0e01-repo-setup-combined-ca-bundle\") pod \"02289c77-b6e5-4419-8dc4-597648db0e01\" (UID: \"02289c77-b6e5-4419-8dc4-597648db0e01\") " Jan 20 18:53:22 crc kubenswrapper[4773]: I0120 18:53:22.308328 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02289c77-b6e5-4419-8dc4-597648db0e01-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "02289c77-b6e5-4419-8dc4-597648db0e01" (UID: "02289c77-b6e5-4419-8dc4-597648db0e01"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:53:22 crc kubenswrapper[4773]: I0120 18:53:22.308355 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/02289c77-b6e5-4419-8dc4-597648db0e01-kube-api-access-pvb59" (OuterVolumeSpecName: "kube-api-access-pvb59") pod "02289c77-b6e5-4419-8dc4-597648db0e01" (UID: "02289c77-b6e5-4419-8dc4-597648db0e01"). InnerVolumeSpecName "kube-api-access-pvb59". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:53:22 crc kubenswrapper[4773]: I0120 18:53:22.324134 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02289c77-b6e5-4419-8dc4-597648db0e01-inventory" (OuterVolumeSpecName: "inventory") pod "02289c77-b6e5-4419-8dc4-597648db0e01" (UID: "02289c77-b6e5-4419-8dc4-597648db0e01"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:53:22 crc kubenswrapper[4773]: I0120 18:53:22.335138 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/02289c77-b6e5-4419-8dc4-597648db0e01-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "02289c77-b6e5-4419-8dc4-597648db0e01" (UID: "02289c77-b6e5-4419-8dc4-597648db0e01"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:53:22 crc kubenswrapper[4773]: I0120 18:53:22.397492 4773 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/02289c77-b6e5-4419-8dc4-597648db0e01-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 20 18:53:22 crc kubenswrapper[4773]: I0120 18:53:22.397528 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pvb59\" (UniqueName: \"kubernetes.io/projected/02289c77-b6e5-4419-8dc4-597648db0e01-kube-api-access-pvb59\") on node \"crc\" DevicePath \"\"" Jan 20 18:53:22 crc kubenswrapper[4773]: I0120 18:53:22.397538 4773 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/02289c77-b6e5-4419-8dc4-597648db0e01-inventory\") on node \"crc\" DevicePath \"\"" Jan 20 18:53:22 crc kubenswrapper[4773]: I0120 18:53:22.397547 4773 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/02289c77-b6e5-4419-8dc4-597648db0e01-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 18:53:22 crc kubenswrapper[4773]: I0120 18:53:22.821249 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fqth8" event={"ID":"02289c77-b6e5-4419-8dc4-597648db0e01","Type":"ContainerDied","Data":"eb4dfa5a9aefc0d318d78899613e93d3a3d04049c2d7b326336614864fa414ec"} Jan 20 18:53:22 crc kubenswrapper[4773]: I0120 18:53:22.821294 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eb4dfa5a9aefc0d318d78899613e93d3a3d04049c2d7b326336614864fa414ec" Jan 20 18:53:22 crc kubenswrapper[4773]: I0120 18:53:22.821294 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fqth8" Jan 20 18:53:22 crc kubenswrapper[4773]: I0120 18:53:22.888699 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-t89z8"] Jan 20 18:53:22 crc kubenswrapper[4773]: E0120 18:53:22.889070 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="02289c77-b6e5-4419-8dc4-597648db0e01" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Jan 20 18:53:22 crc kubenswrapper[4773]: I0120 18:53:22.889084 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="02289c77-b6e5-4419-8dc4-597648db0e01" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Jan 20 18:53:22 crc kubenswrapper[4773]: E0120 18:53:22.889101 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dab58995-9c7e-426a-af86-8c1493d3c8d3" containerName="dnsmasq-dns" Jan 20 18:53:22 crc kubenswrapper[4773]: I0120 18:53:22.889107 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="dab58995-9c7e-426a-af86-8c1493d3c8d3" containerName="dnsmasq-dns" Jan 20 18:53:22 crc kubenswrapper[4773]: E0120 18:53:22.889128 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dab58995-9c7e-426a-af86-8c1493d3c8d3" containerName="init" Jan 20 18:53:22 crc kubenswrapper[4773]: I0120 18:53:22.889134 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="dab58995-9c7e-426a-af86-8c1493d3c8d3" containerName="init" Jan 20 18:53:22 crc kubenswrapper[4773]: I0120 18:53:22.889282 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="02289c77-b6e5-4419-8dc4-597648db0e01" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Jan 20 18:53:22 crc kubenswrapper[4773]: I0120 18:53:22.889296 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="dab58995-9c7e-426a-af86-8c1493d3c8d3" containerName="dnsmasq-dns" Jan 20 18:53:22 crc kubenswrapper[4773]: I0120 18:53:22.889861 
4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-t89z8" Jan 20 18:53:22 crc kubenswrapper[4773]: I0120 18:53:22.892295 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 20 18:53:22 crc kubenswrapper[4773]: I0120 18:53:22.892306 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 20 18:53:22 crc kubenswrapper[4773]: I0120 18:53:22.892472 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 20 18:53:22 crc kubenswrapper[4773]: I0120 18:53:22.892511 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rzxsv" Jan 20 18:53:22 crc kubenswrapper[4773]: I0120 18:53:22.918833 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-t89z8"] Jan 20 18:53:23 crc kubenswrapper[4773]: I0120 18:53:23.010592 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6fa7427c-5ab8-4d6d-b81e-999ba155ae7f-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-t89z8\" (UID: \"6fa7427c-5ab8-4d6d-b81e-999ba155ae7f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-t89z8" Jan 20 18:53:23 crc kubenswrapper[4773]: I0120 18:53:23.010682 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6fa7427c-5ab8-4d6d-b81e-999ba155ae7f-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-t89z8\" (UID: \"6fa7427c-5ab8-4d6d-b81e-999ba155ae7f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-t89z8" Jan 20 18:53:23 crc 
kubenswrapper[4773]: I0120 18:53:23.010787 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fa7427c-5ab8-4d6d-b81e-999ba155ae7f-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-t89z8\" (UID: \"6fa7427c-5ab8-4d6d-b81e-999ba155ae7f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-t89z8" Jan 20 18:53:23 crc kubenswrapper[4773]: I0120 18:53:23.010941 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nrx7x\" (UniqueName: \"kubernetes.io/projected/6fa7427c-5ab8-4d6d-b81e-999ba155ae7f-kube-api-access-nrx7x\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-t89z8\" (UID: \"6fa7427c-5ab8-4d6d-b81e-999ba155ae7f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-t89z8" Jan 20 18:53:23 crc kubenswrapper[4773]: I0120 18:53:23.112461 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6fa7427c-5ab8-4d6d-b81e-999ba155ae7f-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-t89z8\" (UID: \"6fa7427c-5ab8-4d6d-b81e-999ba155ae7f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-t89z8" Jan 20 18:53:23 crc kubenswrapper[4773]: I0120 18:53:23.112535 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6fa7427c-5ab8-4d6d-b81e-999ba155ae7f-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-t89z8\" (UID: \"6fa7427c-5ab8-4d6d-b81e-999ba155ae7f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-t89z8" Jan 20 18:53:23 crc kubenswrapper[4773]: I0120 18:53:23.112566 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/6fa7427c-5ab8-4d6d-b81e-999ba155ae7f-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-t89z8\" (UID: \"6fa7427c-5ab8-4d6d-b81e-999ba155ae7f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-t89z8" Jan 20 18:53:23 crc kubenswrapper[4773]: I0120 18:53:23.113284 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nrx7x\" (UniqueName: \"kubernetes.io/projected/6fa7427c-5ab8-4d6d-b81e-999ba155ae7f-kube-api-access-nrx7x\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-t89z8\" (UID: \"6fa7427c-5ab8-4d6d-b81e-999ba155ae7f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-t89z8" Jan 20 18:53:23 crc kubenswrapper[4773]: I0120 18:53:23.116529 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6fa7427c-5ab8-4d6d-b81e-999ba155ae7f-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-t89z8\" (UID: \"6fa7427c-5ab8-4d6d-b81e-999ba155ae7f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-t89z8" Jan 20 18:53:23 crc kubenswrapper[4773]: I0120 18:53:23.116918 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fa7427c-5ab8-4d6d-b81e-999ba155ae7f-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-t89z8\" (UID: \"6fa7427c-5ab8-4d6d-b81e-999ba155ae7f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-t89z8" Jan 20 18:53:23 crc kubenswrapper[4773]: I0120 18:53:23.118040 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6fa7427c-5ab8-4d6d-b81e-999ba155ae7f-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-t89z8\" (UID: \"6fa7427c-5ab8-4d6d-b81e-999ba155ae7f\") " 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-t89z8" Jan 20 18:53:23 crc kubenswrapper[4773]: I0120 18:53:23.141303 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nrx7x\" (UniqueName: \"kubernetes.io/projected/6fa7427c-5ab8-4d6d-b81e-999ba155ae7f-kube-api-access-nrx7x\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-t89z8\" (UID: \"6fa7427c-5ab8-4d6d-b81e-999ba155ae7f\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-t89z8" Jan 20 18:53:23 crc kubenswrapper[4773]: I0120 18:53:23.218400 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-t89z8" Jan 20 18:53:23 crc kubenswrapper[4773]: I0120 18:53:23.728244 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-t89z8"] Jan 20 18:53:23 crc kubenswrapper[4773]: I0120 18:53:23.833175 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-t89z8" event={"ID":"6fa7427c-5ab8-4d6d-b81e-999ba155ae7f","Type":"ContainerStarted","Data":"b53c5213d4b438728dd937da04ea956d9066ef8724ab70bbeae3ff997481a9d6"} Jan 20 18:53:23 crc kubenswrapper[4773]: I0120 18:53:23.835412 4773 generic.go:334] "Generic (PLEG): container finished" podID="35926f65-848d-4db5-b50a-deef510ce4be" containerID="87f37741dad884ad3f582962088de19df0c82ee8e7e843bdb0ffb8ddabb1883f" exitCode=0 Jan 20 18:53:23 crc kubenswrapper[4773]: I0120 18:53:23.835494 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"35926f65-848d-4db5-b50a-deef510ce4be","Type":"ContainerDied","Data":"87f37741dad884ad3f582962088de19df0c82ee8e7e843bdb0ffb8ddabb1883f"} Jan 20 18:53:24 crc kubenswrapper[4773]: I0120 18:53:24.845855 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-t89z8" 
event={"ID":"6fa7427c-5ab8-4d6d-b81e-999ba155ae7f","Type":"ContainerStarted","Data":"ed1b733b25627aeb8a223630419832be2d7010197f1a7fbda2c35be60b1cd9f1"} Jan 20 18:53:24 crc kubenswrapper[4773]: I0120 18:53:24.848782 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"35926f65-848d-4db5-b50a-deef510ce4be","Type":"ContainerStarted","Data":"5056057c1e78e0ee19930122cdad3818102dc9ef9fb981a2ffd2a8d11e61ab6e"} Jan 20 18:53:24 crc kubenswrapper[4773]: I0120 18:53:24.849098 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:53:24 crc kubenswrapper[4773]: I0120 18:53:24.868751 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-t89z8" podStartSLOduration=2.4265908339999998 podStartE2EDuration="2.868738632s" podCreationTimestamp="2026-01-20 18:53:22 +0000 UTC" firstStartedPulling="2026-01-20 18:53:23.738059336 +0000 UTC m=+1396.659872360" lastFinishedPulling="2026-01-20 18:53:24.180207134 +0000 UTC m=+1397.102020158" observedRunningTime="2026-01-20 18:53:24.860741072 +0000 UTC m=+1397.782554096" watchObservedRunningTime="2026-01-20 18:53:24.868738632 +0000 UTC m=+1397.790551646" Jan 20 18:53:24 crc kubenswrapper[4773]: I0120 18:53:24.889615 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=36.889599809 podStartE2EDuration="36.889599809s" podCreationTimestamp="2026-01-20 18:52:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 18:53:24.88671194 +0000 UTC m=+1397.808524984" watchObservedRunningTime="2026-01-20 18:53:24.889599809 +0000 UTC m=+1397.811412833" Jan 20 18:53:34 crc kubenswrapper[4773]: I0120 18:53:34.336083 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/rabbitmq-server-0" Jan 20 18:53:38 crc kubenswrapper[4773]: I0120 18:53:38.911233 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Jan 20 18:53:41 crc kubenswrapper[4773]: I0120 18:53:41.762461 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-gntwg"] Jan 20 18:53:41 crc kubenswrapper[4773]: I0120 18:53:41.765449 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gntwg" Jan 20 18:53:41 crc kubenswrapper[4773]: I0120 18:53:41.778666 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gntwg"] Jan 20 18:53:41 crc kubenswrapper[4773]: I0120 18:53:41.877275 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6af7e52c-fffc-47ba-88de-3340d26e02d5-utilities\") pod \"redhat-operators-gntwg\" (UID: \"6af7e52c-fffc-47ba-88de-3340d26e02d5\") " pod="openshift-marketplace/redhat-operators-gntwg" Jan 20 18:53:41 crc kubenswrapper[4773]: I0120 18:53:41.877343 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6af7e52c-fffc-47ba-88de-3340d26e02d5-catalog-content\") pod \"redhat-operators-gntwg\" (UID: \"6af7e52c-fffc-47ba-88de-3340d26e02d5\") " pod="openshift-marketplace/redhat-operators-gntwg" Jan 20 18:53:41 crc kubenswrapper[4773]: I0120 18:53:41.877382 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlrt8\" (UniqueName: \"kubernetes.io/projected/6af7e52c-fffc-47ba-88de-3340d26e02d5-kube-api-access-rlrt8\") pod \"redhat-operators-gntwg\" (UID: \"6af7e52c-fffc-47ba-88de-3340d26e02d5\") " pod="openshift-marketplace/redhat-operators-gntwg" Jan 20 18:53:41 crc 
kubenswrapper[4773]: I0120 18:53:41.979312 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6af7e52c-fffc-47ba-88de-3340d26e02d5-utilities\") pod \"redhat-operators-gntwg\" (UID: \"6af7e52c-fffc-47ba-88de-3340d26e02d5\") " pod="openshift-marketplace/redhat-operators-gntwg" Jan 20 18:53:41 crc kubenswrapper[4773]: I0120 18:53:41.979381 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6af7e52c-fffc-47ba-88de-3340d26e02d5-catalog-content\") pod \"redhat-operators-gntwg\" (UID: \"6af7e52c-fffc-47ba-88de-3340d26e02d5\") " pod="openshift-marketplace/redhat-operators-gntwg" Jan 20 18:53:41 crc kubenswrapper[4773]: I0120 18:53:41.979429 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rlrt8\" (UniqueName: \"kubernetes.io/projected/6af7e52c-fffc-47ba-88de-3340d26e02d5-kube-api-access-rlrt8\") pod \"redhat-operators-gntwg\" (UID: \"6af7e52c-fffc-47ba-88de-3340d26e02d5\") " pod="openshift-marketplace/redhat-operators-gntwg" Jan 20 18:53:41 crc kubenswrapper[4773]: I0120 18:53:41.980497 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6af7e52c-fffc-47ba-88de-3340d26e02d5-utilities\") pod \"redhat-operators-gntwg\" (UID: \"6af7e52c-fffc-47ba-88de-3340d26e02d5\") " pod="openshift-marketplace/redhat-operators-gntwg" Jan 20 18:53:41 crc kubenswrapper[4773]: I0120 18:53:41.980718 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6af7e52c-fffc-47ba-88de-3340d26e02d5-catalog-content\") pod \"redhat-operators-gntwg\" (UID: \"6af7e52c-fffc-47ba-88de-3340d26e02d5\") " pod="openshift-marketplace/redhat-operators-gntwg" Jan 20 18:53:41 crc kubenswrapper[4773]: I0120 18:53:41.999111 4773 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlrt8\" (UniqueName: \"kubernetes.io/projected/6af7e52c-fffc-47ba-88de-3340d26e02d5-kube-api-access-rlrt8\") pod \"redhat-operators-gntwg\" (UID: \"6af7e52c-fffc-47ba-88de-3340d26e02d5\") " pod="openshift-marketplace/redhat-operators-gntwg" Jan 20 18:53:42 crc kubenswrapper[4773]: I0120 18:53:42.087476 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gntwg" Jan 20 18:53:42 crc kubenswrapper[4773]: I0120 18:53:42.560625 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-gntwg"] Jan 20 18:53:43 crc kubenswrapper[4773]: I0120 18:53:43.053073 4773 generic.go:334] "Generic (PLEG): container finished" podID="6af7e52c-fffc-47ba-88de-3340d26e02d5" containerID="669f605e449c86e68ef6da15acaef216606728ad06e5b08f31210322560a5191" exitCode=0 Jan 20 18:53:43 crc kubenswrapper[4773]: I0120 18:53:43.053128 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gntwg" event={"ID":"6af7e52c-fffc-47ba-88de-3340d26e02d5","Type":"ContainerDied","Data":"669f605e449c86e68ef6da15acaef216606728ad06e5b08f31210322560a5191"} Jan 20 18:53:43 crc kubenswrapper[4773]: I0120 18:53:43.053351 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gntwg" event={"ID":"6af7e52c-fffc-47ba-88de-3340d26e02d5","Type":"ContainerStarted","Data":"a28dca4f7f445301eea149b36d0aab07434e48fff601378c8c14b9f9e5ae5a54"} Jan 20 18:53:45 crc kubenswrapper[4773]: I0120 18:53:45.070480 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gntwg" event={"ID":"6af7e52c-fffc-47ba-88de-3340d26e02d5","Type":"ContainerStarted","Data":"76ed65a63e898ee4e204311b4b65b2b0d63ba5a71d524c3c22e45cfb49527ec0"} Jan 20 18:53:45 crc kubenswrapper[4773]: I0120 18:53:45.952690 4773 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-krc65"] Jan 20 18:53:45 crc kubenswrapper[4773]: I0120 18:53:45.955264 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-krc65" Jan 20 18:53:45 crc kubenswrapper[4773]: I0120 18:53:45.962917 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-krc65"] Jan 20 18:53:46 crc kubenswrapper[4773]: I0120 18:53:46.052378 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68194968-898d-49f9-a430-4732bb8122d5-utilities\") pod \"redhat-marketplace-krc65\" (UID: \"68194968-898d-49f9-a430-4732bb8122d5\") " pod="openshift-marketplace/redhat-marketplace-krc65" Jan 20 18:53:46 crc kubenswrapper[4773]: I0120 18:53:46.052494 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhmsf\" (UniqueName: \"kubernetes.io/projected/68194968-898d-49f9-a430-4732bb8122d5-kube-api-access-mhmsf\") pod \"redhat-marketplace-krc65\" (UID: \"68194968-898d-49f9-a430-4732bb8122d5\") " pod="openshift-marketplace/redhat-marketplace-krc65" Jan 20 18:53:46 crc kubenswrapper[4773]: I0120 18:53:46.052588 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68194968-898d-49f9-a430-4732bb8122d5-catalog-content\") pod \"redhat-marketplace-krc65\" (UID: \"68194968-898d-49f9-a430-4732bb8122d5\") " pod="openshift-marketplace/redhat-marketplace-krc65" Jan 20 18:53:46 crc kubenswrapper[4773]: I0120 18:53:46.153998 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhmsf\" (UniqueName: \"kubernetes.io/projected/68194968-898d-49f9-a430-4732bb8122d5-kube-api-access-mhmsf\") pod \"redhat-marketplace-krc65\" (UID: 
\"68194968-898d-49f9-a430-4732bb8122d5\") " pod="openshift-marketplace/redhat-marketplace-krc65" Jan 20 18:53:46 crc kubenswrapper[4773]: I0120 18:53:46.154152 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68194968-898d-49f9-a430-4732bb8122d5-catalog-content\") pod \"redhat-marketplace-krc65\" (UID: \"68194968-898d-49f9-a430-4732bb8122d5\") " pod="openshift-marketplace/redhat-marketplace-krc65" Jan 20 18:53:46 crc kubenswrapper[4773]: I0120 18:53:46.154240 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68194968-898d-49f9-a430-4732bb8122d5-utilities\") pod \"redhat-marketplace-krc65\" (UID: \"68194968-898d-49f9-a430-4732bb8122d5\") " pod="openshift-marketplace/redhat-marketplace-krc65" Jan 20 18:53:46 crc kubenswrapper[4773]: I0120 18:53:46.154772 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68194968-898d-49f9-a430-4732bb8122d5-catalog-content\") pod \"redhat-marketplace-krc65\" (UID: \"68194968-898d-49f9-a430-4732bb8122d5\") " pod="openshift-marketplace/redhat-marketplace-krc65" Jan 20 18:53:46 crc kubenswrapper[4773]: I0120 18:53:46.154799 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68194968-898d-49f9-a430-4732bb8122d5-utilities\") pod \"redhat-marketplace-krc65\" (UID: \"68194968-898d-49f9-a430-4732bb8122d5\") " pod="openshift-marketplace/redhat-marketplace-krc65" Jan 20 18:53:46 crc kubenswrapper[4773]: I0120 18:53:46.188558 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhmsf\" (UniqueName: \"kubernetes.io/projected/68194968-898d-49f9-a430-4732bb8122d5-kube-api-access-mhmsf\") pod \"redhat-marketplace-krc65\" (UID: \"68194968-898d-49f9-a430-4732bb8122d5\") " 
pod="openshift-marketplace/redhat-marketplace-krc65" Jan 20 18:53:46 crc kubenswrapper[4773]: I0120 18:53:46.289122 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-krc65" Jan 20 18:53:46 crc kubenswrapper[4773]: I0120 18:53:46.806775 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-krc65"] Jan 20 18:53:47 crc kubenswrapper[4773]: I0120 18:53:47.084754 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-krc65" event={"ID":"68194968-898d-49f9-a430-4732bb8122d5","Type":"ContainerStarted","Data":"030c4e3fa1061815a067dd4271885fdcb782aadd09e4a899d330c43028fd9dd4"} Jan 20 18:53:47 crc kubenswrapper[4773]: I0120 18:53:47.087101 4773 generic.go:334] "Generic (PLEG): container finished" podID="6af7e52c-fffc-47ba-88de-3340d26e02d5" containerID="76ed65a63e898ee4e204311b4b65b2b0d63ba5a71d524c3c22e45cfb49527ec0" exitCode=0 Jan 20 18:53:47 crc kubenswrapper[4773]: I0120 18:53:47.087131 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gntwg" event={"ID":"6af7e52c-fffc-47ba-88de-3340d26e02d5","Type":"ContainerDied","Data":"76ed65a63e898ee4e204311b4b65b2b0d63ba5a71d524c3c22e45cfb49527ec0"} Jan 20 18:53:49 crc kubenswrapper[4773]: I0120 18:53:49.103867 4773 generic.go:334] "Generic (PLEG): container finished" podID="68194968-898d-49f9-a430-4732bb8122d5" containerID="091899db383d0bf9b0c265dd761747d5b48744c1e205120e48d0600277ebc532" exitCode=0 Jan 20 18:53:49 crc kubenswrapper[4773]: I0120 18:53:49.103948 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-krc65" event={"ID":"68194968-898d-49f9-a430-4732bb8122d5","Type":"ContainerDied","Data":"091899db383d0bf9b0c265dd761747d5b48744c1e205120e48d0600277ebc532"} Jan 20 18:53:49 crc kubenswrapper[4773]: I0120 18:53:49.108404 4773 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/redhat-operators-gntwg" event={"ID":"6af7e52c-fffc-47ba-88de-3340d26e02d5","Type":"ContainerStarted","Data":"f030360740024828909b92b92df5b441a156e4058ef2561aa7a4a7c78f240f40"} Jan 20 18:53:49 crc kubenswrapper[4773]: I0120 18:53:49.157734 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-gntwg" podStartSLOduration=2.681690653 podStartE2EDuration="8.157706083s" podCreationTimestamp="2026-01-20 18:53:41 +0000 UTC" firstStartedPulling="2026-01-20 18:53:43.05516823 +0000 UTC m=+1415.976981254" lastFinishedPulling="2026-01-20 18:53:48.53118366 +0000 UTC m=+1421.452996684" observedRunningTime="2026-01-20 18:53:49.147701595 +0000 UTC m=+1422.069514639" watchObservedRunningTime="2026-01-20 18:53:49.157706083 +0000 UTC m=+1422.079519107" Jan 20 18:53:51 crc kubenswrapper[4773]: I0120 18:53:51.129626 4773 generic.go:334] "Generic (PLEG): container finished" podID="68194968-898d-49f9-a430-4732bb8122d5" containerID="34d4115bbe35563402636b1d84e3aa746ef55eb91ed0790cfb5baa9e579a80c1" exitCode=0 Jan 20 18:53:51 crc kubenswrapper[4773]: I0120 18:53:51.129741 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-krc65" event={"ID":"68194968-898d-49f9-a430-4732bb8122d5","Type":"ContainerDied","Data":"34d4115bbe35563402636b1d84e3aa746ef55eb91ed0790cfb5baa9e579a80c1"} Jan 20 18:53:52 crc kubenswrapper[4773]: I0120 18:53:52.088097 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-gntwg" Jan 20 18:53:52 crc kubenswrapper[4773]: I0120 18:53:52.089428 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-gntwg" Jan 20 18:53:52 crc kubenswrapper[4773]: I0120 18:53:52.159994 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-krc65" 
event={"ID":"68194968-898d-49f9-a430-4732bb8122d5","Type":"ContainerStarted","Data":"bee302114dc2b82244691ace06f5cac76d29115d97b4e97f2b3d40fb7e3006c2"} Jan 20 18:53:52 crc kubenswrapper[4773]: I0120 18:53:52.193960 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-krc65" podStartSLOduration=4.667265675 podStartE2EDuration="7.193927096s" podCreationTimestamp="2026-01-20 18:53:45 +0000 UTC" firstStartedPulling="2026-01-20 18:53:49.106232689 +0000 UTC m=+1422.028045713" lastFinishedPulling="2026-01-20 18:53:51.63289411 +0000 UTC m=+1424.554707134" observedRunningTime="2026-01-20 18:53:52.184417649 +0000 UTC m=+1425.106230673" watchObservedRunningTime="2026-01-20 18:53:52.193927096 +0000 UTC m=+1425.115740120" Jan 20 18:53:52 crc kubenswrapper[4773]: I0120 18:53:52.361914 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2hmn8"] Jan 20 18:53:52 crc kubenswrapper[4773]: I0120 18:53:52.364662 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2hmn8" Jan 20 18:53:52 crc kubenswrapper[4773]: I0120 18:53:52.401857 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2hmn8"] Jan 20 18:53:52 crc kubenswrapper[4773]: I0120 18:53:52.480566 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5rxc\" (UniqueName: \"kubernetes.io/projected/8824f1f3-1188-4559-b46a-28cbcdb0cf7b-kube-api-access-b5rxc\") pod \"certified-operators-2hmn8\" (UID: \"8824f1f3-1188-4559-b46a-28cbcdb0cf7b\") " pod="openshift-marketplace/certified-operators-2hmn8" Jan 20 18:53:52 crc kubenswrapper[4773]: I0120 18:53:52.481136 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8824f1f3-1188-4559-b46a-28cbcdb0cf7b-catalog-content\") pod \"certified-operators-2hmn8\" (UID: \"8824f1f3-1188-4559-b46a-28cbcdb0cf7b\") " pod="openshift-marketplace/certified-operators-2hmn8" Jan 20 18:53:52 crc kubenswrapper[4773]: I0120 18:53:52.481339 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8824f1f3-1188-4559-b46a-28cbcdb0cf7b-utilities\") pod \"certified-operators-2hmn8\" (UID: \"8824f1f3-1188-4559-b46a-28cbcdb0cf7b\") " pod="openshift-marketplace/certified-operators-2hmn8" Jan 20 18:53:52 crc kubenswrapper[4773]: I0120 18:53:52.582808 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8824f1f3-1188-4559-b46a-28cbcdb0cf7b-catalog-content\") pod \"certified-operators-2hmn8\" (UID: \"8824f1f3-1188-4559-b46a-28cbcdb0cf7b\") " pod="openshift-marketplace/certified-operators-2hmn8" Jan 20 18:53:52 crc kubenswrapper[4773]: I0120 18:53:52.583330 4773 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8824f1f3-1188-4559-b46a-28cbcdb0cf7b-utilities\") pod \"certified-operators-2hmn8\" (UID: \"8824f1f3-1188-4559-b46a-28cbcdb0cf7b\") " pod="openshift-marketplace/certified-operators-2hmn8" Jan 20 18:53:52 crc kubenswrapper[4773]: I0120 18:53:52.583490 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8824f1f3-1188-4559-b46a-28cbcdb0cf7b-catalog-content\") pod \"certified-operators-2hmn8\" (UID: \"8824f1f3-1188-4559-b46a-28cbcdb0cf7b\") " pod="openshift-marketplace/certified-operators-2hmn8" Jan 20 18:53:52 crc kubenswrapper[4773]: I0120 18:53:52.583691 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8824f1f3-1188-4559-b46a-28cbcdb0cf7b-utilities\") pod \"certified-operators-2hmn8\" (UID: \"8824f1f3-1188-4559-b46a-28cbcdb0cf7b\") " pod="openshift-marketplace/certified-operators-2hmn8" Jan 20 18:53:52 crc kubenswrapper[4773]: I0120 18:53:52.584027 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5rxc\" (UniqueName: \"kubernetes.io/projected/8824f1f3-1188-4559-b46a-28cbcdb0cf7b-kube-api-access-b5rxc\") pod \"certified-operators-2hmn8\" (UID: \"8824f1f3-1188-4559-b46a-28cbcdb0cf7b\") " pod="openshift-marketplace/certified-operators-2hmn8" Jan 20 18:53:52 crc kubenswrapper[4773]: I0120 18:53:52.607634 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5rxc\" (UniqueName: \"kubernetes.io/projected/8824f1f3-1188-4559-b46a-28cbcdb0cf7b-kube-api-access-b5rxc\") pod \"certified-operators-2hmn8\" (UID: \"8824f1f3-1188-4559-b46a-28cbcdb0cf7b\") " pod="openshift-marketplace/certified-operators-2hmn8" Jan 20 18:53:52 crc kubenswrapper[4773]: I0120 18:53:52.703294 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2hmn8" Jan 20 18:53:53 crc kubenswrapper[4773]: I0120 18:53:53.143233 4773 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-gntwg" podUID="6af7e52c-fffc-47ba-88de-3340d26e02d5" containerName="registry-server" probeResult="failure" output=< Jan 20 18:53:53 crc kubenswrapper[4773]: timeout: failed to connect service ":50051" within 1s Jan 20 18:53:53 crc kubenswrapper[4773]: > Jan 20 18:53:53 crc kubenswrapper[4773]: I0120 18:53:53.238829 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2hmn8"] Jan 20 18:53:53 crc kubenswrapper[4773]: W0120 18:53:53.249221 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8824f1f3_1188_4559_b46a_28cbcdb0cf7b.slice/crio-991cfc1ebcef6d273f93281decf3466d3ceef73363e8e1692e19d50222b0b725 WatchSource:0}: Error finding container 991cfc1ebcef6d273f93281decf3466d3ceef73363e8e1692e19d50222b0b725: Status 404 returned error can't find the container with id 991cfc1ebcef6d273f93281decf3466d3ceef73363e8e1692e19d50222b0b725 Jan 20 18:53:54 crc kubenswrapper[4773]: I0120 18:53:54.180882 4773 generic.go:334] "Generic (PLEG): container finished" podID="8824f1f3-1188-4559-b46a-28cbcdb0cf7b" containerID="e70c163a2b9e3672c5220898798ed5dbcb82fc53fbe5d63b2c1c2b23202f5a6e" exitCode=0 Jan 20 18:53:54 crc kubenswrapper[4773]: I0120 18:53:54.180986 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2hmn8" event={"ID":"8824f1f3-1188-4559-b46a-28cbcdb0cf7b","Type":"ContainerDied","Data":"e70c163a2b9e3672c5220898798ed5dbcb82fc53fbe5d63b2c1c2b23202f5a6e"} Jan 20 18:53:54 crc kubenswrapper[4773]: I0120 18:53:54.181603 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2hmn8" 
event={"ID":"8824f1f3-1188-4559-b46a-28cbcdb0cf7b","Type":"ContainerStarted","Data":"991cfc1ebcef6d273f93281decf3466d3ceef73363e8e1692e19d50222b0b725"} Jan 20 18:53:55 crc kubenswrapper[4773]: I0120 18:53:55.192988 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2hmn8" event={"ID":"8824f1f3-1188-4559-b46a-28cbcdb0cf7b","Type":"ContainerStarted","Data":"9960980eb3d96f8e854d37200a9beeb39d205e9268862f8e2ce297bdae78afb8"} Jan 20 18:53:56 crc kubenswrapper[4773]: I0120 18:53:56.290182 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-krc65" Jan 20 18:53:56 crc kubenswrapper[4773]: I0120 18:53:56.290236 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-krc65" Jan 20 18:53:56 crc kubenswrapper[4773]: I0120 18:53:56.333906 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-krc65" Jan 20 18:53:56 crc kubenswrapper[4773]: E0120 18:53:56.416110 4773 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8824f1f3_1188_4559_b46a_28cbcdb0cf7b.slice/crio-conmon-9960980eb3d96f8e854d37200a9beeb39d205e9268862f8e2ce297bdae78afb8.scope\": RecentStats: unable to find data in memory cache]" Jan 20 18:53:57 crc kubenswrapper[4773]: I0120 18:53:57.210981 4773 generic.go:334] "Generic (PLEG): container finished" podID="8824f1f3-1188-4559-b46a-28cbcdb0cf7b" containerID="9960980eb3d96f8e854d37200a9beeb39d205e9268862f8e2ce297bdae78afb8" exitCode=0 Jan 20 18:53:57 crc kubenswrapper[4773]: I0120 18:53:57.211032 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2hmn8" 
event={"ID":"8824f1f3-1188-4559-b46a-28cbcdb0cf7b","Type":"ContainerDied","Data":"9960980eb3d96f8e854d37200a9beeb39d205e9268862f8e2ce297bdae78afb8"} Jan 20 18:53:57 crc kubenswrapper[4773]: I0120 18:53:57.258530 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-krc65" Jan 20 18:53:58 crc kubenswrapper[4773]: I0120 18:53:58.738140 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-krc65"] Jan 20 18:53:59 crc kubenswrapper[4773]: I0120 18:53:59.230277 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2hmn8" event={"ID":"8824f1f3-1188-4559-b46a-28cbcdb0cf7b","Type":"ContainerStarted","Data":"189e0db5288e5e218671f4f89ee0002854d2d7dcb8712df6131eadc602a31121"} Jan 20 18:53:59 crc kubenswrapper[4773]: I0120 18:53:59.230466 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-krc65" podUID="68194968-898d-49f9-a430-4732bb8122d5" containerName="registry-server" containerID="cri-o://bee302114dc2b82244691ace06f5cac76d29115d97b4e97f2b3d40fb7e3006c2" gracePeriod=2 Jan 20 18:53:59 crc kubenswrapper[4773]: I0120 18:53:59.260409 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2hmn8" podStartSLOduration=3.116416474 podStartE2EDuration="7.260384709s" podCreationTimestamp="2026-01-20 18:53:52 +0000 UTC" firstStartedPulling="2026-01-20 18:53:54.18425508 +0000 UTC m=+1427.106068154" lastFinishedPulling="2026-01-20 18:53:58.328223365 +0000 UTC m=+1431.250036389" observedRunningTime="2026-01-20 18:53:59.247769909 +0000 UTC m=+1432.169582953" watchObservedRunningTime="2026-01-20 18:53:59.260384709 +0000 UTC m=+1432.182197753" Jan 20 18:53:59 crc kubenswrapper[4773]: I0120 18:53:59.694264 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-krc65" Jan 20 18:53:59 crc kubenswrapper[4773]: I0120 18:53:59.861770 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68194968-898d-49f9-a430-4732bb8122d5-utilities\") pod \"68194968-898d-49f9-a430-4732bb8122d5\" (UID: \"68194968-898d-49f9-a430-4732bb8122d5\") " Jan 20 18:53:59 crc kubenswrapper[4773]: I0120 18:53:59.862091 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68194968-898d-49f9-a430-4732bb8122d5-catalog-content\") pod \"68194968-898d-49f9-a430-4732bb8122d5\" (UID: \"68194968-898d-49f9-a430-4732bb8122d5\") " Jan 20 18:53:59 crc kubenswrapper[4773]: I0120 18:53:59.862139 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mhmsf\" (UniqueName: \"kubernetes.io/projected/68194968-898d-49f9-a430-4732bb8122d5-kube-api-access-mhmsf\") pod \"68194968-898d-49f9-a430-4732bb8122d5\" (UID: \"68194968-898d-49f9-a430-4732bb8122d5\") " Jan 20 18:53:59 crc kubenswrapper[4773]: I0120 18:53:59.862982 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68194968-898d-49f9-a430-4732bb8122d5-utilities" (OuterVolumeSpecName: "utilities") pod "68194968-898d-49f9-a430-4732bb8122d5" (UID: "68194968-898d-49f9-a430-4732bb8122d5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:53:59 crc kubenswrapper[4773]: I0120 18:53:59.867589 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68194968-898d-49f9-a430-4732bb8122d5-kube-api-access-mhmsf" (OuterVolumeSpecName: "kube-api-access-mhmsf") pod "68194968-898d-49f9-a430-4732bb8122d5" (UID: "68194968-898d-49f9-a430-4732bb8122d5"). InnerVolumeSpecName "kube-api-access-mhmsf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:53:59 crc kubenswrapper[4773]: I0120 18:53:59.884757 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68194968-898d-49f9-a430-4732bb8122d5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "68194968-898d-49f9-a430-4732bb8122d5" (UID: "68194968-898d-49f9-a430-4732bb8122d5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:53:59 crc kubenswrapper[4773]: I0120 18:53:59.965409 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/68194968-898d-49f9-a430-4732bb8122d5-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 20 18:53:59 crc kubenswrapper[4773]: I0120 18:53:59.965448 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mhmsf\" (UniqueName: \"kubernetes.io/projected/68194968-898d-49f9-a430-4732bb8122d5-kube-api-access-mhmsf\") on node \"crc\" DevicePath \"\"" Jan 20 18:53:59 crc kubenswrapper[4773]: I0120 18:53:59.965461 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/68194968-898d-49f9-a430-4732bb8122d5-utilities\") on node \"crc\" DevicePath \"\"" Jan 20 18:54:00 crc kubenswrapper[4773]: I0120 18:54:00.241309 4773 generic.go:334] "Generic (PLEG): container finished" podID="68194968-898d-49f9-a430-4732bb8122d5" containerID="bee302114dc2b82244691ace06f5cac76d29115d97b4e97f2b3d40fb7e3006c2" exitCode=0 Jan 20 18:54:00 crc kubenswrapper[4773]: I0120 18:54:00.241373 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-krc65" event={"ID":"68194968-898d-49f9-a430-4732bb8122d5","Type":"ContainerDied","Data":"bee302114dc2b82244691ace06f5cac76d29115d97b4e97f2b3d40fb7e3006c2"} Jan 20 18:54:00 crc kubenswrapper[4773]: I0120 18:54:00.241402 4773 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-marketplace/redhat-marketplace-krc65" event={"ID":"68194968-898d-49f9-a430-4732bb8122d5","Type":"ContainerDied","Data":"030c4e3fa1061815a067dd4271885fdcb782aadd09e4a899d330c43028fd9dd4"} Jan 20 18:54:00 crc kubenswrapper[4773]: I0120 18:54:00.241412 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-krc65" Jan 20 18:54:00 crc kubenswrapper[4773]: I0120 18:54:00.241419 4773 scope.go:117] "RemoveContainer" containerID="bee302114dc2b82244691ace06f5cac76d29115d97b4e97f2b3d40fb7e3006c2" Jan 20 18:54:00 crc kubenswrapper[4773]: I0120 18:54:00.269099 4773 scope.go:117] "RemoveContainer" containerID="34d4115bbe35563402636b1d84e3aa746ef55eb91ed0790cfb5baa9e579a80c1" Jan 20 18:54:00 crc kubenswrapper[4773]: I0120 18:54:00.297549 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-krc65"] Jan 20 18:54:00 crc kubenswrapper[4773]: I0120 18:54:00.298098 4773 scope.go:117] "RemoveContainer" containerID="091899db383d0bf9b0c265dd761747d5b48744c1e205120e48d0600277ebc532" Jan 20 18:54:00 crc kubenswrapper[4773]: I0120 18:54:00.315492 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-krc65"] Jan 20 18:54:00 crc kubenswrapper[4773]: I0120 18:54:00.328668 4773 scope.go:117] "RemoveContainer" containerID="bee302114dc2b82244691ace06f5cac76d29115d97b4e97f2b3d40fb7e3006c2" Jan 20 18:54:00 crc kubenswrapper[4773]: E0120 18:54:00.329266 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bee302114dc2b82244691ace06f5cac76d29115d97b4e97f2b3d40fb7e3006c2\": container with ID starting with bee302114dc2b82244691ace06f5cac76d29115d97b4e97f2b3d40fb7e3006c2 not found: ID does not exist" containerID="bee302114dc2b82244691ace06f5cac76d29115d97b4e97f2b3d40fb7e3006c2" Jan 20 18:54:00 crc kubenswrapper[4773]: I0120 18:54:00.329303 4773 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bee302114dc2b82244691ace06f5cac76d29115d97b4e97f2b3d40fb7e3006c2"} err="failed to get container status \"bee302114dc2b82244691ace06f5cac76d29115d97b4e97f2b3d40fb7e3006c2\": rpc error: code = NotFound desc = could not find container \"bee302114dc2b82244691ace06f5cac76d29115d97b4e97f2b3d40fb7e3006c2\": container with ID starting with bee302114dc2b82244691ace06f5cac76d29115d97b4e97f2b3d40fb7e3006c2 not found: ID does not exist" Jan 20 18:54:00 crc kubenswrapper[4773]: I0120 18:54:00.329328 4773 scope.go:117] "RemoveContainer" containerID="34d4115bbe35563402636b1d84e3aa746ef55eb91ed0790cfb5baa9e579a80c1" Jan 20 18:54:00 crc kubenswrapper[4773]: E0120 18:54:00.329724 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34d4115bbe35563402636b1d84e3aa746ef55eb91ed0790cfb5baa9e579a80c1\": container with ID starting with 34d4115bbe35563402636b1d84e3aa746ef55eb91ed0790cfb5baa9e579a80c1 not found: ID does not exist" containerID="34d4115bbe35563402636b1d84e3aa746ef55eb91ed0790cfb5baa9e579a80c1" Jan 20 18:54:00 crc kubenswrapper[4773]: I0120 18:54:00.329815 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34d4115bbe35563402636b1d84e3aa746ef55eb91ed0790cfb5baa9e579a80c1"} err="failed to get container status \"34d4115bbe35563402636b1d84e3aa746ef55eb91ed0790cfb5baa9e579a80c1\": rpc error: code = NotFound desc = could not find container \"34d4115bbe35563402636b1d84e3aa746ef55eb91ed0790cfb5baa9e579a80c1\": container with ID starting with 34d4115bbe35563402636b1d84e3aa746ef55eb91ed0790cfb5baa9e579a80c1 not found: ID does not exist" Jan 20 18:54:00 crc kubenswrapper[4773]: I0120 18:54:00.329869 4773 scope.go:117] "RemoveContainer" containerID="091899db383d0bf9b0c265dd761747d5b48744c1e205120e48d0600277ebc532" Jan 20 18:54:00 crc kubenswrapper[4773]: E0120 
18:54:00.330392 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"091899db383d0bf9b0c265dd761747d5b48744c1e205120e48d0600277ebc532\": container with ID starting with 091899db383d0bf9b0c265dd761747d5b48744c1e205120e48d0600277ebc532 not found: ID does not exist" containerID="091899db383d0bf9b0c265dd761747d5b48744c1e205120e48d0600277ebc532" Jan 20 18:54:00 crc kubenswrapper[4773]: I0120 18:54:00.330436 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"091899db383d0bf9b0c265dd761747d5b48744c1e205120e48d0600277ebc532"} err="failed to get container status \"091899db383d0bf9b0c265dd761747d5b48744c1e205120e48d0600277ebc532\": rpc error: code = NotFound desc = could not find container \"091899db383d0bf9b0c265dd761747d5b48744c1e205120e48d0600277ebc532\": container with ID starting with 091899db383d0bf9b0c265dd761747d5b48744c1e205120e48d0600277ebc532 not found: ID does not exist" Jan 20 18:54:01 crc kubenswrapper[4773]: I0120 18:54:01.459712 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68194968-898d-49f9-a430-4732bb8122d5" path="/var/lib/kubelet/pods/68194968-898d-49f9-a430-4732bb8122d5/volumes" Jan 20 18:54:02 crc kubenswrapper[4773]: I0120 18:54:02.155442 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-gntwg" Jan 20 18:54:02 crc kubenswrapper[4773]: I0120 18:54:02.207887 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-gntwg" Jan 20 18:54:02 crc kubenswrapper[4773]: I0120 18:54:02.705115 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2hmn8" Jan 20 18:54:02 crc kubenswrapper[4773]: I0120 18:54:02.705179 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/certified-operators-2hmn8" Jan 20 18:54:02 crc kubenswrapper[4773]: I0120 18:54:02.769202 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2hmn8" Jan 20 18:54:03 crc kubenswrapper[4773]: I0120 18:54:03.328716 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2hmn8" Jan 20 18:54:04 crc kubenswrapper[4773]: I0120 18:54:04.141332 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gntwg"] Jan 20 18:54:04 crc kubenswrapper[4773]: I0120 18:54:04.141892 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-gntwg" podUID="6af7e52c-fffc-47ba-88de-3340d26e02d5" containerName="registry-server" containerID="cri-o://f030360740024828909b92b92df5b441a156e4058ef2561aa7a4a7c78f240f40" gracePeriod=2 Jan 20 18:54:04 crc kubenswrapper[4773]: I0120 18:54:04.287521 4773 generic.go:334] "Generic (PLEG): container finished" podID="6af7e52c-fffc-47ba-88de-3340d26e02d5" containerID="f030360740024828909b92b92df5b441a156e4058ef2561aa7a4a7c78f240f40" exitCode=0 Jan 20 18:54:04 crc kubenswrapper[4773]: I0120 18:54:04.287596 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gntwg" event={"ID":"6af7e52c-fffc-47ba-88de-3340d26e02d5","Type":"ContainerDied","Data":"f030360740024828909b92b92df5b441a156e4058ef2561aa7a4a7c78f240f40"} Jan 20 18:54:04 crc kubenswrapper[4773]: I0120 18:54:04.560882 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-gntwg" Jan 20 18:54:04 crc kubenswrapper[4773]: I0120 18:54:04.755070 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rlrt8\" (UniqueName: \"kubernetes.io/projected/6af7e52c-fffc-47ba-88de-3340d26e02d5-kube-api-access-rlrt8\") pod \"6af7e52c-fffc-47ba-88de-3340d26e02d5\" (UID: \"6af7e52c-fffc-47ba-88de-3340d26e02d5\") " Jan 20 18:54:04 crc kubenswrapper[4773]: I0120 18:54:04.755272 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6af7e52c-fffc-47ba-88de-3340d26e02d5-catalog-content\") pod \"6af7e52c-fffc-47ba-88de-3340d26e02d5\" (UID: \"6af7e52c-fffc-47ba-88de-3340d26e02d5\") " Jan 20 18:54:04 crc kubenswrapper[4773]: I0120 18:54:04.755320 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6af7e52c-fffc-47ba-88de-3340d26e02d5-utilities\") pod \"6af7e52c-fffc-47ba-88de-3340d26e02d5\" (UID: \"6af7e52c-fffc-47ba-88de-3340d26e02d5\") " Jan 20 18:54:04 crc kubenswrapper[4773]: I0120 18:54:04.756147 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6af7e52c-fffc-47ba-88de-3340d26e02d5-utilities" (OuterVolumeSpecName: "utilities") pod "6af7e52c-fffc-47ba-88de-3340d26e02d5" (UID: "6af7e52c-fffc-47ba-88de-3340d26e02d5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:54:04 crc kubenswrapper[4773]: I0120 18:54:04.761863 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6af7e52c-fffc-47ba-88de-3340d26e02d5-kube-api-access-rlrt8" (OuterVolumeSpecName: "kube-api-access-rlrt8") pod "6af7e52c-fffc-47ba-88de-3340d26e02d5" (UID: "6af7e52c-fffc-47ba-88de-3340d26e02d5"). InnerVolumeSpecName "kube-api-access-rlrt8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:54:04 crc kubenswrapper[4773]: I0120 18:54:04.857656 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6af7e52c-fffc-47ba-88de-3340d26e02d5-utilities\") on node \"crc\" DevicePath \"\"" Jan 20 18:54:04 crc kubenswrapper[4773]: I0120 18:54:04.857689 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rlrt8\" (UniqueName: \"kubernetes.io/projected/6af7e52c-fffc-47ba-88de-3340d26e02d5-kube-api-access-rlrt8\") on node \"crc\" DevicePath \"\"" Jan 20 18:54:04 crc kubenswrapper[4773]: I0120 18:54:04.869087 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6af7e52c-fffc-47ba-88de-3340d26e02d5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6af7e52c-fffc-47ba-88de-3340d26e02d5" (UID: "6af7e52c-fffc-47ba-88de-3340d26e02d5"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:54:04 crc kubenswrapper[4773]: I0120 18:54:04.959586 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6af7e52c-fffc-47ba-88de-3340d26e02d5-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 20 18:54:05 crc kubenswrapper[4773]: I0120 18:54:05.139465 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2hmn8"] Jan 20 18:54:05 crc kubenswrapper[4773]: I0120 18:54:05.297519 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-gntwg" event={"ID":"6af7e52c-fffc-47ba-88de-3340d26e02d5","Type":"ContainerDied","Data":"a28dca4f7f445301eea149b36d0aab07434e48fff601378c8c14b9f9e5ae5a54"} Jan 20 18:54:05 crc kubenswrapper[4773]: I0120 18:54:05.297602 4773 scope.go:117] "RemoveContainer" containerID="f030360740024828909b92b92df5b441a156e4058ef2561aa7a4a7c78f240f40" Jan 20 18:54:05 
crc kubenswrapper[4773]: I0120 18:54:05.297709 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-2hmn8" podUID="8824f1f3-1188-4559-b46a-28cbcdb0cf7b" containerName="registry-server" containerID="cri-o://189e0db5288e5e218671f4f89ee0002854d2d7dcb8712df6131eadc602a31121" gracePeriod=2 Jan 20 18:54:05 crc kubenswrapper[4773]: I0120 18:54:05.300911 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-gntwg" Jan 20 18:54:05 crc kubenswrapper[4773]: I0120 18:54:05.318609 4773 scope.go:117] "RemoveContainer" containerID="76ed65a63e898ee4e204311b4b65b2b0d63ba5a71d524c3c22e45cfb49527ec0" Jan 20 18:54:05 crc kubenswrapper[4773]: I0120 18:54:05.337023 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-gntwg"] Jan 20 18:54:05 crc kubenswrapper[4773]: I0120 18:54:05.346547 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-gntwg"] Jan 20 18:54:05 crc kubenswrapper[4773]: I0120 18:54:05.355721 4773 scope.go:117] "RemoveContainer" containerID="669f605e449c86e68ef6da15acaef216606728ad06e5b08f31210322560a5191" Jan 20 18:54:05 crc kubenswrapper[4773]: I0120 18:54:05.460920 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6af7e52c-fffc-47ba-88de-3340d26e02d5" path="/var/lib/kubelet/pods/6af7e52c-fffc-47ba-88de-3340d26e02d5/volumes" Jan 20 18:54:05 crc kubenswrapper[4773]: I0120 18:54:05.724192 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2hmn8" Jan 20 18:54:05 crc kubenswrapper[4773]: I0120 18:54:05.873140 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b5rxc\" (UniqueName: \"kubernetes.io/projected/8824f1f3-1188-4559-b46a-28cbcdb0cf7b-kube-api-access-b5rxc\") pod \"8824f1f3-1188-4559-b46a-28cbcdb0cf7b\" (UID: \"8824f1f3-1188-4559-b46a-28cbcdb0cf7b\") " Jan 20 18:54:05 crc kubenswrapper[4773]: I0120 18:54:05.873228 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8824f1f3-1188-4559-b46a-28cbcdb0cf7b-catalog-content\") pod \"8824f1f3-1188-4559-b46a-28cbcdb0cf7b\" (UID: \"8824f1f3-1188-4559-b46a-28cbcdb0cf7b\") " Jan 20 18:54:05 crc kubenswrapper[4773]: I0120 18:54:05.873338 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8824f1f3-1188-4559-b46a-28cbcdb0cf7b-utilities\") pod \"8824f1f3-1188-4559-b46a-28cbcdb0cf7b\" (UID: \"8824f1f3-1188-4559-b46a-28cbcdb0cf7b\") " Jan 20 18:54:05 crc kubenswrapper[4773]: I0120 18:54:05.874170 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8824f1f3-1188-4559-b46a-28cbcdb0cf7b-utilities" (OuterVolumeSpecName: "utilities") pod "8824f1f3-1188-4559-b46a-28cbcdb0cf7b" (UID: "8824f1f3-1188-4559-b46a-28cbcdb0cf7b"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:54:05 crc kubenswrapper[4773]: I0120 18:54:05.874542 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8824f1f3-1188-4559-b46a-28cbcdb0cf7b-utilities\") on node \"crc\" DevicePath \"\"" Jan 20 18:54:05 crc kubenswrapper[4773]: I0120 18:54:05.878401 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8824f1f3-1188-4559-b46a-28cbcdb0cf7b-kube-api-access-b5rxc" (OuterVolumeSpecName: "kube-api-access-b5rxc") pod "8824f1f3-1188-4559-b46a-28cbcdb0cf7b" (UID: "8824f1f3-1188-4559-b46a-28cbcdb0cf7b"). InnerVolumeSpecName "kube-api-access-b5rxc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:54:05 crc kubenswrapper[4773]: I0120 18:54:05.913325 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8824f1f3-1188-4559-b46a-28cbcdb0cf7b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8824f1f3-1188-4559-b46a-28cbcdb0cf7b" (UID: "8824f1f3-1188-4559-b46a-28cbcdb0cf7b"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:54:05 crc kubenswrapper[4773]: I0120 18:54:05.975698 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b5rxc\" (UniqueName: \"kubernetes.io/projected/8824f1f3-1188-4559-b46a-28cbcdb0cf7b-kube-api-access-b5rxc\") on node \"crc\" DevicePath \"\"" Jan 20 18:54:05 crc kubenswrapper[4773]: I0120 18:54:05.975726 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8824f1f3-1188-4559-b46a-28cbcdb0cf7b-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 20 18:54:06 crc kubenswrapper[4773]: I0120 18:54:06.313207 4773 generic.go:334] "Generic (PLEG): container finished" podID="8824f1f3-1188-4559-b46a-28cbcdb0cf7b" containerID="189e0db5288e5e218671f4f89ee0002854d2d7dcb8712df6131eadc602a31121" exitCode=0 Jan 20 18:54:06 crc kubenswrapper[4773]: I0120 18:54:06.313303 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2hmn8" event={"ID":"8824f1f3-1188-4559-b46a-28cbcdb0cf7b","Type":"ContainerDied","Data":"189e0db5288e5e218671f4f89ee0002854d2d7dcb8712df6131eadc602a31121"} Jan 20 18:54:06 crc kubenswrapper[4773]: I0120 18:54:06.313323 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2hmn8" Jan 20 18:54:06 crc kubenswrapper[4773]: I0120 18:54:06.313351 4773 scope.go:117] "RemoveContainer" containerID="189e0db5288e5e218671f4f89ee0002854d2d7dcb8712df6131eadc602a31121" Jan 20 18:54:06 crc kubenswrapper[4773]: I0120 18:54:06.313337 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2hmn8" event={"ID":"8824f1f3-1188-4559-b46a-28cbcdb0cf7b","Type":"ContainerDied","Data":"991cfc1ebcef6d273f93281decf3466d3ceef73363e8e1692e19d50222b0b725"} Jan 20 18:54:06 crc kubenswrapper[4773]: I0120 18:54:06.345562 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2hmn8"] Jan 20 18:54:06 crc kubenswrapper[4773]: I0120 18:54:06.346567 4773 scope.go:117] "RemoveContainer" containerID="9960980eb3d96f8e854d37200a9beeb39d205e9268862f8e2ce297bdae78afb8" Jan 20 18:54:06 crc kubenswrapper[4773]: I0120 18:54:06.353484 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-2hmn8"] Jan 20 18:54:06 crc kubenswrapper[4773]: I0120 18:54:06.395806 4773 scope.go:117] "RemoveContainer" containerID="e70c163a2b9e3672c5220898798ed5dbcb82fc53fbe5d63b2c1c2b23202f5a6e" Jan 20 18:54:06 crc kubenswrapper[4773]: I0120 18:54:06.430563 4773 scope.go:117] "RemoveContainer" containerID="189e0db5288e5e218671f4f89ee0002854d2d7dcb8712df6131eadc602a31121" Jan 20 18:54:06 crc kubenswrapper[4773]: E0120 18:54:06.431096 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"189e0db5288e5e218671f4f89ee0002854d2d7dcb8712df6131eadc602a31121\": container with ID starting with 189e0db5288e5e218671f4f89ee0002854d2d7dcb8712df6131eadc602a31121 not found: ID does not exist" containerID="189e0db5288e5e218671f4f89ee0002854d2d7dcb8712df6131eadc602a31121" Jan 20 18:54:06 crc kubenswrapper[4773]: I0120 18:54:06.431137 4773 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"189e0db5288e5e218671f4f89ee0002854d2d7dcb8712df6131eadc602a31121"} err="failed to get container status \"189e0db5288e5e218671f4f89ee0002854d2d7dcb8712df6131eadc602a31121\": rpc error: code = NotFound desc = could not find container \"189e0db5288e5e218671f4f89ee0002854d2d7dcb8712df6131eadc602a31121\": container with ID starting with 189e0db5288e5e218671f4f89ee0002854d2d7dcb8712df6131eadc602a31121 not found: ID does not exist" Jan 20 18:54:06 crc kubenswrapper[4773]: I0120 18:54:06.431181 4773 scope.go:117] "RemoveContainer" containerID="9960980eb3d96f8e854d37200a9beeb39d205e9268862f8e2ce297bdae78afb8" Jan 20 18:54:06 crc kubenswrapper[4773]: E0120 18:54:06.431595 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9960980eb3d96f8e854d37200a9beeb39d205e9268862f8e2ce297bdae78afb8\": container with ID starting with 9960980eb3d96f8e854d37200a9beeb39d205e9268862f8e2ce297bdae78afb8 not found: ID does not exist" containerID="9960980eb3d96f8e854d37200a9beeb39d205e9268862f8e2ce297bdae78afb8" Jan 20 18:54:06 crc kubenswrapper[4773]: I0120 18:54:06.431641 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9960980eb3d96f8e854d37200a9beeb39d205e9268862f8e2ce297bdae78afb8"} err="failed to get container status \"9960980eb3d96f8e854d37200a9beeb39d205e9268862f8e2ce297bdae78afb8\": rpc error: code = NotFound desc = could not find container \"9960980eb3d96f8e854d37200a9beeb39d205e9268862f8e2ce297bdae78afb8\": container with ID starting with 9960980eb3d96f8e854d37200a9beeb39d205e9268862f8e2ce297bdae78afb8 not found: ID does not exist" Jan 20 18:54:06 crc kubenswrapper[4773]: I0120 18:54:06.431658 4773 scope.go:117] "RemoveContainer" containerID="e70c163a2b9e3672c5220898798ed5dbcb82fc53fbe5d63b2c1c2b23202f5a6e" Jan 20 18:54:06 crc kubenswrapper[4773]: E0120 
18:54:06.431945 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e70c163a2b9e3672c5220898798ed5dbcb82fc53fbe5d63b2c1c2b23202f5a6e\": container with ID starting with e70c163a2b9e3672c5220898798ed5dbcb82fc53fbe5d63b2c1c2b23202f5a6e not found: ID does not exist" containerID="e70c163a2b9e3672c5220898798ed5dbcb82fc53fbe5d63b2c1c2b23202f5a6e" Jan 20 18:54:06 crc kubenswrapper[4773]: I0120 18:54:06.431970 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e70c163a2b9e3672c5220898798ed5dbcb82fc53fbe5d63b2c1c2b23202f5a6e"} err="failed to get container status \"e70c163a2b9e3672c5220898798ed5dbcb82fc53fbe5d63b2c1c2b23202f5a6e\": rpc error: code = NotFound desc = could not find container \"e70c163a2b9e3672c5220898798ed5dbcb82fc53fbe5d63b2c1c2b23202f5a6e\": container with ID starting with e70c163a2b9e3672c5220898798ed5dbcb82fc53fbe5d63b2c1c2b23202f5a6e not found: ID does not exist" Jan 20 18:54:07 crc kubenswrapper[4773]: I0120 18:54:07.462613 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8824f1f3-1188-4559-b46a-28cbcdb0cf7b" path="/var/lib/kubelet/pods/8824f1f3-1188-4559-b46a-28cbcdb0cf7b/volumes" Jan 20 18:54:17 crc kubenswrapper[4773]: I0120 18:54:17.752555 4773 scope.go:117] "RemoveContainer" containerID="fe986dbc9aa7abb1946cbbaf36610eba367f6b9655e2f6cf2645119cfbe827cd" Jan 20 18:54:58 crc kubenswrapper[4773]: I0120 18:54:58.170137 4773 patch_prober.go:28] interesting pod/machine-config-daemon-sq4x7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 18:54:58 crc kubenswrapper[4773]: I0120 18:54:58.170980 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" 
podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 18:55:17 crc kubenswrapper[4773]: I0120 18:55:17.859826 4773 scope.go:117] "RemoveContainer" containerID="5cc155e36c13c5b618c2477be4ab590ab510095287b6b608b69635e7105f701d" Jan 20 18:55:17 crc kubenswrapper[4773]: I0120 18:55:17.886195 4773 scope.go:117] "RemoveContainer" containerID="cec265096a69314b656c7eb565783999362cd58d7d21446746b6a8df723167e7" Jan 20 18:55:17 crc kubenswrapper[4773]: I0120 18:55:17.914471 4773 scope.go:117] "RemoveContainer" containerID="ad751c184e2d23886c2618122268c91712c2e78962210fabf377095ff3332826" Jan 20 18:55:17 crc kubenswrapper[4773]: I0120 18:55:17.962303 4773 scope.go:117] "RemoveContainer" containerID="fba20ba934791c753f7de5893c3aaef399510fc1a1206ee1163905e05a43e6b4" Jan 20 18:55:17 crc kubenswrapper[4773]: I0120 18:55:17.991269 4773 scope.go:117] "RemoveContainer" containerID="a3485916bab8c305597e5171f6d49c43821b0b823ee634ce2d2a67cfaa6f6ad9" Jan 20 18:55:28 crc kubenswrapper[4773]: I0120 18:55:28.172107 4773 patch_prober.go:28] interesting pod/machine-config-daemon-sq4x7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 18:55:28 crc kubenswrapper[4773]: I0120 18:55:28.172913 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 18:55:58 crc kubenswrapper[4773]: I0120 18:55:58.169919 4773 patch_prober.go:28] interesting pod/machine-config-daemon-sq4x7 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 18:55:58 crc kubenswrapper[4773]: I0120 18:55:58.170562 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 18:55:58 crc kubenswrapper[4773]: I0120 18:55:58.170610 4773 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" Jan 20 18:55:58 crc kubenswrapper[4773]: I0120 18:55:58.171432 4773 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ce9aed1d163b3887a1d1032f9c03b611a4ecaff0809c53905de6852740ee0630"} pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 20 18:55:58 crc kubenswrapper[4773]: I0120 18:55:58.171493 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" containerID="cri-o://ce9aed1d163b3887a1d1032f9c03b611a4ecaff0809c53905de6852740ee0630" gracePeriod=600 Jan 20 18:55:58 crc kubenswrapper[4773]: E0120 18:55:58.293592 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 18:55:59 crc kubenswrapper[4773]: I0120 18:55:59.256056 4773 generic.go:334] "Generic (PLEG): container finished" podID="1ddd934f-f012-4083-b5e6-b99711071621" containerID="ce9aed1d163b3887a1d1032f9c03b611a4ecaff0809c53905de6852740ee0630" exitCode=0 Jan 20 18:55:59 crc kubenswrapper[4773]: I0120 18:55:59.256177 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" event={"ID":"1ddd934f-f012-4083-b5e6-b99711071621","Type":"ContainerDied","Data":"ce9aed1d163b3887a1d1032f9c03b611a4ecaff0809c53905de6852740ee0630"} Jan 20 18:55:59 crc kubenswrapper[4773]: I0120 18:55:59.256497 4773 scope.go:117] "RemoveContainer" containerID="14437c4854d46fdc109569c8299e656b31c4aa4992133183b5fa2d3fd5cee7bb" Jan 20 18:55:59 crc kubenswrapper[4773]: I0120 18:55:59.257058 4773 scope.go:117] "RemoveContainer" containerID="ce9aed1d163b3887a1d1032f9c03b611a4ecaff0809c53905de6852740ee0630" Jan 20 18:55:59 crc kubenswrapper[4773]: E0120 18:55:59.257407 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 18:56:12 crc kubenswrapper[4773]: I0120 18:56:12.447707 4773 scope.go:117] "RemoveContainer" containerID="ce9aed1d163b3887a1d1032f9c03b611a4ecaff0809c53905de6852740ee0630" Jan 20 18:56:12 crc kubenswrapper[4773]: E0120 18:56:12.448481 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 18:56:24 crc kubenswrapper[4773]: I0120 18:56:24.447222 4773 scope.go:117] "RemoveContainer" containerID="ce9aed1d163b3887a1d1032f9c03b611a4ecaff0809c53905de6852740ee0630" Jan 20 18:56:24 crc kubenswrapper[4773]: E0120 18:56:24.449243 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 18:56:32 crc kubenswrapper[4773]: I0120 18:56:32.625599 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-bph6d"] Jan 20 18:56:32 crc kubenswrapper[4773]: E0120 18:56:32.627723 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8824f1f3-1188-4559-b46a-28cbcdb0cf7b" containerName="extract-utilities" Jan 20 18:56:32 crc kubenswrapper[4773]: I0120 18:56:32.628153 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="8824f1f3-1188-4559-b46a-28cbcdb0cf7b" containerName="extract-utilities" Jan 20 18:56:32 crc kubenswrapper[4773]: E0120 18:56:32.628239 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68194968-898d-49f9-a430-4732bb8122d5" containerName="registry-server" Jan 20 18:56:32 crc kubenswrapper[4773]: I0120 18:56:32.628296 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="68194968-898d-49f9-a430-4732bb8122d5" containerName="registry-server" Jan 20 18:56:32 crc 
kubenswrapper[4773]: E0120 18:56:32.628373 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68194968-898d-49f9-a430-4732bb8122d5" containerName="extract-utilities" Jan 20 18:56:32 crc kubenswrapper[4773]: I0120 18:56:32.628432 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="68194968-898d-49f9-a430-4732bb8122d5" containerName="extract-utilities" Jan 20 18:56:32 crc kubenswrapper[4773]: E0120 18:56:32.628493 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6af7e52c-fffc-47ba-88de-3340d26e02d5" containerName="extract-utilities" Jan 20 18:56:32 crc kubenswrapper[4773]: I0120 18:56:32.628549 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="6af7e52c-fffc-47ba-88de-3340d26e02d5" containerName="extract-utilities" Jan 20 18:56:32 crc kubenswrapper[4773]: E0120 18:56:32.628614 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6af7e52c-fffc-47ba-88de-3340d26e02d5" containerName="registry-server" Jan 20 18:56:32 crc kubenswrapper[4773]: I0120 18:56:32.628666 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="6af7e52c-fffc-47ba-88de-3340d26e02d5" containerName="registry-server" Jan 20 18:56:32 crc kubenswrapper[4773]: E0120 18:56:32.628726 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6af7e52c-fffc-47ba-88de-3340d26e02d5" containerName="extract-content" Jan 20 18:56:32 crc kubenswrapper[4773]: I0120 18:56:32.628858 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="6af7e52c-fffc-47ba-88de-3340d26e02d5" containerName="extract-content" Jan 20 18:56:32 crc kubenswrapper[4773]: E0120 18:56:32.628920 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8824f1f3-1188-4559-b46a-28cbcdb0cf7b" containerName="registry-server" Jan 20 18:56:32 crc kubenswrapper[4773]: I0120 18:56:32.629006 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="8824f1f3-1188-4559-b46a-28cbcdb0cf7b" containerName="registry-server" Jan 20 18:56:32 crc 
kubenswrapper[4773]: E0120 18:56:32.629070 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8824f1f3-1188-4559-b46a-28cbcdb0cf7b" containerName="extract-content" Jan 20 18:56:32 crc kubenswrapper[4773]: I0120 18:56:32.629141 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="8824f1f3-1188-4559-b46a-28cbcdb0cf7b" containerName="extract-content" Jan 20 18:56:32 crc kubenswrapper[4773]: E0120 18:56:32.629204 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68194968-898d-49f9-a430-4732bb8122d5" containerName="extract-content" Jan 20 18:56:32 crc kubenswrapper[4773]: I0120 18:56:32.629265 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="68194968-898d-49f9-a430-4732bb8122d5" containerName="extract-content" Jan 20 18:56:32 crc kubenswrapper[4773]: I0120 18:56:32.629523 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="6af7e52c-fffc-47ba-88de-3340d26e02d5" containerName="registry-server" Jan 20 18:56:32 crc kubenswrapper[4773]: I0120 18:56:32.629589 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="8824f1f3-1188-4559-b46a-28cbcdb0cf7b" containerName="registry-server" Jan 20 18:56:32 crc kubenswrapper[4773]: I0120 18:56:32.629647 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="68194968-898d-49f9-a430-4732bb8122d5" containerName="registry-server" Jan 20 18:56:32 crc kubenswrapper[4773]: I0120 18:56:32.631273 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bph6d" Jan 20 18:56:32 crc kubenswrapper[4773]: I0120 18:56:32.655478 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bph6d"] Jan 20 18:56:32 crc kubenswrapper[4773]: I0120 18:56:32.773842 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7936be6-fe56-42d1-a86d-c2d3dd3718df-catalog-content\") pod \"community-operators-bph6d\" (UID: \"c7936be6-fe56-42d1-a86d-c2d3dd3718df\") " pod="openshift-marketplace/community-operators-bph6d" Jan 20 18:56:32 crc kubenswrapper[4773]: I0120 18:56:32.774087 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8pd9\" (UniqueName: \"kubernetes.io/projected/c7936be6-fe56-42d1-a86d-c2d3dd3718df-kube-api-access-t8pd9\") pod \"community-operators-bph6d\" (UID: \"c7936be6-fe56-42d1-a86d-c2d3dd3718df\") " pod="openshift-marketplace/community-operators-bph6d" Jan 20 18:56:32 crc kubenswrapper[4773]: I0120 18:56:32.774202 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7936be6-fe56-42d1-a86d-c2d3dd3718df-utilities\") pod \"community-operators-bph6d\" (UID: \"c7936be6-fe56-42d1-a86d-c2d3dd3718df\") " pod="openshift-marketplace/community-operators-bph6d" Jan 20 18:56:32 crc kubenswrapper[4773]: I0120 18:56:32.875404 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8pd9\" (UniqueName: \"kubernetes.io/projected/c7936be6-fe56-42d1-a86d-c2d3dd3718df-kube-api-access-t8pd9\") pod \"community-operators-bph6d\" (UID: \"c7936be6-fe56-42d1-a86d-c2d3dd3718df\") " pod="openshift-marketplace/community-operators-bph6d" Jan 20 18:56:32 crc kubenswrapper[4773]: I0120 18:56:32.875475 4773 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7936be6-fe56-42d1-a86d-c2d3dd3718df-utilities\") pod \"community-operators-bph6d\" (UID: \"c7936be6-fe56-42d1-a86d-c2d3dd3718df\") " pod="openshift-marketplace/community-operators-bph6d" Jan 20 18:56:32 crc kubenswrapper[4773]: I0120 18:56:32.875541 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7936be6-fe56-42d1-a86d-c2d3dd3718df-catalog-content\") pod \"community-operators-bph6d\" (UID: \"c7936be6-fe56-42d1-a86d-c2d3dd3718df\") " pod="openshift-marketplace/community-operators-bph6d" Jan 20 18:56:32 crc kubenswrapper[4773]: I0120 18:56:32.876010 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7936be6-fe56-42d1-a86d-c2d3dd3718df-catalog-content\") pod \"community-operators-bph6d\" (UID: \"c7936be6-fe56-42d1-a86d-c2d3dd3718df\") " pod="openshift-marketplace/community-operators-bph6d" Jan 20 18:56:32 crc kubenswrapper[4773]: I0120 18:56:32.876177 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7936be6-fe56-42d1-a86d-c2d3dd3718df-utilities\") pod \"community-operators-bph6d\" (UID: \"c7936be6-fe56-42d1-a86d-c2d3dd3718df\") " pod="openshift-marketplace/community-operators-bph6d" Jan 20 18:56:32 crc kubenswrapper[4773]: I0120 18:56:32.895743 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8pd9\" (UniqueName: \"kubernetes.io/projected/c7936be6-fe56-42d1-a86d-c2d3dd3718df-kube-api-access-t8pd9\") pod \"community-operators-bph6d\" (UID: \"c7936be6-fe56-42d1-a86d-c2d3dd3718df\") " pod="openshift-marketplace/community-operators-bph6d" Jan 20 18:56:32 crc kubenswrapper[4773]: I0120 18:56:32.955952 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-bph6d" Jan 20 18:56:33 crc kubenswrapper[4773]: I0120 18:56:33.444922 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-bph6d"] Jan 20 18:56:33 crc kubenswrapper[4773]: I0120 18:56:33.535704 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bph6d" event={"ID":"c7936be6-fe56-42d1-a86d-c2d3dd3718df","Type":"ContainerStarted","Data":"c54f0307e49dba271d8b05075aa14d69a2b8b883abaff80426c0b2b66837b5eb"} Jan 20 18:56:34 crc kubenswrapper[4773]: I0120 18:56:34.560978 4773 generic.go:334] "Generic (PLEG): container finished" podID="c7936be6-fe56-42d1-a86d-c2d3dd3718df" containerID="37bf4d31406289396b71463a0eb19060b79bf7d0b3f46f6fd965131039c6795f" exitCode=0 Jan 20 18:56:34 crc kubenswrapper[4773]: I0120 18:56:34.561323 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bph6d" event={"ID":"c7936be6-fe56-42d1-a86d-c2d3dd3718df","Type":"ContainerDied","Data":"37bf4d31406289396b71463a0eb19060b79bf7d0b3f46f6fd965131039c6795f"} Jan 20 18:56:35 crc kubenswrapper[4773]: I0120 18:56:35.446944 4773 scope.go:117] "RemoveContainer" containerID="ce9aed1d163b3887a1d1032f9c03b611a4ecaff0809c53905de6852740ee0630" Jan 20 18:56:35 crc kubenswrapper[4773]: E0120 18:56:35.447452 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 18:56:38 crc kubenswrapper[4773]: I0120 18:56:38.597659 4773 generic.go:334] "Generic (PLEG): container finished" podID="c7936be6-fe56-42d1-a86d-c2d3dd3718df" 
containerID="08375e3a81fd1ce3e8bdd550f0380b75cef7aa4b2d83d95212a6c597c2a7003a" exitCode=0 Jan 20 18:56:38 crc kubenswrapper[4773]: I0120 18:56:38.597732 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bph6d" event={"ID":"c7936be6-fe56-42d1-a86d-c2d3dd3718df","Type":"ContainerDied","Data":"08375e3a81fd1ce3e8bdd550f0380b75cef7aa4b2d83d95212a6c597c2a7003a"} Jan 20 18:56:39 crc kubenswrapper[4773]: I0120 18:56:39.607270 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bph6d" event={"ID":"c7936be6-fe56-42d1-a86d-c2d3dd3718df","Type":"ContainerStarted","Data":"36c4b00e430acfa0db6b9f5ce668b0c0f3ddfef09ce67cf4b8a072fb5347d812"} Jan 20 18:56:39 crc kubenswrapper[4773]: I0120 18:56:39.629701 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-bph6d" podStartSLOduration=3.08771012 podStartE2EDuration="7.629680452s" podCreationTimestamp="2026-01-20 18:56:32 +0000 UTC" firstStartedPulling="2026-01-20 18:56:34.564289489 +0000 UTC m=+1587.486102513" lastFinishedPulling="2026-01-20 18:56:39.106259821 +0000 UTC m=+1592.028072845" observedRunningTime="2026-01-20 18:56:39.625051792 +0000 UTC m=+1592.546864816" watchObservedRunningTime="2026-01-20 18:56:39.629680452 +0000 UTC m=+1592.551493466" Jan 20 18:56:42 crc kubenswrapper[4773]: I0120 18:56:42.956112 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-bph6d" Jan 20 18:56:42 crc kubenswrapper[4773]: I0120 18:56:42.956769 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-bph6d" Jan 20 18:56:43 crc kubenswrapper[4773]: I0120 18:56:43.017874 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-bph6d" Jan 20 18:56:46 crc kubenswrapper[4773]: I0120 
18:56:46.447137 4773 scope.go:117] "RemoveContainer" containerID="ce9aed1d163b3887a1d1032f9c03b611a4ecaff0809c53905de6852740ee0630" Jan 20 18:56:46 crc kubenswrapper[4773]: E0120 18:56:46.447816 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 18:56:53 crc kubenswrapper[4773]: I0120 18:56:53.003285 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-bph6d" Jan 20 18:56:53 crc kubenswrapper[4773]: I0120 18:56:53.046918 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bph6d"] Jan 20 18:56:53 crc kubenswrapper[4773]: I0120 18:56:53.750312 4773 generic.go:334] "Generic (PLEG): container finished" podID="6fa7427c-5ab8-4d6d-b81e-999ba155ae7f" containerID="ed1b733b25627aeb8a223630419832be2d7010197f1a7fbda2c35be60b1cd9f1" exitCode=0 Jan 20 18:56:53 crc kubenswrapper[4773]: I0120 18:56:53.750566 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-bph6d" podUID="c7936be6-fe56-42d1-a86d-c2d3dd3718df" containerName="registry-server" containerID="cri-o://36c4b00e430acfa0db6b9f5ce668b0c0f3ddfef09ce67cf4b8a072fb5347d812" gracePeriod=2 Jan 20 18:56:53 crc kubenswrapper[4773]: I0120 18:56:53.750678 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-t89z8" event={"ID":"6fa7427c-5ab8-4d6d-b81e-999ba155ae7f","Type":"ContainerDied","Data":"ed1b733b25627aeb8a223630419832be2d7010197f1a7fbda2c35be60b1cd9f1"} Jan 20 18:56:54 crc kubenswrapper[4773]: I0120 
18:56:54.216159 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bph6d" Jan 20 18:56:54 crc kubenswrapper[4773]: I0120 18:56:54.411672 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7936be6-fe56-42d1-a86d-c2d3dd3718df-catalog-content\") pod \"c7936be6-fe56-42d1-a86d-c2d3dd3718df\" (UID: \"c7936be6-fe56-42d1-a86d-c2d3dd3718df\") " Jan 20 18:56:54 crc kubenswrapper[4773]: I0120 18:56:54.412017 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t8pd9\" (UniqueName: \"kubernetes.io/projected/c7936be6-fe56-42d1-a86d-c2d3dd3718df-kube-api-access-t8pd9\") pod \"c7936be6-fe56-42d1-a86d-c2d3dd3718df\" (UID: \"c7936be6-fe56-42d1-a86d-c2d3dd3718df\") " Jan 20 18:56:54 crc kubenswrapper[4773]: I0120 18:56:54.412218 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7936be6-fe56-42d1-a86d-c2d3dd3718df-utilities\") pod \"c7936be6-fe56-42d1-a86d-c2d3dd3718df\" (UID: \"c7936be6-fe56-42d1-a86d-c2d3dd3718df\") " Jan 20 18:56:54 crc kubenswrapper[4773]: I0120 18:56:54.413568 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7936be6-fe56-42d1-a86d-c2d3dd3718df-utilities" (OuterVolumeSpecName: "utilities") pod "c7936be6-fe56-42d1-a86d-c2d3dd3718df" (UID: "c7936be6-fe56-42d1-a86d-c2d3dd3718df"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:56:54 crc kubenswrapper[4773]: I0120 18:56:54.419160 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7936be6-fe56-42d1-a86d-c2d3dd3718df-kube-api-access-t8pd9" (OuterVolumeSpecName: "kube-api-access-t8pd9") pod "c7936be6-fe56-42d1-a86d-c2d3dd3718df" (UID: "c7936be6-fe56-42d1-a86d-c2d3dd3718df"). InnerVolumeSpecName "kube-api-access-t8pd9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:56:54 crc kubenswrapper[4773]: I0120 18:56:54.461812 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7936be6-fe56-42d1-a86d-c2d3dd3718df-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c7936be6-fe56-42d1-a86d-c2d3dd3718df" (UID: "c7936be6-fe56-42d1-a86d-c2d3dd3718df"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 18:56:54 crc kubenswrapper[4773]: I0120 18:56:54.515526 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c7936be6-fe56-42d1-a86d-c2d3dd3718df-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 20 18:56:54 crc kubenswrapper[4773]: I0120 18:56:54.515560 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t8pd9\" (UniqueName: \"kubernetes.io/projected/c7936be6-fe56-42d1-a86d-c2d3dd3718df-kube-api-access-t8pd9\") on node \"crc\" DevicePath \"\"" Jan 20 18:56:54 crc kubenswrapper[4773]: I0120 18:56:54.515576 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c7936be6-fe56-42d1-a86d-c2d3dd3718df-utilities\") on node \"crc\" DevicePath \"\"" Jan 20 18:56:54 crc kubenswrapper[4773]: I0120 18:56:54.760149 4773 generic.go:334] "Generic (PLEG): container finished" podID="c7936be6-fe56-42d1-a86d-c2d3dd3718df" 
containerID="36c4b00e430acfa0db6b9f5ce668b0c0f3ddfef09ce67cf4b8a072fb5347d812" exitCode=0 Jan 20 18:56:54 crc kubenswrapper[4773]: I0120 18:56:54.760207 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-bph6d" Jan 20 18:56:54 crc kubenswrapper[4773]: I0120 18:56:54.760203 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bph6d" event={"ID":"c7936be6-fe56-42d1-a86d-c2d3dd3718df","Type":"ContainerDied","Data":"36c4b00e430acfa0db6b9f5ce668b0c0f3ddfef09ce67cf4b8a072fb5347d812"} Jan 20 18:56:54 crc kubenswrapper[4773]: I0120 18:56:54.760663 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-bph6d" event={"ID":"c7936be6-fe56-42d1-a86d-c2d3dd3718df","Type":"ContainerDied","Data":"c54f0307e49dba271d8b05075aa14d69a2b8b883abaff80426c0b2b66837b5eb"} Jan 20 18:56:54 crc kubenswrapper[4773]: I0120 18:56:54.760692 4773 scope.go:117] "RemoveContainer" containerID="36c4b00e430acfa0db6b9f5ce668b0c0f3ddfef09ce67cf4b8a072fb5347d812" Jan 20 18:56:54 crc kubenswrapper[4773]: I0120 18:56:54.784437 4773 scope.go:117] "RemoveContainer" containerID="08375e3a81fd1ce3e8bdd550f0380b75cef7aa4b2d83d95212a6c597c2a7003a" Jan 20 18:56:54 crc kubenswrapper[4773]: I0120 18:56:54.807555 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-bph6d"] Jan 20 18:56:54 crc kubenswrapper[4773]: I0120 18:56:54.818716 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-bph6d"] Jan 20 18:56:54 crc kubenswrapper[4773]: I0120 18:56:54.823671 4773 scope.go:117] "RemoveContainer" containerID="37bf4d31406289396b71463a0eb19060b79bf7d0b3f46f6fd965131039c6795f" Jan 20 18:56:54 crc kubenswrapper[4773]: I0120 18:56:54.864989 4773 scope.go:117] "RemoveContainer" containerID="36c4b00e430acfa0db6b9f5ce668b0c0f3ddfef09ce67cf4b8a072fb5347d812" Jan 20 
18:56:54 crc kubenswrapper[4773]: E0120 18:56:54.865476 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36c4b00e430acfa0db6b9f5ce668b0c0f3ddfef09ce67cf4b8a072fb5347d812\": container with ID starting with 36c4b00e430acfa0db6b9f5ce668b0c0f3ddfef09ce67cf4b8a072fb5347d812 not found: ID does not exist" containerID="36c4b00e430acfa0db6b9f5ce668b0c0f3ddfef09ce67cf4b8a072fb5347d812" Jan 20 18:56:54 crc kubenswrapper[4773]: I0120 18:56:54.865521 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36c4b00e430acfa0db6b9f5ce668b0c0f3ddfef09ce67cf4b8a072fb5347d812"} err="failed to get container status \"36c4b00e430acfa0db6b9f5ce668b0c0f3ddfef09ce67cf4b8a072fb5347d812\": rpc error: code = NotFound desc = could not find container \"36c4b00e430acfa0db6b9f5ce668b0c0f3ddfef09ce67cf4b8a072fb5347d812\": container with ID starting with 36c4b00e430acfa0db6b9f5ce668b0c0f3ddfef09ce67cf4b8a072fb5347d812 not found: ID does not exist" Jan 20 18:56:54 crc kubenswrapper[4773]: I0120 18:56:54.865549 4773 scope.go:117] "RemoveContainer" containerID="08375e3a81fd1ce3e8bdd550f0380b75cef7aa4b2d83d95212a6c597c2a7003a" Jan 20 18:56:54 crc kubenswrapper[4773]: E0120 18:56:54.865852 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"08375e3a81fd1ce3e8bdd550f0380b75cef7aa4b2d83d95212a6c597c2a7003a\": container with ID starting with 08375e3a81fd1ce3e8bdd550f0380b75cef7aa4b2d83d95212a6c597c2a7003a not found: ID does not exist" containerID="08375e3a81fd1ce3e8bdd550f0380b75cef7aa4b2d83d95212a6c597c2a7003a" Jan 20 18:56:54 crc kubenswrapper[4773]: I0120 18:56:54.865879 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"08375e3a81fd1ce3e8bdd550f0380b75cef7aa4b2d83d95212a6c597c2a7003a"} err="failed to get container status 
\"08375e3a81fd1ce3e8bdd550f0380b75cef7aa4b2d83d95212a6c597c2a7003a\": rpc error: code = NotFound desc = could not find container \"08375e3a81fd1ce3e8bdd550f0380b75cef7aa4b2d83d95212a6c597c2a7003a\": container with ID starting with 08375e3a81fd1ce3e8bdd550f0380b75cef7aa4b2d83d95212a6c597c2a7003a not found: ID does not exist" Jan 20 18:56:54 crc kubenswrapper[4773]: I0120 18:56:54.865899 4773 scope.go:117] "RemoveContainer" containerID="37bf4d31406289396b71463a0eb19060b79bf7d0b3f46f6fd965131039c6795f" Jan 20 18:56:54 crc kubenswrapper[4773]: E0120 18:56:54.866141 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"37bf4d31406289396b71463a0eb19060b79bf7d0b3f46f6fd965131039c6795f\": container with ID starting with 37bf4d31406289396b71463a0eb19060b79bf7d0b3f46f6fd965131039c6795f not found: ID does not exist" containerID="37bf4d31406289396b71463a0eb19060b79bf7d0b3f46f6fd965131039c6795f" Jan 20 18:56:54 crc kubenswrapper[4773]: I0120 18:56:54.866166 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"37bf4d31406289396b71463a0eb19060b79bf7d0b3f46f6fd965131039c6795f"} err="failed to get container status \"37bf4d31406289396b71463a0eb19060b79bf7d0b3f46f6fd965131039c6795f\": rpc error: code = NotFound desc = could not find container \"37bf4d31406289396b71463a0eb19060b79bf7d0b3f46f6fd965131039c6795f\": container with ID starting with 37bf4d31406289396b71463a0eb19060b79bf7d0b3f46f6fd965131039c6795f not found: ID does not exist" Jan 20 18:56:55 crc kubenswrapper[4773]: I0120 18:56:55.135829 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-t89z8" Jan 20 18:56:55 crc kubenswrapper[4773]: I0120 18:56:55.330145 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fa7427c-5ab8-4d6d-b81e-999ba155ae7f-bootstrap-combined-ca-bundle\") pod \"6fa7427c-5ab8-4d6d-b81e-999ba155ae7f\" (UID: \"6fa7427c-5ab8-4d6d-b81e-999ba155ae7f\") " Jan 20 18:56:55 crc kubenswrapper[4773]: I0120 18:56:55.330318 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6fa7427c-5ab8-4d6d-b81e-999ba155ae7f-inventory\") pod \"6fa7427c-5ab8-4d6d-b81e-999ba155ae7f\" (UID: \"6fa7427c-5ab8-4d6d-b81e-999ba155ae7f\") " Jan 20 18:56:55 crc kubenswrapper[4773]: I0120 18:56:55.330394 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nrx7x\" (UniqueName: \"kubernetes.io/projected/6fa7427c-5ab8-4d6d-b81e-999ba155ae7f-kube-api-access-nrx7x\") pod \"6fa7427c-5ab8-4d6d-b81e-999ba155ae7f\" (UID: \"6fa7427c-5ab8-4d6d-b81e-999ba155ae7f\") " Jan 20 18:56:55 crc kubenswrapper[4773]: I0120 18:56:55.330436 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6fa7427c-5ab8-4d6d-b81e-999ba155ae7f-ssh-key-openstack-edpm-ipam\") pod \"6fa7427c-5ab8-4d6d-b81e-999ba155ae7f\" (UID: \"6fa7427c-5ab8-4d6d-b81e-999ba155ae7f\") " Jan 20 18:56:55 crc kubenswrapper[4773]: I0120 18:56:55.335799 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fa7427c-5ab8-4d6d-b81e-999ba155ae7f-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "6fa7427c-5ab8-4d6d-b81e-999ba155ae7f" (UID: "6fa7427c-5ab8-4d6d-b81e-999ba155ae7f"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:56:55 crc kubenswrapper[4773]: I0120 18:56:55.336129 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6fa7427c-5ab8-4d6d-b81e-999ba155ae7f-kube-api-access-nrx7x" (OuterVolumeSpecName: "kube-api-access-nrx7x") pod "6fa7427c-5ab8-4d6d-b81e-999ba155ae7f" (UID: "6fa7427c-5ab8-4d6d-b81e-999ba155ae7f"). InnerVolumeSpecName "kube-api-access-nrx7x". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:56:55 crc kubenswrapper[4773]: I0120 18:56:55.358693 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fa7427c-5ab8-4d6d-b81e-999ba155ae7f-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "6fa7427c-5ab8-4d6d-b81e-999ba155ae7f" (UID: "6fa7427c-5ab8-4d6d-b81e-999ba155ae7f"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:56:55 crc kubenswrapper[4773]: I0120 18:56:55.360674 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6fa7427c-5ab8-4d6d-b81e-999ba155ae7f-inventory" (OuterVolumeSpecName: "inventory") pod "6fa7427c-5ab8-4d6d-b81e-999ba155ae7f" (UID: "6fa7427c-5ab8-4d6d-b81e-999ba155ae7f"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:56:55 crc kubenswrapper[4773]: I0120 18:56:55.432691 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nrx7x\" (UniqueName: \"kubernetes.io/projected/6fa7427c-5ab8-4d6d-b81e-999ba155ae7f-kube-api-access-nrx7x\") on node \"crc\" DevicePath \"\"" Jan 20 18:56:55 crc kubenswrapper[4773]: I0120 18:56:55.432725 4773 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/6fa7427c-5ab8-4d6d-b81e-999ba155ae7f-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 20 18:56:55 crc kubenswrapper[4773]: I0120 18:56:55.432734 4773 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6fa7427c-5ab8-4d6d-b81e-999ba155ae7f-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 18:56:55 crc kubenswrapper[4773]: I0120 18:56:55.432745 4773 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/6fa7427c-5ab8-4d6d-b81e-999ba155ae7f-inventory\") on node \"crc\" DevicePath \"\"" Jan 20 18:56:55 crc kubenswrapper[4773]: I0120 18:56:55.456287 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7936be6-fe56-42d1-a86d-c2d3dd3718df" path="/var/lib/kubelet/pods/c7936be6-fe56-42d1-a86d-c2d3dd3718df/volumes" Jan 20 18:56:55 crc kubenswrapper[4773]: I0120 18:56:55.772205 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-t89z8" Jan 20 18:56:55 crc kubenswrapper[4773]: I0120 18:56:55.772206 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-t89z8" event={"ID":"6fa7427c-5ab8-4d6d-b81e-999ba155ae7f","Type":"ContainerDied","Data":"b53c5213d4b438728dd937da04ea956d9066ef8724ab70bbeae3ff997481a9d6"} Jan 20 18:56:55 crc kubenswrapper[4773]: I0120 18:56:55.772341 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b53c5213d4b438728dd937da04ea956d9066ef8724ab70bbeae3ff997481a9d6" Jan 20 18:56:55 crc kubenswrapper[4773]: I0120 18:56:55.895520 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-87xkt"] Jan 20 18:56:55 crc kubenswrapper[4773]: E0120 18:56:55.896225 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7936be6-fe56-42d1-a86d-c2d3dd3718df" containerName="extract-content" Jan 20 18:56:55 crc kubenswrapper[4773]: I0120 18:56:55.896244 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7936be6-fe56-42d1-a86d-c2d3dd3718df" containerName="extract-content" Jan 20 18:56:55 crc kubenswrapper[4773]: E0120 18:56:55.896265 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7936be6-fe56-42d1-a86d-c2d3dd3718df" containerName="extract-utilities" Jan 20 18:56:55 crc kubenswrapper[4773]: I0120 18:56:55.896273 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7936be6-fe56-42d1-a86d-c2d3dd3718df" containerName="extract-utilities" Jan 20 18:56:55 crc kubenswrapper[4773]: E0120 18:56:55.896290 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6fa7427c-5ab8-4d6d-b81e-999ba155ae7f" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Jan 20 18:56:55 crc kubenswrapper[4773]: I0120 18:56:55.896300 4773 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="6fa7427c-5ab8-4d6d-b81e-999ba155ae7f" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Jan 20 18:56:55 crc kubenswrapper[4773]: E0120 18:56:55.896344 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7936be6-fe56-42d1-a86d-c2d3dd3718df" containerName="registry-server" Jan 20 18:56:55 crc kubenswrapper[4773]: I0120 18:56:55.896353 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7936be6-fe56-42d1-a86d-c2d3dd3718df" containerName="registry-server" Jan 20 18:56:55 crc kubenswrapper[4773]: I0120 18:56:55.896747 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="6fa7427c-5ab8-4d6d-b81e-999ba155ae7f" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Jan 20 18:56:55 crc kubenswrapper[4773]: I0120 18:56:55.896788 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7936be6-fe56-42d1-a86d-c2d3dd3718df" containerName="registry-server" Jan 20 18:56:55 crc kubenswrapper[4773]: I0120 18:56:55.897759 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-87xkt" Jan 20 18:56:55 crc kubenswrapper[4773]: I0120 18:56:55.903954 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-87xkt"] Jan 20 18:56:55 crc kubenswrapper[4773]: I0120 18:56:55.904123 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 20 18:56:55 crc kubenswrapper[4773]: I0120 18:56:55.904339 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 20 18:56:55 crc kubenswrapper[4773]: I0120 18:56:55.904578 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 20 18:56:55 crc kubenswrapper[4773]: I0120 18:56:55.904786 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rzxsv" Jan 20 18:56:55 crc kubenswrapper[4773]: I0120 18:56:55.947157 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0dd5218f-c5ee-4e0b-83bb-ab17d1887596-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-87xkt\" (UID: \"0dd5218f-c5ee-4e0b-83bb-ab17d1887596\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-87xkt" Jan 20 18:56:55 crc kubenswrapper[4773]: I0120 18:56:55.947271 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0dd5218f-c5ee-4e0b-83bb-ab17d1887596-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-87xkt\" (UID: \"0dd5218f-c5ee-4e0b-83bb-ab17d1887596\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-87xkt" Jan 20 18:56:55 crc 
kubenswrapper[4773]: I0120 18:56:55.947515 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxp45\" (UniqueName: \"kubernetes.io/projected/0dd5218f-c5ee-4e0b-83bb-ab17d1887596-kube-api-access-gxp45\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-87xkt\" (UID: \"0dd5218f-c5ee-4e0b-83bb-ab17d1887596\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-87xkt" Jan 20 18:56:56 crc kubenswrapper[4773]: I0120 18:56:56.049122 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0dd5218f-c5ee-4e0b-83bb-ab17d1887596-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-87xkt\" (UID: \"0dd5218f-c5ee-4e0b-83bb-ab17d1887596\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-87xkt" Jan 20 18:56:56 crc kubenswrapper[4773]: I0120 18:56:56.049210 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxp45\" (UniqueName: \"kubernetes.io/projected/0dd5218f-c5ee-4e0b-83bb-ab17d1887596-kube-api-access-gxp45\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-87xkt\" (UID: \"0dd5218f-c5ee-4e0b-83bb-ab17d1887596\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-87xkt" Jan 20 18:56:56 crc kubenswrapper[4773]: I0120 18:56:56.049279 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0dd5218f-c5ee-4e0b-83bb-ab17d1887596-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-87xkt\" (UID: \"0dd5218f-c5ee-4e0b-83bb-ab17d1887596\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-87xkt" Jan 20 18:56:56 crc kubenswrapper[4773]: I0120 18:56:56.053134 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" 
(UniqueName: \"kubernetes.io/secret/0dd5218f-c5ee-4e0b-83bb-ab17d1887596-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-87xkt\" (UID: \"0dd5218f-c5ee-4e0b-83bb-ab17d1887596\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-87xkt" Jan 20 18:56:56 crc kubenswrapper[4773]: I0120 18:56:56.053888 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0dd5218f-c5ee-4e0b-83bb-ab17d1887596-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-87xkt\" (UID: \"0dd5218f-c5ee-4e0b-83bb-ab17d1887596\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-87xkt" Jan 20 18:56:56 crc kubenswrapper[4773]: I0120 18:56:56.065520 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxp45\" (UniqueName: \"kubernetes.io/projected/0dd5218f-c5ee-4e0b-83bb-ab17d1887596-kube-api-access-gxp45\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-87xkt\" (UID: \"0dd5218f-c5ee-4e0b-83bb-ab17d1887596\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-87xkt" Jan 20 18:56:56 crc kubenswrapper[4773]: I0120 18:56:56.251564 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-87xkt" Jan 20 18:56:56 crc kubenswrapper[4773]: I0120 18:56:56.769156 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-87xkt"] Jan 20 18:56:56 crc kubenswrapper[4773]: I0120 18:56:56.781299 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-87xkt" event={"ID":"0dd5218f-c5ee-4e0b-83bb-ab17d1887596","Type":"ContainerStarted","Data":"1625767637a46e5f2b3baddec627eb33e35f3d60f07f8dd4d48dfcd1b431dfef"} Jan 20 18:56:57 crc kubenswrapper[4773]: I0120 18:56:57.453466 4773 scope.go:117] "RemoveContainer" containerID="ce9aed1d163b3887a1d1032f9c03b611a4ecaff0809c53905de6852740ee0630" Jan 20 18:56:57 crc kubenswrapper[4773]: E0120 18:56:57.453737 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 18:56:57 crc kubenswrapper[4773]: I0120 18:56:57.790867 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-87xkt" event={"ID":"0dd5218f-c5ee-4e0b-83bb-ab17d1887596","Type":"ContainerStarted","Data":"a3d1b2b32625905740058e821f111f168782e4d219803ce3813dd2b812563a2d"} Jan 20 18:56:57 crc kubenswrapper[4773]: I0120 18:56:57.812978 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-87xkt" podStartSLOduration=2.049306407 podStartE2EDuration="2.812961272s" podCreationTimestamp="2026-01-20 18:56:55 +0000 UTC" 
firstStartedPulling="2026-01-20 18:56:56.77030715 +0000 UTC m=+1609.692120164" lastFinishedPulling="2026-01-20 18:56:57.533962005 +0000 UTC m=+1610.455775029" observedRunningTime="2026-01-20 18:56:57.808700591 +0000 UTC m=+1610.730513625" watchObservedRunningTime="2026-01-20 18:56:57.812961272 +0000 UTC m=+1610.734774296" Jan 20 18:57:10 crc kubenswrapper[4773]: I0120 18:57:10.446484 4773 scope.go:117] "RemoveContainer" containerID="ce9aed1d163b3887a1d1032f9c03b611a4ecaff0809c53905de6852740ee0630" Jan 20 18:57:10 crc kubenswrapper[4773]: E0120 18:57:10.447194 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 18:57:24 crc kubenswrapper[4773]: I0120 18:57:24.447234 4773 scope.go:117] "RemoveContainer" containerID="ce9aed1d163b3887a1d1032f9c03b611a4ecaff0809c53905de6852740ee0630" Jan 20 18:57:24 crc kubenswrapper[4773]: E0120 18:57:24.448046 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 18:57:35 crc kubenswrapper[4773]: I0120 18:57:35.448307 4773 scope.go:117] "RemoveContainer" containerID="ce9aed1d163b3887a1d1032f9c03b611a4ecaff0809c53905de6852740ee0630" Jan 20 18:57:35 crc kubenswrapper[4773]: E0120 18:57:35.449077 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" 
for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 18:57:49 crc kubenswrapper[4773]: I0120 18:57:49.448634 4773 scope.go:117] "RemoveContainer" containerID="ce9aed1d163b3887a1d1032f9c03b611a4ecaff0809c53905de6852740ee0630" Jan 20 18:57:49 crc kubenswrapper[4773]: E0120 18:57:49.449840 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 18:58:02 crc kubenswrapper[4773]: I0120 18:58:02.446866 4773 scope.go:117] "RemoveContainer" containerID="ce9aed1d163b3887a1d1032f9c03b611a4ecaff0809c53905de6852740ee0630" Jan 20 18:58:02 crc kubenswrapper[4773]: E0120 18:58:02.447714 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 18:58:08 crc kubenswrapper[4773]: I0120 18:58:08.036280 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-xjqwr"] Jan 20 18:58:08 crc kubenswrapper[4773]: I0120 18:58:08.045602 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/keystone-db-create-xjqwr"] Jan 20 18:58:09 crc kubenswrapper[4773]: I0120 18:58:09.046898 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-18ee-account-create-update-llcxn"] Jan 20 18:58:09 crc kubenswrapper[4773]: I0120 18:58:09.059335 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-8pd22"] Jan 20 18:58:09 crc kubenswrapper[4773]: I0120 18:58:09.077606 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-18ee-account-create-update-llcxn"] Jan 20 18:58:09 crc kubenswrapper[4773]: I0120 18:58:09.087366 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-8bf57"] Jan 20 18:58:09 crc kubenswrapper[4773]: I0120 18:58:09.094406 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-8bf57"] Jan 20 18:58:09 crc kubenswrapper[4773]: I0120 18:58:09.101204 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-8pd22"] Jan 20 18:58:09 crc kubenswrapper[4773]: I0120 18:58:09.108539 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-f756-account-create-update-tlxkm"] Jan 20 18:58:09 crc kubenswrapper[4773]: I0120 18:58:09.115583 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-fb33-account-create-update-2nkdm"] Jan 20 18:58:09 crc kubenswrapper[4773]: I0120 18:58:09.122768 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-f756-account-create-update-tlxkm"] Jan 20 18:58:09 crc kubenswrapper[4773]: I0120 18:58:09.129613 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-fb33-account-create-update-2nkdm"] Jan 20 18:58:09 crc kubenswrapper[4773]: I0120 18:58:09.466298 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c64cf4d-562e-4a78-a22b-d682436d5db3" path="/var/lib/kubelet/pods/0c64cf4d-562e-4a78-a22b-d682436d5db3/volumes" Jan 20 
18:58:09 crc kubenswrapper[4773]: I0120 18:58:09.467806 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ce8f955-26cb-4860-afc1-effceac1d7a4" path="/var/lib/kubelet/pods/2ce8f955-26cb-4860-afc1-effceac1d7a4/volumes" Jan 20 18:58:09 crc kubenswrapper[4773]: I0120 18:58:09.469853 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30e17cf6-4b43-4ac5-bc0f-0b7850ad04ea" path="/var/lib/kubelet/pods/30e17cf6-4b43-4ac5-bc0f-0b7850ad04ea/volumes" Jan 20 18:58:09 crc kubenswrapper[4773]: I0120 18:58:09.471464 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="484e46fc-ebda-496a-9884-295fcd065e9b" path="/var/lib/kubelet/pods/484e46fc-ebda-496a-9884-295fcd065e9b/volumes" Jan 20 18:58:09 crc kubenswrapper[4773]: I0120 18:58:09.474556 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba7d65a8-afba-4e1a-a7e6-b9483c97fdcb" path="/var/lib/kubelet/pods/ba7d65a8-afba-4e1a-a7e6-b9483c97fdcb/volumes" Jan 20 18:58:09 crc kubenswrapper[4773]: I0120 18:58:09.475770 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6e707f5-41a8-43c6-976a-7a9645c0b0ca" path="/var/lib/kubelet/pods/c6e707f5-41a8-43c6-976a-7a9645c0b0ca/volumes" Jan 20 18:58:15 crc kubenswrapper[4773]: I0120 18:58:15.126906 4773 generic.go:334] "Generic (PLEG): container finished" podID="0dd5218f-c5ee-4e0b-83bb-ab17d1887596" containerID="a3d1b2b32625905740058e821f111f168782e4d219803ce3813dd2b812563a2d" exitCode=0 Jan 20 18:58:15 crc kubenswrapper[4773]: I0120 18:58:15.126986 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-87xkt" event={"ID":"0dd5218f-c5ee-4e0b-83bb-ab17d1887596","Type":"ContainerDied","Data":"a3d1b2b32625905740058e821f111f168782e4d219803ce3813dd2b812563a2d"} Jan 20 18:58:15 crc kubenswrapper[4773]: I0120 18:58:15.447051 4773 scope.go:117] "RemoveContainer" 
containerID="ce9aed1d163b3887a1d1032f9c03b611a4ecaff0809c53905de6852740ee0630" Jan 20 18:58:15 crc kubenswrapper[4773]: E0120 18:58:15.452752 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 18:58:17 crc kubenswrapper[4773]: I0120 18:58:17.562742 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-87xkt" Jan 20 18:58:17 crc kubenswrapper[4773]: I0120 18:58:17.658984 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0dd5218f-c5ee-4e0b-83bb-ab17d1887596-inventory\") pod \"0dd5218f-c5ee-4e0b-83bb-ab17d1887596\" (UID: \"0dd5218f-c5ee-4e0b-83bb-ab17d1887596\") " Jan 20 18:58:17 crc kubenswrapper[4773]: I0120 18:58:17.659065 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gxp45\" (UniqueName: \"kubernetes.io/projected/0dd5218f-c5ee-4e0b-83bb-ab17d1887596-kube-api-access-gxp45\") pod \"0dd5218f-c5ee-4e0b-83bb-ab17d1887596\" (UID: \"0dd5218f-c5ee-4e0b-83bb-ab17d1887596\") " Jan 20 18:58:17 crc kubenswrapper[4773]: I0120 18:58:17.659163 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0dd5218f-c5ee-4e0b-83bb-ab17d1887596-ssh-key-openstack-edpm-ipam\") pod \"0dd5218f-c5ee-4e0b-83bb-ab17d1887596\" (UID: \"0dd5218f-c5ee-4e0b-83bb-ab17d1887596\") " Jan 20 18:58:17 crc kubenswrapper[4773]: I0120 18:58:17.668186 4773 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/projected/0dd5218f-c5ee-4e0b-83bb-ab17d1887596-kube-api-access-gxp45" (OuterVolumeSpecName: "kube-api-access-gxp45") pod "0dd5218f-c5ee-4e0b-83bb-ab17d1887596" (UID: "0dd5218f-c5ee-4e0b-83bb-ab17d1887596"). InnerVolumeSpecName "kube-api-access-gxp45". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:58:17 crc kubenswrapper[4773]: I0120 18:58:17.688312 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0dd5218f-c5ee-4e0b-83bb-ab17d1887596-inventory" (OuterVolumeSpecName: "inventory") pod "0dd5218f-c5ee-4e0b-83bb-ab17d1887596" (UID: "0dd5218f-c5ee-4e0b-83bb-ab17d1887596"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:58:17 crc kubenswrapper[4773]: I0120 18:58:17.688705 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0dd5218f-c5ee-4e0b-83bb-ab17d1887596-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "0dd5218f-c5ee-4e0b-83bb-ab17d1887596" (UID: "0dd5218f-c5ee-4e0b-83bb-ab17d1887596"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:58:17 crc kubenswrapper[4773]: I0120 18:58:17.760306 4773 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0dd5218f-c5ee-4e0b-83bb-ab17d1887596-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 20 18:58:17 crc kubenswrapper[4773]: I0120 18:58:17.760344 4773 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0dd5218f-c5ee-4e0b-83bb-ab17d1887596-inventory\") on node \"crc\" DevicePath \"\"" Jan 20 18:58:17 crc kubenswrapper[4773]: I0120 18:58:17.760353 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gxp45\" (UniqueName: \"kubernetes.io/projected/0dd5218f-c5ee-4e0b-83bb-ab17d1887596-kube-api-access-gxp45\") on node \"crc\" DevicePath \"\"" Jan 20 18:58:18 crc kubenswrapper[4773]: I0120 18:58:18.165782 4773 scope.go:117] "RemoveContainer" containerID="2f81f4ca58be86ce8c8a188542774e148445a3fd02682f00bd51696f895c5fe9" Jan 20 18:58:18 crc kubenswrapper[4773]: I0120 18:58:18.196332 4773 scope.go:117] "RemoveContainer" containerID="c06ffe1452b6a2d6f74722f0b7b71c4f2ce4d5613dc36070b3f5358e09162f2f" Jan 20 18:58:18 crc kubenswrapper[4773]: I0120 18:58:18.242571 4773 scope.go:117] "RemoveContainer" containerID="bf0e9d7311fa3cf82ace95c60a06b7e7384341e3a3deaf1802d90fe3b93f2f6a" Jan 20 18:58:18 crc kubenswrapper[4773]: I0120 18:58:18.261635 4773 scope.go:117] "RemoveContainer" containerID="e203a211f93f27ff720239411e192a2a8202e1fdc890ba783dd23386fddbb4d9" Jan 20 18:58:18 crc kubenswrapper[4773]: I0120 18:58:18.262992 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-87xkt" event={"ID":"0dd5218f-c5ee-4e0b-83bb-ab17d1887596","Type":"ContainerDied","Data":"1625767637a46e5f2b3baddec627eb33e35f3d60f07f8dd4d48dfcd1b431dfef"} Jan 20 18:58:18 crc kubenswrapper[4773]: I0120 
18:58:18.263020 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1625767637a46e5f2b3baddec627eb33e35f3d60f07f8dd4d48dfcd1b431dfef" Jan 20 18:58:18 crc kubenswrapper[4773]: I0120 18:58:18.263041 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-87xkt" Jan 20 18:58:18 crc kubenswrapper[4773]: I0120 18:58:18.286379 4773 scope.go:117] "RemoveContainer" containerID="7780978bfbf7070fdc6e2326036f3be82707f66d864695cb582db2f78e403bd9" Jan 20 18:58:18 crc kubenswrapper[4773]: I0120 18:58:18.331639 4773 scope.go:117] "RemoveContainer" containerID="100cf16d899578784656d12cf1cb2bef2afdb76869ab58de6601f1ccfb0932d7" Jan 20 18:58:18 crc kubenswrapper[4773]: I0120 18:58:18.643392 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vgq6v"] Jan 20 18:58:18 crc kubenswrapper[4773]: E0120 18:58:18.643965 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0dd5218f-c5ee-4e0b-83bb-ab17d1887596" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Jan 20 18:58:18 crc kubenswrapper[4773]: I0120 18:58:18.643978 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="0dd5218f-c5ee-4e0b-83bb-ab17d1887596" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Jan 20 18:58:18 crc kubenswrapper[4773]: I0120 18:58:18.644189 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="0dd5218f-c5ee-4e0b-83bb-ab17d1887596" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Jan 20 18:58:18 crc kubenswrapper[4773]: I0120 18:58:18.644761 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vgq6v" Jan 20 18:58:18 crc kubenswrapper[4773]: I0120 18:58:18.651425 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vgq6v"] Jan 20 18:58:18 crc kubenswrapper[4773]: I0120 18:58:18.692794 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rzxsv" Jan 20 18:58:18 crc kubenswrapper[4773]: I0120 18:58:18.692838 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 20 18:58:18 crc kubenswrapper[4773]: I0120 18:58:18.693086 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 20 18:58:18 crc kubenswrapper[4773]: I0120 18:58:18.693388 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 20 18:58:18 crc kubenswrapper[4773]: I0120 18:58:18.795984 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8f7fa4e8-571e-47fe-9e86-e83acb77eb77-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-vgq6v\" (UID: \"8f7fa4e8-571e-47fe-9e86-e83acb77eb77\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vgq6v" Jan 20 18:58:18 crc kubenswrapper[4773]: I0120 18:58:18.796309 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8f7fa4e8-571e-47fe-9e86-e83acb77eb77-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-vgq6v\" (UID: \"8f7fa4e8-571e-47fe-9e86-e83acb77eb77\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vgq6v" Jan 20 18:58:18 crc kubenswrapper[4773]: 
I0120 18:58:18.796440 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdntp\" (UniqueName: \"kubernetes.io/projected/8f7fa4e8-571e-47fe-9e86-e83acb77eb77-kube-api-access-wdntp\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-vgq6v\" (UID: \"8f7fa4e8-571e-47fe-9e86-e83acb77eb77\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vgq6v" Jan 20 18:58:18 crc kubenswrapper[4773]: I0120 18:58:18.897365 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdntp\" (UniqueName: \"kubernetes.io/projected/8f7fa4e8-571e-47fe-9e86-e83acb77eb77-kube-api-access-wdntp\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-vgq6v\" (UID: \"8f7fa4e8-571e-47fe-9e86-e83acb77eb77\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vgq6v" Jan 20 18:58:18 crc kubenswrapper[4773]: I0120 18:58:18.898194 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8f7fa4e8-571e-47fe-9e86-e83acb77eb77-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-vgq6v\" (UID: \"8f7fa4e8-571e-47fe-9e86-e83acb77eb77\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vgq6v" Jan 20 18:58:18 crc kubenswrapper[4773]: I0120 18:58:18.898322 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8f7fa4e8-571e-47fe-9e86-e83acb77eb77-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-vgq6v\" (UID: \"8f7fa4e8-571e-47fe-9e86-e83acb77eb77\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vgq6v" Jan 20 18:58:18 crc kubenswrapper[4773]: I0120 18:58:18.902300 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/8f7fa4e8-571e-47fe-9e86-e83acb77eb77-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-vgq6v\" (UID: \"8f7fa4e8-571e-47fe-9e86-e83acb77eb77\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vgq6v" Jan 20 18:58:18 crc kubenswrapper[4773]: I0120 18:58:18.902773 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8f7fa4e8-571e-47fe-9e86-e83acb77eb77-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-vgq6v\" (UID: \"8f7fa4e8-571e-47fe-9e86-e83acb77eb77\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vgq6v" Jan 20 18:58:18 crc kubenswrapper[4773]: I0120 18:58:18.915518 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdntp\" (UniqueName: \"kubernetes.io/projected/8f7fa4e8-571e-47fe-9e86-e83acb77eb77-kube-api-access-wdntp\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-vgq6v\" (UID: \"8f7fa4e8-571e-47fe-9e86-e83acb77eb77\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vgq6v" Jan 20 18:58:19 crc kubenswrapper[4773]: I0120 18:58:19.013443 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vgq6v" Jan 20 18:58:19 crc kubenswrapper[4773]: I0120 18:58:19.514795 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vgq6v"] Jan 20 18:58:19 crc kubenswrapper[4773]: I0120 18:58:19.527217 4773 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 20 18:58:20 crc kubenswrapper[4773]: I0120 18:58:20.282252 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vgq6v" event={"ID":"8f7fa4e8-571e-47fe-9e86-e83acb77eb77","Type":"ContainerStarted","Data":"214ad8a794229d4fa466b745ad6c2c5b25a4a6f01698090bc0fd588f1017d53b"} Jan 20 18:58:20 crc kubenswrapper[4773]: I0120 18:58:20.282592 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vgq6v" event={"ID":"8f7fa4e8-571e-47fe-9e86-e83acb77eb77","Type":"ContainerStarted","Data":"876525de22f34cf7d9a438ff020ac1ffb25cb8490ee813204b2c6ecb76f7ee31"} Jan 20 18:58:20 crc kubenswrapper[4773]: I0120 18:58:20.300625 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vgq6v" podStartSLOduration=1.791173396 podStartE2EDuration="2.3006069s" podCreationTimestamp="2026-01-20 18:58:18 +0000 UTC" firstStartedPulling="2026-01-20 18:58:19.527026757 +0000 UTC m=+1692.448839781" lastFinishedPulling="2026-01-20 18:58:20.036460261 +0000 UTC m=+1692.958273285" observedRunningTime="2026-01-20 18:58:20.299636647 +0000 UTC m=+1693.221449681" watchObservedRunningTime="2026-01-20 18:58:20.3006069 +0000 UTC m=+1693.222419924" Jan 20 18:58:23 crc kubenswrapper[4773]: I0120 18:58:23.037438 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-j46db"] Jan 20 18:58:23 crc 
kubenswrapper[4773]: I0120 18:58:23.044772 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-j46db"] Jan 20 18:58:23 crc kubenswrapper[4773]: I0120 18:58:23.470273 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f5455e9-7072-4154-b881-75a1da2c0466" path="/var/lib/kubelet/pods/7f5455e9-7072-4154-b881-75a1da2c0466/volumes" Jan 20 18:58:25 crc kubenswrapper[4773]: I0120 18:58:25.323296 4773 generic.go:334] "Generic (PLEG): container finished" podID="8f7fa4e8-571e-47fe-9e86-e83acb77eb77" containerID="214ad8a794229d4fa466b745ad6c2c5b25a4a6f01698090bc0fd588f1017d53b" exitCode=0 Jan 20 18:58:25 crc kubenswrapper[4773]: I0120 18:58:25.323391 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vgq6v" event={"ID":"8f7fa4e8-571e-47fe-9e86-e83acb77eb77","Type":"ContainerDied","Data":"214ad8a794229d4fa466b745ad6c2c5b25a4a6f01698090bc0fd588f1017d53b"} Jan 20 18:58:26 crc kubenswrapper[4773]: I0120 18:58:26.724315 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vgq6v" Jan 20 18:58:26 crc kubenswrapper[4773]: I0120 18:58:26.758027 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8f7fa4e8-571e-47fe-9e86-e83acb77eb77-ssh-key-openstack-edpm-ipam\") pod \"8f7fa4e8-571e-47fe-9e86-e83acb77eb77\" (UID: \"8f7fa4e8-571e-47fe-9e86-e83acb77eb77\") " Jan 20 18:58:26 crc kubenswrapper[4773]: I0120 18:58:26.758371 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8f7fa4e8-571e-47fe-9e86-e83acb77eb77-inventory\") pod \"8f7fa4e8-571e-47fe-9e86-e83acb77eb77\" (UID: \"8f7fa4e8-571e-47fe-9e86-e83acb77eb77\") " Jan 20 18:58:26 crc kubenswrapper[4773]: I0120 18:58:26.758483 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wdntp\" (UniqueName: \"kubernetes.io/projected/8f7fa4e8-571e-47fe-9e86-e83acb77eb77-kube-api-access-wdntp\") pod \"8f7fa4e8-571e-47fe-9e86-e83acb77eb77\" (UID: \"8f7fa4e8-571e-47fe-9e86-e83acb77eb77\") " Jan 20 18:58:26 crc kubenswrapper[4773]: I0120 18:58:26.764114 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f7fa4e8-571e-47fe-9e86-e83acb77eb77-kube-api-access-wdntp" (OuterVolumeSpecName: "kube-api-access-wdntp") pod "8f7fa4e8-571e-47fe-9e86-e83acb77eb77" (UID: "8f7fa4e8-571e-47fe-9e86-e83acb77eb77"). InnerVolumeSpecName "kube-api-access-wdntp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:58:26 crc kubenswrapper[4773]: I0120 18:58:26.783758 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f7fa4e8-571e-47fe-9e86-e83acb77eb77-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "8f7fa4e8-571e-47fe-9e86-e83acb77eb77" (UID: "8f7fa4e8-571e-47fe-9e86-e83acb77eb77"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:58:26 crc kubenswrapper[4773]: I0120 18:58:26.785558 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f7fa4e8-571e-47fe-9e86-e83acb77eb77-inventory" (OuterVolumeSpecName: "inventory") pod "8f7fa4e8-571e-47fe-9e86-e83acb77eb77" (UID: "8f7fa4e8-571e-47fe-9e86-e83acb77eb77"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:58:26 crc kubenswrapper[4773]: I0120 18:58:26.860425 4773 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/8f7fa4e8-571e-47fe-9e86-e83acb77eb77-inventory\") on node \"crc\" DevicePath \"\"" Jan 20 18:58:26 crc kubenswrapper[4773]: I0120 18:58:26.860455 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wdntp\" (UniqueName: \"kubernetes.io/projected/8f7fa4e8-571e-47fe-9e86-e83acb77eb77-kube-api-access-wdntp\") on node \"crc\" DevicePath \"\"" Jan 20 18:58:26 crc kubenswrapper[4773]: I0120 18:58:26.860465 4773 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/8f7fa4e8-571e-47fe-9e86-e83acb77eb77-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 20 18:58:27 crc kubenswrapper[4773]: I0120 18:58:27.340105 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vgq6v" 
event={"ID":"8f7fa4e8-571e-47fe-9e86-e83acb77eb77","Type":"ContainerDied","Data":"876525de22f34cf7d9a438ff020ac1ffb25cb8490ee813204b2c6ecb76f7ee31"} Jan 20 18:58:27 crc kubenswrapper[4773]: I0120 18:58:27.340367 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="876525de22f34cf7d9a438ff020ac1ffb25cb8490ee813204b2c6ecb76f7ee31" Jan 20 18:58:27 crc kubenswrapper[4773]: I0120 18:58:27.340196 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vgq6v" Jan 20 18:58:27 crc kubenswrapper[4773]: I0120 18:58:27.432488 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-dzlzw"] Jan 20 18:58:27 crc kubenswrapper[4773]: E0120 18:58:27.432894 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f7fa4e8-571e-47fe-9e86-e83acb77eb77" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Jan 20 18:58:27 crc kubenswrapper[4773]: I0120 18:58:27.432916 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f7fa4e8-571e-47fe-9e86-e83acb77eb77" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Jan 20 18:58:27 crc kubenswrapper[4773]: I0120 18:58:27.433169 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f7fa4e8-571e-47fe-9e86-e83acb77eb77" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Jan 20 18:58:27 crc kubenswrapper[4773]: I0120 18:58:27.433837 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-dzlzw" Jan 20 18:58:27 crc kubenswrapper[4773]: I0120 18:58:27.440399 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 20 18:58:27 crc kubenswrapper[4773]: I0120 18:58:27.440462 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 20 18:58:27 crc kubenswrapper[4773]: I0120 18:58:27.440503 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 20 18:58:27 crc kubenswrapper[4773]: I0120 18:58:27.440848 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rzxsv" Jan 20 18:58:27 crc kubenswrapper[4773]: I0120 18:58:27.465626 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-dzlzw"] Jan 20 18:58:27 crc kubenswrapper[4773]: I0120 18:58:27.572457 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rp2ft\" (UniqueName: \"kubernetes.io/projected/cc27f5b0-829a-4c6b-851b-eb6e4cdd53bf-kube-api-access-rp2ft\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-dzlzw\" (UID: \"cc27f5b0-829a-4c6b-851b-eb6e4cdd53bf\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-dzlzw" Jan 20 18:58:27 crc kubenswrapper[4773]: I0120 18:58:27.572528 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cc27f5b0-829a-4c6b-851b-eb6e4cdd53bf-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-dzlzw\" (UID: \"cc27f5b0-829a-4c6b-851b-eb6e4cdd53bf\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-dzlzw" Jan 20 18:58:27 crc kubenswrapper[4773]: I0120 18:58:27.572599 4773 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cc27f5b0-829a-4c6b-851b-eb6e4cdd53bf-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-dzlzw\" (UID: \"cc27f5b0-829a-4c6b-851b-eb6e4cdd53bf\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-dzlzw" Jan 20 18:58:27 crc kubenswrapper[4773]: I0120 18:58:27.674894 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rp2ft\" (UniqueName: \"kubernetes.io/projected/cc27f5b0-829a-4c6b-851b-eb6e4cdd53bf-kube-api-access-rp2ft\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-dzlzw\" (UID: \"cc27f5b0-829a-4c6b-851b-eb6e4cdd53bf\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-dzlzw" Jan 20 18:58:27 crc kubenswrapper[4773]: I0120 18:58:27.675355 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cc27f5b0-829a-4c6b-851b-eb6e4cdd53bf-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-dzlzw\" (UID: \"cc27f5b0-829a-4c6b-851b-eb6e4cdd53bf\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-dzlzw" Jan 20 18:58:27 crc kubenswrapper[4773]: I0120 18:58:27.675731 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cc27f5b0-829a-4c6b-851b-eb6e4cdd53bf-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-dzlzw\" (UID: \"cc27f5b0-829a-4c6b-851b-eb6e4cdd53bf\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-dzlzw" Jan 20 18:58:27 crc kubenswrapper[4773]: I0120 18:58:27.681534 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cc27f5b0-829a-4c6b-851b-eb6e4cdd53bf-inventory\") pod 
\"install-os-edpm-deployment-openstack-edpm-ipam-dzlzw\" (UID: \"cc27f5b0-829a-4c6b-851b-eb6e4cdd53bf\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-dzlzw" Jan 20 18:58:27 crc kubenswrapper[4773]: I0120 18:58:27.684970 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cc27f5b0-829a-4c6b-851b-eb6e4cdd53bf-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-dzlzw\" (UID: \"cc27f5b0-829a-4c6b-851b-eb6e4cdd53bf\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-dzlzw" Jan 20 18:58:27 crc kubenswrapper[4773]: I0120 18:58:27.690777 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rp2ft\" (UniqueName: \"kubernetes.io/projected/cc27f5b0-829a-4c6b-851b-eb6e4cdd53bf-kube-api-access-rp2ft\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-dzlzw\" (UID: \"cc27f5b0-829a-4c6b-851b-eb6e4cdd53bf\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-dzlzw" Jan 20 18:58:27 crc kubenswrapper[4773]: I0120 18:58:27.752977 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-dzlzw" Jan 20 18:58:28 crc kubenswrapper[4773]: I0120 18:58:28.237030 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-dzlzw"] Jan 20 18:58:28 crc kubenswrapper[4773]: I0120 18:58:28.348574 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-dzlzw" event={"ID":"cc27f5b0-829a-4c6b-851b-eb6e4cdd53bf","Type":"ContainerStarted","Data":"1497b09b7f36c51b1b23fc5cdc948a4900c4421f77e9b7565a5d483bfe19b9e6"} Jan 20 18:58:29 crc kubenswrapper[4773]: I0120 18:58:29.357276 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-dzlzw" event={"ID":"cc27f5b0-829a-4c6b-851b-eb6e4cdd53bf","Type":"ContainerStarted","Data":"275b7cfb99fe82d1a980bcea4ca6dc36e2c577eb26c2063aabdcc1bc630d378d"} Jan 20 18:58:29 crc kubenswrapper[4773]: I0120 18:58:29.382889 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-dzlzw" podStartSLOduration=1.825537009 podStartE2EDuration="2.382863766s" podCreationTimestamp="2026-01-20 18:58:27 +0000 UTC" firstStartedPulling="2026-01-20 18:58:28.247673418 +0000 UTC m=+1701.169486442" lastFinishedPulling="2026-01-20 18:58:28.805000175 +0000 UTC m=+1701.726813199" observedRunningTime="2026-01-20 18:58:29.374179118 +0000 UTC m=+1702.295992152" watchObservedRunningTime="2026-01-20 18:58:29.382863766 +0000 UTC m=+1702.304676790" Jan 20 18:58:30 crc kubenswrapper[4773]: I0120 18:58:30.447664 4773 scope.go:117] "RemoveContainer" containerID="ce9aed1d163b3887a1d1032f9c03b611a4ecaff0809c53905de6852740ee0630" Jan 20 18:58:30 crc kubenswrapper[4773]: E0120 18:58:30.448372 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 18:58:35 crc kubenswrapper[4773]: I0120 18:58:35.029248 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-29z4h"] Jan 20 18:58:35 crc kubenswrapper[4773]: I0120 18:58:35.036356 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-29z4h"] Jan 20 18:58:35 crc kubenswrapper[4773]: I0120 18:58:35.458460 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aa7530e2-53e5-4891-9a0e-ff23ee1c61bc" path="/var/lib/kubelet/pods/aa7530e2-53e5-4891-9a0e-ff23ee1c61bc/volumes" Jan 20 18:58:41 crc kubenswrapper[4773]: I0120 18:58:41.034397 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-sg7w8"] Jan 20 18:58:41 crc kubenswrapper[4773]: I0120 18:58:41.043318 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-fldlp"] Jan 20 18:58:41 crc kubenswrapper[4773]: I0120 18:58:41.050405 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-5450-account-create-update-m7kr7"] Jan 20 18:58:41 crc kubenswrapper[4773]: I0120 18:58:41.071137 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-jqhz4"] Jan 20 18:58:41 crc kubenswrapper[4773]: I0120 18:58:41.079083 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-sg7w8"] Jan 20 18:58:41 crc kubenswrapper[4773]: I0120 18:58:41.088707 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-5450-account-create-update-m7kr7"] Jan 20 18:58:41 crc kubenswrapper[4773]: I0120 18:58:41.095541 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-fldlp"] 
Jan 20 18:58:41 crc kubenswrapper[4773]: I0120 18:58:41.102730 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-jqhz4"] Jan 20 18:58:41 crc kubenswrapper[4773]: I0120 18:58:41.110166 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-b6ae-account-create-update-xdwz2"] Jan 20 18:58:41 crc kubenswrapper[4773]: I0120 18:58:41.117731 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-b6ae-account-create-update-xdwz2"] Jan 20 18:58:41 crc kubenswrapper[4773]: I0120 18:58:41.124676 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-58b0-account-create-update-n4bl6"] Jan 20 18:58:41 crc kubenswrapper[4773]: I0120 18:58:41.132008 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-58b0-account-create-update-n4bl6"] Jan 20 18:58:41 crc kubenswrapper[4773]: I0120 18:58:41.458398 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="181581ac-d6d3-4700-bfb7-7179a262a27c" path="/var/lib/kubelet/pods/181581ac-d6d3-4700-bfb7-7179a262a27c/volumes" Jan 20 18:58:41 crc kubenswrapper[4773]: I0120 18:58:41.459288 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2544d2a-4467-4356-9aee-21a75f6efedc" path="/var/lib/kubelet/pods/b2544d2a-4467-4356-9aee-21a75f6efedc/volumes" Jan 20 18:58:41 crc kubenswrapper[4773]: I0120 18:58:41.459888 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b313ef44-3ec0-4e2e-bc88-0187cce26783" path="/var/lib/kubelet/pods/b313ef44-3ec0-4e2e-bc88-0187cce26783/volumes" Jan 20 18:58:41 crc kubenswrapper[4773]: I0120 18:58:41.460431 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b742ea09-e1ce-4311-a9bf-7736d3ab235c" path="/var/lib/kubelet/pods/b742ea09-e1ce-4311-a9bf-7736d3ab235c/volumes" Jan 20 18:58:41 crc kubenswrapper[4773]: I0120 18:58:41.461383 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="be215ecb-8014-4db1-8eac-59f0d3dee870" path="/var/lib/kubelet/pods/be215ecb-8014-4db1-8eac-59f0d3dee870/volumes" Jan 20 18:58:41 crc kubenswrapper[4773]: I0120 18:58:41.461853 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d813dade-efd1-404d-ae3f-ecea71ffb5ee" path="/var/lib/kubelet/pods/d813dade-efd1-404d-ae3f-ecea71ffb5ee/volumes" Jan 20 18:58:42 crc kubenswrapper[4773]: I0120 18:58:42.446863 4773 scope.go:117] "RemoveContainer" containerID="ce9aed1d163b3887a1d1032f9c03b611a4ecaff0809c53905de6852740ee0630" Jan 20 18:58:42 crc kubenswrapper[4773]: E0120 18:58:42.447188 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 18:58:46 crc kubenswrapper[4773]: I0120 18:58:46.028570 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-kmlg7"] Jan 20 18:58:46 crc kubenswrapper[4773]: I0120 18:58:46.036578 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-kmlg7"] Jan 20 18:58:47 crc kubenswrapper[4773]: I0120 18:58:47.460707 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49d41a48-da79-4b93-bf84-ab8b94fed1c1" path="/var/lib/kubelet/pods/49d41a48-da79-4b93-bf84-ab8b94fed1c1/volumes" Jan 20 18:58:53 crc kubenswrapper[4773]: I0120 18:58:53.446695 4773 scope.go:117] "RemoveContainer" containerID="ce9aed1d163b3887a1d1032f9c03b611a4ecaff0809c53905de6852740ee0630" Jan 20 18:58:53 crc kubenswrapper[4773]: E0120 18:58:53.447595 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 18:59:05 crc kubenswrapper[4773]: I0120 18:59:05.643890 4773 generic.go:334] "Generic (PLEG): container finished" podID="cc27f5b0-829a-4c6b-851b-eb6e4cdd53bf" containerID="275b7cfb99fe82d1a980bcea4ca6dc36e2c577eb26c2063aabdcc1bc630d378d" exitCode=0 Jan 20 18:59:05 crc kubenswrapper[4773]: I0120 18:59:05.644017 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-dzlzw" event={"ID":"cc27f5b0-829a-4c6b-851b-eb6e4cdd53bf","Type":"ContainerDied","Data":"275b7cfb99fe82d1a980bcea4ca6dc36e2c577eb26c2063aabdcc1bc630d378d"} Jan 20 18:59:07 crc kubenswrapper[4773]: I0120 18:59:07.028354 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-dzlzw" Jan 20 18:59:07 crc kubenswrapper[4773]: I0120 18:59:07.176091 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cc27f5b0-829a-4c6b-851b-eb6e4cdd53bf-ssh-key-openstack-edpm-ipam\") pod \"cc27f5b0-829a-4c6b-851b-eb6e4cdd53bf\" (UID: \"cc27f5b0-829a-4c6b-851b-eb6e4cdd53bf\") " Jan 20 18:59:07 crc kubenswrapper[4773]: I0120 18:59:07.176230 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rp2ft\" (UniqueName: \"kubernetes.io/projected/cc27f5b0-829a-4c6b-851b-eb6e4cdd53bf-kube-api-access-rp2ft\") pod \"cc27f5b0-829a-4c6b-851b-eb6e4cdd53bf\" (UID: \"cc27f5b0-829a-4c6b-851b-eb6e4cdd53bf\") " Jan 20 18:59:07 crc kubenswrapper[4773]: I0120 18:59:07.176317 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" 
(UniqueName: \"kubernetes.io/secret/cc27f5b0-829a-4c6b-851b-eb6e4cdd53bf-inventory\") pod \"cc27f5b0-829a-4c6b-851b-eb6e4cdd53bf\" (UID: \"cc27f5b0-829a-4c6b-851b-eb6e4cdd53bf\") " Jan 20 18:59:07 crc kubenswrapper[4773]: I0120 18:59:07.182235 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc27f5b0-829a-4c6b-851b-eb6e4cdd53bf-kube-api-access-rp2ft" (OuterVolumeSpecName: "kube-api-access-rp2ft") pod "cc27f5b0-829a-4c6b-851b-eb6e4cdd53bf" (UID: "cc27f5b0-829a-4c6b-851b-eb6e4cdd53bf"). InnerVolumeSpecName "kube-api-access-rp2ft". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:59:07 crc kubenswrapper[4773]: I0120 18:59:07.202197 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc27f5b0-829a-4c6b-851b-eb6e4cdd53bf-inventory" (OuterVolumeSpecName: "inventory") pod "cc27f5b0-829a-4c6b-851b-eb6e4cdd53bf" (UID: "cc27f5b0-829a-4c6b-851b-eb6e4cdd53bf"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:59:07 crc kubenswrapper[4773]: I0120 18:59:07.202364 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc27f5b0-829a-4c6b-851b-eb6e4cdd53bf-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "cc27f5b0-829a-4c6b-851b-eb6e4cdd53bf" (UID: "cc27f5b0-829a-4c6b-851b-eb6e4cdd53bf"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:59:07 crc kubenswrapper[4773]: I0120 18:59:07.278332 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rp2ft\" (UniqueName: \"kubernetes.io/projected/cc27f5b0-829a-4c6b-851b-eb6e4cdd53bf-kube-api-access-rp2ft\") on node \"crc\" DevicePath \"\"" Jan 20 18:59:07 crc kubenswrapper[4773]: I0120 18:59:07.278376 4773 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/cc27f5b0-829a-4c6b-851b-eb6e4cdd53bf-inventory\") on node \"crc\" DevicePath \"\"" Jan 20 18:59:07 crc kubenswrapper[4773]: I0120 18:59:07.278395 4773 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/cc27f5b0-829a-4c6b-851b-eb6e4cdd53bf-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 20 18:59:07 crc kubenswrapper[4773]: I0120 18:59:07.453349 4773 scope.go:117] "RemoveContainer" containerID="ce9aed1d163b3887a1d1032f9c03b611a4ecaff0809c53905de6852740ee0630" Jan 20 18:59:07 crc kubenswrapper[4773]: E0120 18:59:07.453590 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 18:59:07 crc kubenswrapper[4773]: I0120 18:59:07.659884 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-dzlzw" event={"ID":"cc27f5b0-829a-4c6b-851b-eb6e4cdd53bf","Type":"ContainerDied","Data":"1497b09b7f36c51b1b23fc5cdc948a4900c4421f77e9b7565a5d483bfe19b9e6"} Jan 20 18:59:07 crc kubenswrapper[4773]: I0120 18:59:07.660249 4773 pod_container_deletor.go:80] "Container 
not found in pod's containers" containerID="1497b09b7f36c51b1b23fc5cdc948a4900c4421f77e9b7565a5d483bfe19b9e6" Jan 20 18:59:07 crc kubenswrapper[4773]: I0120 18:59:07.660029 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-dzlzw" Jan 20 18:59:07 crc kubenswrapper[4773]: I0120 18:59:07.746891 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-q7gtj"] Jan 20 18:59:07 crc kubenswrapper[4773]: E0120 18:59:07.747295 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc27f5b0-829a-4c6b-851b-eb6e4cdd53bf" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Jan 20 18:59:07 crc kubenswrapper[4773]: I0120 18:59:07.747313 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc27f5b0-829a-4c6b-851b-eb6e4cdd53bf" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Jan 20 18:59:07 crc kubenswrapper[4773]: I0120 18:59:07.747475 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc27f5b0-829a-4c6b-851b-eb6e4cdd53bf" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Jan 20 18:59:07 crc kubenswrapper[4773]: I0120 18:59:07.748084 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-q7gtj" Jan 20 18:59:07 crc kubenswrapper[4773]: I0120 18:59:07.749957 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 20 18:59:07 crc kubenswrapper[4773]: I0120 18:59:07.750105 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 20 18:59:07 crc kubenswrapper[4773]: I0120 18:59:07.750342 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 20 18:59:07 crc kubenswrapper[4773]: I0120 18:59:07.750342 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rzxsv" Jan 20 18:59:07 crc kubenswrapper[4773]: I0120 18:59:07.758225 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-q7gtj"] Jan 20 18:59:07 crc kubenswrapper[4773]: I0120 18:59:07.790654 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4be003e8-2c0f-45c8-944d-b126c8cbd1b0-ssh-key-openstack-edpm-ipam\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-q7gtj\" (UID: \"4be003e8-2c0f-45c8-944d-b126c8cbd1b0\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-q7gtj" Jan 20 18:59:07 crc kubenswrapper[4773]: I0120 18:59:07.790748 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4tj48\" (UniqueName: \"kubernetes.io/projected/4be003e8-2c0f-45c8-944d-b126c8cbd1b0-kube-api-access-4tj48\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-q7gtj\" (UID: \"4be003e8-2c0f-45c8-944d-b126c8cbd1b0\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-q7gtj" Jan 20 18:59:07 crc 
kubenswrapper[4773]: I0120 18:59:07.790863 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4be003e8-2c0f-45c8-944d-b126c8cbd1b0-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-q7gtj\" (UID: \"4be003e8-2c0f-45c8-944d-b126c8cbd1b0\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-q7gtj" Jan 20 18:59:07 crc kubenswrapper[4773]: I0120 18:59:07.892305 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4be003e8-2c0f-45c8-944d-b126c8cbd1b0-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-q7gtj\" (UID: \"4be003e8-2c0f-45c8-944d-b126c8cbd1b0\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-q7gtj" Jan 20 18:59:07 crc kubenswrapper[4773]: I0120 18:59:07.892448 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4be003e8-2c0f-45c8-944d-b126c8cbd1b0-ssh-key-openstack-edpm-ipam\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-q7gtj\" (UID: \"4be003e8-2c0f-45c8-944d-b126c8cbd1b0\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-q7gtj" Jan 20 18:59:07 crc kubenswrapper[4773]: I0120 18:59:07.892484 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4tj48\" (UniqueName: \"kubernetes.io/projected/4be003e8-2c0f-45c8-944d-b126c8cbd1b0-kube-api-access-4tj48\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-q7gtj\" (UID: \"4be003e8-2c0f-45c8-944d-b126c8cbd1b0\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-q7gtj" Jan 20 18:59:07 crc kubenswrapper[4773]: I0120 18:59:07.896887 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/4be003e8-2c0f-45c8-944d-b126c8cbd1b0-ssh-key-openstack-edpm-ipam\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-q7gtj\" (UID: \"4be003e8-2c0f-45c8-944d-b126c8cbd1b0\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-q7gtj" Jan 20 18:59:07 crc kubenswrapper[4773]: I0120 18:59:07.901635 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4be003e8-2c0f-45c8-944d-b126c8cbd1b0-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-q7gtj\" (UID: \"4be003e8-2c0f-45c8-944d-b126c8cbd1b0\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-q7gtj" Jan 20 18:59:07 crc kubenswrapper[4773]: I0120 18:59:07.911045 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4tj48\" (UniqueName: \"kubernetes.io/projected/4be003e8-2c0f-45c8-944d-b126c8cbd1b0-kube-api-access-4tj48\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-q7gtj\" (UID: \"4be003e8-2c0f-45c8-944d-b126c8cbd1b0\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-q7gtj" Jan 20 18:59:08 crc kubenswrapper[4773]: I0120 18:59:08.063785 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-q7gtj" Jan 20 18:59:08 crc kubenswrapper[4773]: I0120 18:59:08.565474 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-q7gtj"] Jan 20 18:59:08 crc kubenswrapper[4773]: W0120 18:59:08.571243 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4be003e8_2c0f_45c8_944d_b126c8cbd1b0.slice/crio-d4bda17a1cf7ed048fdc09f93d0aedd21cd6485b98d1a853f457bf4a4b19fb3e WatchSource:0}: Error finding container d4bda17a1cf7ed048fdc09f93d0aedd21cd6485b98d1a853f457bf4a4b19fb3e: Status 404 returned error can't find the container with id d4bda17a1cf7ed048fdc09f93d0aedd21cd6485b98d1a853f457bf4a4b19fb3e Jan 20 18:59:08 crc kubenswrapper[4773]: I0120 18:59:08.668147 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-q7gtj" event={"ID":"4be003e8-2c0f-45c8-944d-b126c8cbd1b0","Type":"ContainerStarted","Data":"d4bda17a1cf7ed048fdc09f93d0aedd21cd6485b98d1a853f457bf4a4b19fb3e"} Jan 20 18:59:09 crc kubenswrapper[4773]: I0120 18:59:09.676567 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-q7gtj" event={"ID":"4be003e8-2c0f-45c8-944d-b126c8cbd1b0","Type":"ContainerStarted","Data":"d2a63070e513c4ff5a517b705dbd2671d5a58b1f4c876fcd619b5653b8ff1cdb"} Jan 20 18:59:09 crc kubenswrapper[4773]: I0120 18:59:09.695073 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-q7gtj" podStartSLOduration=2.13248792 podStartE2EDuration="2.695053573s" podCreationTimestamp="2026-01-20 18:59:07 +0000 UTC" firstStartedPulling="2026-01-20 18:59:08.5746347 +0000 UTC m=+1741.496447764" lastFinishedPulling="2026-01-20 18:59:09.137200373 +0000 UTC m=+1742.059013417" 
observedRunningTime="2026-01-20 18:59:09.687944473 +0000 UTC m=+1742.609757517" watchObservedRunningTime="2026-01-20 18:59:09.695053573 +0000 UTC m=+1742.616866597" Jan 20 18:59:13 crc kubenswrapper[4773]: E0120 18:59:13.220048 4773 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcc27f5b0_829a_4c6b_851b_eb6e4cdd53bf.slice\": RecentStats: unable to find data in memory cache]" Jan 20 18:59:13 crc kubenswrapper[4773]: I0120 18:59:13.730525 4773 generic.go:334] "Generic (PLEG): container finished" podID="4be003e8-2c0f-45c8-944d-b126c8cbd1b0" containerID="d2a63070e513c4ff5a517b705dbd2671d5a58b1f4c876fcd619b5653b8ff1cdb" exitCode=0 Jan 20 18:59:13 crc kubenswrapper[4773]: I0120 18:59:13.730612 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-q7gtj" event={"ID":"4be003e8-2c0f-45c8-944d-b126c8cbd1b0","Type":"ContainerDied","Data":"d2a63070e513c4ff5a517b705dbd2671d5a58b1f4c876fcd619b5653b8ff1cdb"} Jan 20 18:59:15 crc kubenswrapper[4773]: I0120 18:59:15.173922 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-q7gtj" Jan 20 18:59:15 crc kubenswrapper[4773]: I0120 18:59:15.328377 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4be003e8-2c0f-45c8-944d-b126c8cbd1b0-ssh-key-openstack-edpm-ipam\") pod \"4be003e8-2c0f-45c8-944d-b126c8cbd1b0\" (UID: \"4be003e8-2c0f-45c8-944d-b126c8cbd1b0\") " Jan 20 18:59:15 crc kubenswrapper[4773]: I0120 18:59:15.328465 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4be003e8-2c0f-45c8-944d-b126c8cbd1b0-inventory\") pod \"4be003e8-2c0f-45c8-944d-b126c8cbd1b0\" (UID: \"4be003e8-2c0f-45c8-944d-b126c8cbd1b0\") " Jan 20 18:59:15 crc kubenswrapper[4773]: I0120 18:59:15.328565 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4tj48\" (UniqueName: \"kubernetes.io/projected/4be003e8-2c0f-45c8-944d-b126c8cbd1b0-kube-api-access-4tj48\") pod \"4be003e8-2c0f-45c8-944d-b126c8cbd1b0\" (UID: \"4be003e8-2c0f-45c8-944d-b126c8cbd1b0\") " Jan 20 18:59:15 crc kubenswrapper[4773]: I0120 18:59:15.333823 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4be003e8-2c0f-45c8-944d-b126c8cbd1b0-kube-api-access-4tj48" (OuterVolumeSpecName: "kube-api-access-4tj48") pod "4be003e8-2c0f-45c8-944d-b126c8cbd1b0" (UID: "4be003e8-2c0f-45c8-944d-b126c8cbd1b0"). InnerVolumeSpecName "kube-api-access-4tj48". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 18:59:15 crc kubenswrapper[4773]: I0120 18:59:15.353844 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4be003e8-2c0f-45c8-944d-b126c8cbd1b0-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "4be003e8-2c0f-45c8-944d-b126c8cbd1b0" (UID: "4be003e8-2c0f-45c8-944d-b126c8cbd1b0"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:59:15 crc kubenswrapper[4773]: I0120 18:59:15.355366 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4be003e8-2c0f-45c8-944d-b126c8cbd1b0-inventory" (OuterVolumeSpecName: "inventory") pod "4be003e8-2c0f-45c8-944d-b126c8cbd1b0" (UID: "4be003e8-2c0f-45c8-944d-b126c8cbd1b0"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 18:59:15 crc kubenswrapper[4773]: I0120 18:59:15.430810 4773 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4be003e8-2c0f-45c8-944d-b126c8cbd1b0-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 20 18:59:15 crc kubenswrapper[4773]: I0120 18:59:15.430853 4773 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/4be003e8-2c0f-45c8-944d-b126c8cbd1b0-inventory\") on node \"crc\" DevicePath \"\"" Jan 20 18:59:15 crc kubenswrapper[4773]: I0120 18:59:15.430862 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4tj48\" (UniqueName: \"kubernetes.io/projected/4be003e8-2c0f-45c8-944d-b126c8cbd1b0-kube-api-access-4tj48\") on node \"crc\" DevicePath \"\"" Jan 20 18:59:15 crc kubenswrapper[4773]: I0120 18:59:15.768319 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-q7gtj" 
event={"ID":"4be003e8-2c0f-45c8-944d-b126c8cbd1b0","Type":"ContainerDied","Data":"d4bda17a1cf7ed048fdc09f93d0aedd21cd6485b98d1a853f457bf4a4b19fb3e"} Jan 20 18:59:15 crc kubenswrapper[4773]: I0120 18:59:15.768366 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d4bda17a1cf7ed048fdc09f93d0aedd21cd6485b98d1a853f457bf4a4b19fb3e" Jan 20 18:59:15 crc kubenswrapper[4773]: I0120 18:59:15.768460 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-q7gtj" Jan 20 18:59:16 crc kubenswrapper[4773]: I0120 18:59:16.111846 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7cmjw"] Jan 20 18:59:16 crc kubenswrapper[4773]: E0120 18:59:16.112629 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4be003e8-2c0f-45c8-944d-b126c8cbd1b0" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Jan 20 18:59:16 crc kubenswrapper[4773]: I0120 18:59:16.112654 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="4be003e8-2c0f-45c8-944d-b126c8cbd1b0" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Jan 20 18:59:16 crc kubenswrapper[4773]: I0120 18:59:16.112855 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="4be003e8-2c0f-45c8-944d-b126c8cbd1b0" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Jan 20 18:59:16 crc kubenswrapper[4773]: I0120 18:59:16.113621 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7cmjw" Jan 20 18:59:16 crc kubenswrapper[4773]: I0120 18:59:16.116841 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 20 18:59:16 crc kubenswrapper[4773]: I0120 18:59:16.117030 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rzxsv" Jan 20 18:59:16 crc kubenswrapper[4773]: I0120 18:59:16.119892 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 20 18:59:16 crc kubenswrapper[4773]: I0120 18:59:16.121572 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 20 18:59:16 crc kubenswrapper[4773]: I0120 18:59:16.125857 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7cmjw"] Jan 20 18:59:16 crc kubenswrapper[4773]: I0120 18:59:16.142983 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zhtkx\" (UniqueName: \"kubernetes.io/projected/5f64745d-73ee-4219-b71f-b08d15f94f68-kube-api-access-zhtkx\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-7cmjw\" (UID: \"5f64745d-73ee-4219-b71f-b08d15f94f68\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7cmjw" Jan 20 18:59:16 crc kubenswrapper[4773]: I0120 18:59:16.143298 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5f64745d-73ee-4219-b71f-b08d15f94f68-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-7cmjw\" (UID: \"5f64745d-73ee-4219-b71f-b08d15f94f68\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7cmjw" Jan 20 18:59:16 crc kubenswrapper[4773]: I0120 18:59:16.143389 4773 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5f64745d-73ee-4219-b71f-b08d15f94f68-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-7cmjw\" (UID: \"5f64745d-73ee-4219-b71f-b08d15f94f68\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7cmjw" Jan 20 18:59:16 crc kubenswrapper[4773]: I0120 18:59:16.245388 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5f64745d-73ee-4219-b71f-b08d15f94f68-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-7cmjw\" (UID: \"5f64745d-73ee-4219-b71f-b08d15f94f68\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7cmjw" Jan 20 18:59:16 crc kubenswrapper[4773]: I0120 18:59:16.245443 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5f64745d-73ee-4219-b71f-b08d15f94f68-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-7cmjw\" (UID: \"5f64745d-73ee-4219-b71f-b08d15f94f68\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7cmjw" Jan 20 18:59:16 crc kubenswrapper[4773]: I0120 18:59:16.245558 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zhtkx\" (UniqueName: \"kubernetes.io/projected/5f64745d-73ee-4219-b71f-b08d15f94f68-kube-api-access-zhtkx\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-7cmjw\" (UID: \"5f64745d-73ee-4219-b71f-b08d15f94f68\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7cmjw" Jan 20 18:59:16 crc kubenswrapper[4773]: I0120 18:59:16.251661 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5f64745d-73ee-4219-b71f-b08d15f94f68-inventory\") 
pod \"configure-os-edpm-deployment-openstack-edpm-ipam-7cmjw\" (UID: \"5f64745d-73ee-4219-b71f-b08d15f94f68\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7cmjw" Jan 20 18:59:16 crc kubenswrapper[4773]: I0120 18:59:16.251674 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5f64745d-73ee-4219-b71f-b08d15f94f68-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-7cmjw\" (UID: \"5f64745d-73ee-4219-b71f-b08d15f94f68\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7cmjw" Jan 20 18:59:16 crc kubenswrapper[4773]: I0120 18:59:16.264982 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zhtkx\" (UniqueName: \"kubernetes.io/projected/5f64745d-73ee-4219-b71f-b08d15f94f68-kube-api-access-zhtkx\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-7cmjw\" (UID: \"5f64745d-73ee-4219-b71f-b08d15f94f68\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7cmjw" Jan 20 18:59:16 crc kubenswrapper[4773]: I0120 18:59:16.433503 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7cmjw" Jan 20 18:59:16 crc kubenswrapper[4773]: I0120 18:59:16.907325 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7cmjw"] Jan 20 18:59:17 crc kubenswrapper[4773]: I0120 18:59:17.790226 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7cmjw" event={"ID":"5f64745d-73ee-4219-b71f-b08d15f94f68","Type":"ContainerStarted","Data":"c8f7b97714b45d74f4b227ab1c61259d6940c0dff27c49881ec4be2e3c0092a6"} Jan 20 18:59:17 crc kubenswrapper[4773]: I0120 18:59:17.790602 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7cmjw" event={"ID":"5f64745d-73ee-4219-b71f-b08d15f94f68","Type":"ContainerStarted","Data":"0b6c753b91a3baa36eafa63cc8700d1d3a75165653080a4f8e99b47ca4a5d1da"} Jan 20 18:59:18 crc kubenswrapper[4773]: I0120 18:59:18.436245 4773 scope.go:117] "RemoveContainer" containerID="c1e147f31c77c26d0387a0f9416a73c8299ce902514749792047df9c2fed6c5d" Jan 20 18:59:18 crc kubenswrapper[4773]: I0120 18:59:18.449109 4773 scope.go:117] "RemoveContainer" containerID="ce9aed1d163b3887a1d1032f9c03b611a4ecaff0809c53905de6852740ee0630" Jan 20 18:59:18 crc kubenswrapper[4773]: E0120 18:59:18.449400 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 18:59:18 crc kubenswrapper[4773]: I0120 18:59:18.458981 4773 scope.go:117] "RemoveContainer" containerID="80a1ac29f0f0bb537005b053ab8c7c220780e4401eea32b234572cee7902d616" 
Jan 20 18:59:18 crc kubenswrapper[4773]: I0120 18:59:18.531827 4773 scope.go:117] "RemoveContainer" containerID="7ac50ea7174c7d5687e310f246288e88881d9a99f2dc7333211966358f9e13de" Jan 20 18:59:18 crc kubenswrapper[4773]: I0120 18:59:18.551049 4773 scope.go:117] "RemoveContainer" containerID="17b8b7c8cb845ba0251348f763aac9652f97d99f1d3fb0947416ad8e58f06104" Jan 20 18:59:18 crc kubenswrapper[4773]: I0120 18:59:18.566958 4773 scope.go:117] "RemoveContainer" containerID="19b2a1461c2e62cae82675b27637cd9300c36d95cb1554d31376193faaa94e3d" Jan 20 18:59:18 crc kubenswrapper[4773]: I0120 18:59:18.624340 4773 scope.go:117] "RemoveContainer" containerID="a3323f05de08ce0588343432edfe259822e3b8065981e020292a5a4f0e1cd649" Jan 20 18:59:18 crc kubenswrapper[4773]: I0120 18:59:18.689539 4773 scope.go:117] "RemoveContainer" containerID="59d2c461099d25c608c6562b9d212406fcc710ae864054a0764c29095622613a" Jan 20 18:59:18 crc kubenswrapper[4773]: I0120 18:59:18.705832 4773 scope.go:117] "RemoveContainer" containerID="3fe035cd5db85387fabc74e84605df50a425cce1a8ad3c3850fcd55fb4b1eaa6" Jan 20 18:59:18 crc kubenswrapper[4773]: I0120 18:59:18.747026 4773 scope.go:117] "RemoveContainer" containerID="a38683b5fdc91e39029015853b4b8d0f6b2a61a23721dce3500ed6e1d8bf2c84" Jan 20 18:59:21 crc kubenswrapper[4773]: I0120 18:59:21.032811 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7cmjw" podStartSLOduration=5.392668197 podStartE2EDuration="6.032793948s" podCreationTimestamp="2026-01-20 18:59:15 +0000 UTC" firstStartedPulling="2026-01-20 18:59:16.919513665 +0000 UTC m=+1749.841326689" lastFinishedPulling="2026-01-20 18:59:17.559639416 +0000 UTC m=+1750.481452440" observedRunningTime="2026-01-20 18:59:17.809574912 +0000 UTC m=+1750.731387956" watchObservedRunningTime="2026-01-20 18:59:21.032793948 +0000 UTC m=+1753.954606972" Jan 20 18:59:21 crc kubenswrapper[4773]: I0120 18:59:21.039910 4773 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/neutron-db-sync-rz89h"] Jan 20 18:59:21 crc kubenswrapper[4773]: I0120 18:59:21.046557 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-rz89h"] Jan 20 18:59:21 crc kubenswrapper[4773]: I0120 18:59:21.456972 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec857182-f4b2-46cd-8b7f-fdbc443d8a1a" path="/var/lib/kubelet/pods/ec857182-f4b2-46cd-8b7f-fdbc443d8a1a/volumes" Jan 20 18:59:23 crc kubenswrapper[4773]: E0120 18:59:23.418974 4773 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcc27f5b0_829a_4c6b_851b_eb6e4cdd53bf.slice\": RecentStats: unable to find data in memory cache]" Jan 20 18:59:27 crc kubenswrapper[4773]: I0120 18:59:27.028333 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-mhdc2"] Jan 20 18:59:27 crc kubenswrapper[4773]: I0120 18:59:27.038733 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-49f25"] Jan 20 18:59:27 crc kubenswrapper[4773]: I0120 18:59:27.048601 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-mhdc2"] Jan 20 18:59:27 crc kubenswrapper[4773]: I0120 18:59:27.060851 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-49f25"] Jan 20 18:59:27 crc kubenswrapper[4773]: I0120 18:59:27.460557 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0158a06a-bb30-4d75-904f-90a4c6307fd6" path="/var/lib/kubelet/pods/0158a06a-bb30-4d75-904f-90a4c6307fd6/volumes" Jan 20 18:59:27 crc kubenswrapper[4773]: I0120 18:59:27.461775 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17ca6753-a956-4078-8927-2f2a6c41cb80" path="/var/lib/kubelet/pods/17ca6753-a956-4078-8927-2f2a6c41cb80/volumes" Jan 20 18:59:30 crc kubenswrapper[4773]: I0120 
18:59:30.447243 4773 scope.go:117] "RemoveContainer" containerID="ce9aed1d163b3887a1d1032f9c03b611a4ecaff0809c53905de6852740ee0630" Jan 20 18:59:30 crc kubenswrapper[4773]: E0120 18:59:30.448676 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 18:59:33 crc kubenswrapper[4773]: E0120 18:59:33.655786 4773 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcc27f5b0_829a_4c6b_851b_eb6e4cdd53bf.slice\": RecentStats: unable to find data in memory cache]" Jan 20 18:59:38 crc kubenswrapper[4773]: I0120 18:59:38.035680 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-z8p6p"] Jan 20 18:59:38 crc kubenswrapper[4773]: I0120 18:59:38.047698 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-z8p6p"] Jan 20 18:59:39 crc kubenswrapper[4773]: I0120 18:59:39.031211 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-22fkv"] Jan 20 18:59:39 crc kubenswrapper[4773]: I0120 18:59:39.040364 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-22fkv"] Jan 20 18:59:39 crc kubenswrapper[4773]: I0120 18:59:39.459149 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b" path="/var/lib/kubelet/pods/3afc66ba-ab48-4c4f-bc4a-86d6ef34d00b/volumes" Jan 20 18:59:39 crc kubenswrapper[4773]: I0120 18:59:39.460271 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="d9eee838-721f-48cc-a5aa-37644a62d846" path="/var/lib/kubelet/pods/d9eee838-721f-48cc-a5aa-37644a62d846/volumes" Jan 20 18:59:41 crc kubenswrapper[4773]: I0120 18:59:41.447290 4773 scope.go:117] "RemoveContainer" containerID="ce9aed1d163b3887a1d1032f9c03b611a4ecaff0809c53905de6852740ee0630" Jan 20 18:59:41 crc kubenswrapper[4773]: E0120 18:59:41.447819 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 18:59:43 crc kubenswrapper[4773]: E0120 18:59:43.877691 4773 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcc27f5b0_829a_4c6b_851b_eb6e4cdd53bf.slice\": RecentStats: unable to find data in memory cache]" Jan 20 18:59:54 crc kubenswrapper[4773]: E0120 18:59:54.075622 4773 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcc27f5b0_829a_4c6b_851b_eb6e4cdd53bf.slice\": RecentStats: unable to find data in memory cache]" Jan 20 18:59:54 crc kubenswrapper[4773]: I0120 18:59:54.447589 4773 scope.go:117] "RemoveContainer" containerID="ce9aed1d163b3887a1d1032f9c03b611a4ecaff0809c53905de6852740ee0630" Jan 20 18:59:54 crc kubenswrapper[4773]: E0120 18:59:54.448204 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:00:00 crc kubenswrapper[4773]: I0120 19:00:00.148668 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29482260-gckfp"] Jan 20 19:00:00 crc kubenswrapper[4773]: I0120 19:00:00.150246 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29482260-gckfp" Jan 20 19:00:00 crc kubenswrapper[4773]: I0120 19:00:00.152899 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 20 19:00:00 crc kubenswrapper[4773]: I0120 19:00:00.152914 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 20 19:00:00 crc kubenswrapper[4773]: I0120 19:00:00.165206 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29482260-gckfp"] Jan 20 19:00:00 crc kubenswrapper[4773]: I0120 19:00:00.282921 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/48dbb315-9da1-4b84-9a8e-86448b7ce2bf-config-volume\") pod \"collect-profiles-29482260-gckfp\" (UID: \"48dbb315-9da1-4b84-9a8e-86448b7ce2bf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482260-gckfp" Jan 20 19:00:00 crc kubenswrapper[4773]: I0120 19:00:00.283052 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btzj2\" (UniqueName: \"kubernetes.io/projected/48dbb315-9da1-4b84-9a8e-86448b7ce2bf-kube-api-access-btzj2\") pod \"collect-profiles-29482260-gckfp\" (UID: 
\"48dbb315-9da1-4b84-9a8e-86448b7ce2bf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482260-gckfp" Jan 20 19:00:00 crc kubenswrapper[4773]: I0120 19:00:00.283155 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/48dbb315-9da1-4b84-9a8e-86448b7ce2bf-secret-volume\") pod \"collect-profiles-29482260-gckfp\" (UID: \"48dbb315-9da1-4b84-9a8e-86448b7ce2bf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482260-gckfp" Jan 20 19:00:00 crc kubenswrapper[4773]: I0120 19:00:00.384788 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/48dbb315-9da1-4b84-9a8e-86448b7ce2bf-config-volume\") pod \"collect-profiles-29482260-gckfp\" (UID: \"48dbb315-9da1-4b84-9a8e-86448b7ce2bf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482260-gckfp" Jan 20 19:00:00 crc kubenswrapper[4773]: I0120 19:00:00.384857 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btzj2\" (UniqueName: \"kubernetes.io/projected/48dbb315-9da1-4b84-9a8e-86448b7ce2bf-kube-api-access-btzj2\") pod \"collect-profiles-29482260-gckfp\" (UID: \"48dbb315-9da1-4b84-9a8e-86448b7ce2bf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482260-gckfp" Jan 20 19:00:00 crc kubenswrapper[4773]: I0120 19:00:00.384954 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/48dbb315-9da1-4b84-9a8e-86448b7ce2bf-secret-volume\") pod \"collect-profiles-29482260-gckfp\" (UID: \"48dbb315-9da1-4b84-9a8e-86448b7ce2bf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482260-gckfp" Jan 20 19:00:00 crc kubenswrapper[4773]: I0120 19:00:00.385841 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/48dbb315-9da1-4b84-9a8e-86448b7ce2bf-config-volume\") pod \"collect-profiles-29482260-gckfp\" (UID: \"48dbb315-9da1-4b84-9a8e-86448b7ce2bf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482260-gckfp" Jan 20 19:00:00 crc kubenswrapper[4773]: I0120 19:00:00.393454 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/48dbb315-9da1-4b84-9a8e-86448b7ce2bf-secret-volume\") pod \"collect-profiles-29482260-gckfp\" (UID: \"48dbb315-9da1-4b84-9a8e-86448b7ce2bf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482260-gckfp" Jan 20 19:00:00 crc kubenswrapper[4773]: I0120 19:00:00.402549 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btzj2\" (UniqueName: \"kubernetes.io/projected/48dbb315-9da1-4b84-9a8e-86448b7ce2bf-kube-api-access-btzj2\") pod \"collect-profiles-29482260-gckfp\" (UID: \"48dbb315-9da1-4b84-9a8e-86448b7ce2bf\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482260-gckfp" Jan 20 19:00:00 crc kubenswrapper[4773]: I0120 19:00:00.521809 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29482260-gckfp" Jan 20 19:00:00 crc kubenswrapper[4773]: I0120 19:00:00.953628 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29482260-gckfp"] Jan 20 19:00:01 crc kubenswrapper[4773]: I0120 19:00:01.162476 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29482260-gckfp" event={"ID":"48dbb315-9da1-4b84-9a8e-86448b7ce2bf","Type":"ContainerStarted","Data":"c5b2b46bfd57b06bfe31b87e36761ea26d02a9b817e679d1c4c790f56bcd1486"} Jan 20 19:00:01 crc kubenswrapper[4773]: I0120 19:00:01.162786 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29482260-gckfp" event={"ID":"48dbb315-9da1-4b84-9a8e-86448b7ce2bf","Type":"ContainerStarted","Data":"f4f949c7dc9bc44e44b29566d181d37819bd618f88c494896b787e0da39ec321"} Jan 20 19:00:01 crc kubenswrapper[4773]: I0120 19:00:01.180820 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29482260-gckfp" podStartSLOduration=1.180800462 podStartE2EDuration="1.180800462s" podCreationTimestamp="2026-01-20 19:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 19:00:01.176920958 +0000 UTC m=+1794.098733982" watchObservedRunningTime="2026-01-20 19:00:01.180800462 +0000 UTC m=+1794.102613506" Jan 20 19:00:02 crc kubenswrapper[4773]: I0120 19:00:02.184739 4773 generic.go:334] "Generic (PLEG): container finished" podID="48dbb315-9da1-4b84-9a8e-86448b7ce2bf" containerID="c5b2b46bfd57b06bfe31b87e36761ea26d02a9b817e679d1c4c790f56bcd1486" exitCode=0 Jan 20 19:00:02 crc kubenswrapper[4773]: I0120 19:00:02.184806 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29482260-gckfp" event={"ID":"48dbb315-9da1-4b84-9a8e-86448b7ce2bf","Type":"ContainerDied","Data":"c5b2b46bfd57b06bfe31b87e36761ea26d02a9b817e679d1c4c790f56bcd1486"} Jan 20 19:00:03 crc kubenswrapper[4773]: I0120 19:00:03.496478 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29482260-gckfp" Jan 20 19:00:03 crc kubenswrapper[4773]: I0120 19:00:03.647378 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/48dbb315-9da1-4b84-9a8e-86448b7ce2bf-config-volume\") pod \"48dbb315-9da1-4b84-9a8e-86448b7ce2bf\" (UID: \"48dbb315-9da1-4b84-9a8e-86448b7ce2bf\") " Jan 20 19:00:03 crc kubenswrapper[4773]: I0120 19:00:03.647689 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/48dbb315-9da1-4b84-9a8e-86448b7ce2bf-secret-volume\") pod \"48dbb315-9da1-4b84-9a8e-86448b7ce2bf\" (UID: \"48dbb315-9da1-4b84-9a8e-86448b7ce2bf\") " Jan 20 19:00:03 crc kubenswrapper[4773]: I0120 19:00:03.647721 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-btzj2\" (UniqueName: \"kubernetes.io/projected/48dbb315-9da1-4b84-9a8e-86448b7ce2bf-kube-api-access-btzj2\") pod \"48dbb315-9da1-4b84-9a8e-86448b7ce2bf\" (UID: \"48dbb315-9da1-4b84-9a8e-86448b7ce2bf\") " Jan 20 19:00:03 crc kubenswrapper[4773]: I0120 19:00:03.648097 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/48dbb315-9da1-4b84-9a8e-86448b7ce2bf-config-volume" (OuterVolumeSpecName: "config-volume") pod "48dbb315-9da1-4b84-9a8e-86448b7ce2bf" (UID: "48dbb315-9da1-4b84-9a8e-86448b7ce2bf"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 19:00:03 crc kubenswrapper[4773]: I0120 19:00:03.648433 4773 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/48dbb315-9da1-4b84-9a8e-86448b7ce2bf-config-volume\") on node \"crc\" DevicePath \"\"" Jan 20 19:00:03 crc kubenswrapper[4773]: I0120 19:00:03.653455 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/48dbb315-9da1-4b84-9a8e-86448b7ce2bf-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "48dbb315-9da1-4b84-9a8e-86448b7ce2bf" (UID: "48dbb315-9da1-4b84-9a8e-86448b7ce2bf"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:00:03 crc kubenswrapper[4773]: I0120 19:00:03.653511 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/48dbb315-9da1-4b84-9a8e-86448b7ce2bf-kube-api-access-btzj2" (OuterVolumeSpecName: "kube-api-access-btzj2") pod "48dbb315-9da1-4b84-9a8e-86448b7ce2bf" (UID: "48dbb315-9da1-4b84-9a8e-86448b7ce2bf"). InnerVolumeSpecName "kube-api-access-btzj2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 19:00:03 crc kubenswrapper[4773]: I0120 19:00:03.749881 4773 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/48dbb315-9da1-4b84-9a8e-86448b7ce2bf-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 20 19:00:03 crc kubenswrapper[4773]: I0120 19:00:03.749951 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-btzj2\" (UniqueName: \"kubernetes.io/projected/48dbb315-9da1-4b84-9a8e-86448b7ce2bf-kube-api-access-btzj2\") on node \"crc\" DevicePath \"\"" Jan 20 19:00:04 crc kubenswrapper[4773]: I0120 19:00:04.207439 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29482260-gckfp" event={"ID":"48dbb315-9da1-4b84-9a8e-86448b7ce2bf","Type":"ContainerDied","Data":"f4f949c7dc9bc44e44b29566d181d37819bd618f88c494896b787e0da39ec321"} Jan 20 19:00:04 crc kubenswrapper[4773]: I0120 19:00:04.207502 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f4f949c7dc9bc44e44b29566d181d37819bd618f88c494896b787e0da39ec321" Jan 20 19:00:04 crc kubenswrapper[4773]: I0120 19:00:04.207511 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29482260-gckfp" Jan 20 19:00:04 crc kubenswrapper[4773]: E0120 19:00:04.310771 4773 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcc27f5b0_829a_4c6b_851b_eb6e4cdd53bf.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48dbb315_9da1_4b84_9a8e_86448b7ce2bf.slice\": RecentStats: unable to find data in memory cache]" Jan 20 19:00:05 crc kubenswrapper[4773]: I0120 19:00:05.447231 4773 scope.go:117] "RemoveContainer" containerID="ce9aed1d163b3887a1d1032f9c03b611a4ecaff0809c53905de6852740ee0630" Jan 20 19:00:05 crc kubenswrapper[4773]: E0120 19:00:05.447765 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:00:06 crc kubenswrapper[4773]: I0120 19:00:06.229528 4773 generic.go:334] "Generic (PLEG): container finished" podID="5f64745d-73ee-4219-b71f-b08d15f94f68" containerID="c8f7b97714b45d74f4b227ab1c61259d6940c0dff27c49881ec4be2e3c0092a6" exitCode=0 Jan 20 19:00:06 crc kubenswrapper[4773]: I0120 19:00:06.229609 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7cmjw" event={"ID":"5f64745d-73ee-4219-b71f-b08d15f94f68","Type":"ContainerDied","Data":"c8f7b97714b45d74f4b227ab1c61259d6940c0dff27c49881ec4be2e3c0092a6"} Jan 20 19:00:07 crc kubenswrapper[4773]: E0120 19:00:07.435908 4773 info.go:109] Failed to get network devices: open 
/sys/class/net/0b6c753b91a3baa/address: no such file or directory Jan 20 19:00:07 crc kubenswrapper[4773]: I0120 19:00:07.602683 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7cmjw" Jan 20 19:00:07 crc kubenswrapper[4773]: I0120 19:00:07.722198 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5f64745d-73ee-4219-b71f-b08d15f94f68-ssh-key-openstack-edpm-ipam\") pod \"5f64745d-73ee-4219-b71f-b08d15f94f68\" (UID: \"5f64745d-73ee-4219-b71f-b08d15f94f68\") " Jan 20 19:00:07 crc kubenswrapper[4773]: I0120 19:00:07.722278 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5f64745d-73ee-4219-b71f-b08d15f94f68-inventory\") pod \"5f64745d-73ee-4219-b71f-b08d15f94f68\" (UID: \"5f64745d-73ee-4219-b71f-b08d15f94f68\") " Jan 20 19:00:07 crc kubenswrapper[4773]: I0120 19:00:07.722309 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zhtkx\" (UniqueName: \"kubernetes.io/projected/5f64745d-73ee-4219-b71f-b08d15f94f68-kube-api-access-zhtkx\") pod \"5f64745d-73ee-4219-b71f-b08d15f94f68\" (UID: \"5f64745d-73ee-4219-b71f-b08d15f94f68\") " Jan 20 19:00:07 crc kubenswrapper[4773]: I0120 19:00:07.728709 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f64745d-73ee-4219-b71f-b08d15f94f68-kube-api-access-zhtkx" (OuterVolumeSpecName: "kube-api-access-zhtkx") pod "5f64745d-73ee-4219-b71f-b08d15f94f68" (UID: "5f64745d-73ee-4219-b71f-b08d15f94f68"). InnerVolumeSpecName "kube-api-access-zhtkx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 19:00:07 crc kubenswrapper[4773]: I0120 19:00:07.746104 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f64745d-73ee-4219-b71f-b08d15f94f68-inventory" (OuterVolumeSpecName: "inventory") pod "5f64745d-73ee-4219-b71f-b08d15f94f68" (UID: "5f64745d-73ee-4219-b71f-b08d15f94f68"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:00:07 crc kubenswrapper[4773]: I0120 19:00:07.748229 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f64745d-73ee-4219-b71f-b08d15f94f68-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "5f64745d-73ee-4219-b71f-b08d15f94f68" (UID: "5f64745d-73ee-4219-b71f-b08d15f94f68"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:00:07 crc kubenswrapper[4773]: I0120 19:00:07.824499 4773 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5f64745d-73ee-4219-b71f-b08d15f94f68-inventory\") on node \"crc\" DevicePath \"\"" Jan 20 19:00:07 crc kubenswrapper[4773]: I0120 19:00:07.824536 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zhtkx\" (UniqueName: \"kubernetes.io/projected/5f64745d-73ee-4219-b71f-b08d15f94f68-kube-api-access-zhtkx\") on node \"crc\" DevicePath \"\"" Jan 20 19:00:07 crc kubenswrapper[4773]: I0120 19:00:07.824549 4773 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5f64745d-73ee-4219-b71f-b08d15f94f68-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 20 19:00:08 crc kubenswrapper[4773]: I0120 19:00:08.250335 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7cmjw" 
event={"ID":"5f64745d-73ee-4219-b71f-b08d15f94f68","Type":"ContainerDied","Data":"0b6c753b91a3baa36eafa63cc8700d1d3a75165653080a4f8e99b47ca4a5d1da"} Jan 20 19:00:08 crc kubenswrapper[4773]: I0120 19:00:08.250379 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0b6c753b91a3baa36eafa63cc8700d1d3a75165653080a4f8e99b47ca4a5d1da" Jan 20 19:00:08 crc kubenswrapper[4773]: I0120 19:00:08.250416 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7cmjw" Jan 20 19:00:08 crc kubenswrapper[4773]: I0120 19:00:08.332288 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-jhxwd"] Jan 20 19:00:08 crc kubenswrapper[4773]: E0120 19:00:08.332730 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48dbb315-9da1-4b84-9a8e-86448b7ce2bf" containerName="collect-profiles" Jan 20 19:00:08 crc kubenswrapper[4773]: I0120 19:00:08.332747 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="48dbb315-9da1-4b84-9a8e-86448b7ce2bf" containerName="collect-profiles" Jan 20 19:00:08 crc kubenswrapper[4773]: E0120 19:00:08.332772 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f64745d-73ee-4219-b71f-b08d15f94f68" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Jan 20 19:00:08 crc kubenswrapper[4773]: I0120 19:00:08.332780 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f64745d-73ee-4219-b71f-b08d15f94f68" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Jan 20 19:00:08 crc kubenswrapper[4773]: I0120 19:00:08.332987 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f64745d-73ee-4219-b71f-b08d15f94f68" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Jan 20 19:00:08 crc kubenswrapper[4773]: I0120 19:00:08.333008 4773 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="48dbb315-9da1-4b84-9a8e-86448b7ce2bf" containerName="collect-profiles" Jan 20 19:00:08 crc kubenswrapper[4773]: I0120 19:00:08.333566 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-jhxwd" Jan 20 19:00:08 crc kubenswrapper[4773]: I0120 19:00:08.339223 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 20 19:00:08 crc kubenswrapper[4773]: I0120 19:00:08.339228 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 20 19:00:08 crc kubenswrapper[4773]: I0120 19:00:08.339427 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 20 19:00:08 crc kubenswrapper[4773]: I0120 19:00:08.347676 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-jhxwd"] Jan 20 19:00:08 crc kubenswrapper[4773]: I0120 19:00:08.363690 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rzxsv" Jan 20 19:00:08 crc kubenswrapper[4773]: I0120 19:00:08.432838 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4966c538-33c7-4d94-9705-0081ce04e9ef-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-jhxwd\" (UID: \"4966c538-33c7-4d94-9705-0081ce04e9ef\") " pod="openstack/ssh-known-hosts-edpm-deployment-jhxwd" Jan 20 19:00:08 crc kubenswrapper[4773]: I0120 19:00:08.433151 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jvbpl\" (UniqueName: \"kubernetes.io/projected/4966c538-33c7-4d94-9705-0081ce04e9ef-kube-api-access-jvbpl\") pod \"ssh-known-hosts-edpm-deployment-jhxwd\" (UID: \"4966c538-33c7-4d94-9705-0081ce04e9ef\") " 
pod="openstack/ssh-known-hosts-edpm-deployment-jhxwd" Jan 20 19:00:08 crc kubenswrapper[4773]: I0120 19:00:08.433810 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/4966c538-33c7-4d94-9705-0081ce04e9ef-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-jhxwd\" (UID: \"4966c538-33c7-4d94-9705-0081ce04e9ef\") " pod="openstack/ssh-known-hosts-edpm-deployment-jhxwd" Jan 20 19:00:08 crc kubenswrapper[4773]: I0120 19:00:08.535635 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/4966c538-33c7-4d94-9705-0081ce04e9ef-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-jhxwd\" (UID: \"4966c538-33c7-4d94-9705-0081ce04e9ef\") " pod="openstack/ssh-known-hosts-edpm-deployment-jhxwd" Jan 20 19:00:08 crc kubenswrapper[4773]: I0120 19:00:08.535990 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4966c538-33c7-4d94-9705-0081ce04e9ef-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-jhxwd\" (UID: \"4966c538-33c7-4d94-9705-0081ce04e9ef\") " pod="openstack/ssh-known-hosts-edpm-deployment-jhxwd" Jan 20 19:00:08 crc kubenswrapper[4773]: I0120 19:00:08.536102 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jvbpl\" (UniqueName: \"kubernetes.io/projected/4966c538-33c7-4d94-9705-0081ce04e9ef-kube-api-access-jvbpl\") pod \"ssh-known-hosts-edpm-deployment-jhxwd\" (UID: \"4966c538-33c7-4d94-9705-0081ce04e9ef\") " pod="openstack/ssh-known-hosts-edpm-deployment-jhxwd" Jan 20 19:00:08 crc kubenswrapper[4773]: I0120 19:00:08.540526 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/4966c538-33c7-4d94-9705-0081ce04e9ef-inventory-0\") pod 
\"ssh-known-hosts-edpm-deployment-jhxwd\" (UID: \"4966c538-33c7-4d94-9705-0081ce04e9ef\") " pod="openstack/ssh-known-hosts-edpm-deployment-jhxwd" Jan 20 19:00:08 crc kubenswrapper[4773]: I0120 19:00:08.541633 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4966c538-33c7-4d94-9705-0081ce04e9ef-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-jhxwd\" (UID: \"4966c538-33c7-4d94-9705-0081ce04e9ef\") " pod="openstack/ssh-known-hosts-edpm-deployment-jhxwd" Jan 20 19:00:08 crc kubenswrapper[4773]: I0120 19:00:08.553141 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jvbpl\" (UniqueName: \"kubernetes.io/projected/4966c538-33c7-4d94-9705-0081ce04e9ef-kube-api-access-jvbpl\") pod \"ssh-known-hosts-edpm-deployment-jhxwd\" (UID: \"4966c538-33c7-4d94-9705-0081ce04e9ef\") " pod="openstack/ssh-known-hosts-edpm-deployment-jhxwd" Jan 20 19:00:08 crc kubenswrapper[4773]: I0120 19:00:08.682350 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-jhxwd" Jan 20 19:00:09 crc kubenswrapper[4773]: I0120 19:00:09.224954 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-jhxwd"] Jan 20 19:00:09 crc kubenswrapper[4773]: I0120 19:00:09.260584 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-jhxwd" event={"ID":"4966c538-33c7-4d94-9705-0081ce04e9ef","Type":"ContainerStarted","Data":"4aed70e28a34cc771d471bb202c2fb3239ceb0f7ad21e660a68c198dd9fe8e18"} Jan 20 19:00:10 crc kubenswrapper[4773]: I0120 19:00:10.270426 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-jhxwd" event={"ID":"4966c538-33c7-4d94-9705-0081ce04e9ef","Type":"ContainerStarted","Data":"93a7eebb7e51a278bd173d39037d247ae953e7e362f3c91fbf6d9626e5cde308"} Jan 20 19:00:10 crc kubenswrapper[4773]: I0120 19:00:10.301918 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-jhxwd" podStartSLOduration=1.7794861050000002 podStartE2EDuration="2.301891022s" podCreationTimestamp="2026-01-20 19:00:08 +0000 UTC" firstStartedPulling="2026-01-20 19:00:09.23138112 +0000 UTC m=+1802.153194144" lastFinishedPulling="2026-01-20 19:00:09.753786037 +0000 UTC m=+1802.675599061" observedRunningTime="2026-01-20 19:00:10.28686696 +0000 UTC m=+1803.208680014" watchObservedRunningTime="2026-01-20 19:00:10.301891022 +0000 UTC m=+1803.223704076" Jan 20 19:00:16 crc kubenswrapper[4773]: I0120 19:00:16.040879 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-vphtt"] Jan 20 19:00:16 crc kubenswrapper[4773]: I0120 19:00:16.049283 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-vphtt"] Jan 20 19:00:16 crc kubenswrapper[4773]: I0120 19:00:16.317006 4773 generic.go:334] "Generic (PLEG): container finished" 
podID="4966c538-33c7-4d94-9705-0081ce04e9ef" containerID="93a7eebb7e51a278bd173d39037d247ae953e7e362f3c91fbf6d9626e5cde308" exitCode=0 Jan 20 19:00:16 crc kubenswrapper[4773]: I0120 19:00:16.317060 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-jhxwd" event={"ID":"4966c538-33c7-4d94-9705-0081ce04e9ef","Type":"ContainerDied","Data":"93a7eebb7e51a278bd173d39037d247ae953e7e362f3c91fbf6d9626e5cde308"} Jan 20 19:00:17 crc kubenswrapper[4773]: I0120 19:00:17.029842 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-f2d2j"] Jan 20 19:00:17 crc kubenswrapper[4773]: I0120 19:00:17.039779 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-f2d2j"] Jan 20 19:00:17 crc kubenswrapper[4773]: I0120 19:00:17.454456 4773 scope.go:117] "RemoveContainer" containerID="ce9aed1d163b3887a1d1032f9c03b611a4ecaff0809c53905de6852740ee0630" Jan 20 19:00:17 crc kubenswrapper[4773]: E0120 19:00:17.455010 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:00:17 crc kubenswrapper[4773]: I0120 19:00:17.480392 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="833eac91-4269-4e1e-9923-8dd8ed2276dc" path="/var/lib/kubelet/pods/833eac91-4269-4e1e-9923-8dd8ed2276dc/volumes" Jan 20 19:00:17 crc kubenswrapper[4773]: I0120 19:00:17.480967 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86ae6a8c-2043-4e0f-a23f-43c998d3d9d7" path="/var/lib/kubelet/pods/86ae6a8c-2043-4e0f-a23f-43c998d3d9d7/volumes" Jan 20 19:00:17 crc kubenswrapper[4773]: I0120 
19:00:17.701326 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-jhxwd" Jan 20 19:00:17 crc kubenswrapper[4773]: I0120 19:00:17.802322 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4966c538-33c7-4d94-9705-0081ce04e9ef-ssh-key-openstack-edpm-ipam\") pod \"4966c538-33c7-4d94-9705-0081ce04e9ef\" (UID: \"4966c538-33c7-4d94-9705-0081ce04e9ef\") " Jan 20 19:00:17 crc kubenswrapper[4773]: I0120 19:00:17.802409 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/4966c538-33c7-4d94-9705-0081ce04e9ef-inventory-0\") pod \"4966c538-33c7-4d94-9705-0081ce04e9ef\" (UID: \"4966c538-33c7-4d94-9705-0081ce04e9ef\") " Jan 20 19:00:17 crc kubenswrapper[4773]: I0120 19:00:17.802434 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jvbpl\" (UniqueName: \"kubernetes.io/projected/4966c538-33c7-4d94-9705-0081ce04e9ef-kube-api-access-jvbpl\") pod \"4966c538-33c7-4d94-9705-0081ce04e9ef\" (UID: \"4966c538-33c7-4d94-9705-0081ce04e9ef\") " Jan 20 19:00:17 crc kubenswrapper[4773]: I0120 19:00:17.817812 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4966c538-33c7-4d94-9705-0081ce04e9ef-kube-api-access-jvbpl" (OuterVolumeSpecName: "kube-api-access-jvbpl") pod "4966c538-33c7-4d94-9705-0081ce04e9ef" (UID: "4966c538-33c7-4d94-9705-0081ce04e9ef"). InnerVolumeSpecName "kube-api-access-jvbpl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 19:00:17 crc kubenswrapper[4773]: I0120 19:00:17.826709 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4966c538-33c7-4d94-9705-0081ce04e9ef-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "4966c538-33c7-4d94-9705-0081ce04e9ef" (UID: "4966c538-33c7-4d94-9705-0081ce04e9ef"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:00:17 crc kubenswrapper[4773]: I0120 19:00:17.834684 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4966c538-33c7-4d94-9705-0081ce04e9ef-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "4966c538-33c7-4d94-9705-0081ce04e9ef" (UID: "4966c538-33c7-4d94-9705-0081ce04e9ef"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:00:17 crc kubenswrapper[4773]: I0120 19:00:17.904879 4773 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/4966c538-33c7-4d94-9705-0081ce04e9ef-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 20 19:00:17 crc kubenswrapper[4773]: I0120 19:00:17.904928 4773 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/4966c538-33c7-4d94-9705-0081ce04e9ef-inventory-0\") on node \"crc\" DevicePath \"\"" Jan 20 19:00:17 crc kubenswrapper[4773]: I0120 19:00:17.904956 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jvbpl\" (UniqueName: \"kubernetes.io/projected/4966c538-33c7-4d94-9705-0081ce04e9ef-kube-api-access-jvbpl\") on node \"crc\" DevicePath \"\"" Jan 20 19:00:18 crc kubenswrapper[4773]: I0120 19:00:18.034427 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-jgwdl"] Jan 20 19:00:18 crc kubenswrapper[4773]: I0120 
19:00:18.040411 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-jgwdl"] Jan 20 19:00:18 crc kubenswrapper[4773]: I0120 19:00:18.047707 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-da16-account-create-update-nsb2n"] Jan 20 19:00:18 crc kubenswrapper[4773]: I0120 19:00:18.055282 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-79ae-account-create-update-kgx64"] Jan 20 19:00:18 crc kubenswrapper[4773]: I0120 19:00:18.063258 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-79ae-account-create-update-kgx64"] Jan 20 19:00:18 crc kubenswrapper[4773]: I0120 19:00:18.069510 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-da16-account-create-update-nsb2n"] Jan 20 19:00:18 crc kubenswrapper[4773]: I0120 19:00:18.076705 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-33c7-account-create-update-ck7lk"] Jan 20 19:00:18 crc kubenswrapper[4773]: I0120 19:00:18.084007 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-33c7-account-create-update-ck7lk"] Jan 20 19:00:18 crc kubenswrapper[4773]: I0120 19:00:18.334372 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-jhxwd" event={"ID":"4966c538-33c7-4d94-9705-0081ce04e9ef","Type":"ContainerDied","Data":"4aed70e28a34cc771d471bb202c2fb3239ceb0f7ad21e660a68c198dd9fe8e18"} Jan 20 19:00:18 crc kubenswrapper[4773]: I0120 19:00:18.334414 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4aed70e28a34cc771d471bb202c2fb3239ceb0f7ad21e660a68c198dd9fe8e18" Jan 20 19:00:18 crc kubenswrapper[4773]: I0120 19:00:18.334471 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-jhxwd" Jan 20 19:00:18 crc kubenswrapper[4773]: I0120 19:00:18.409184 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-khn66"] Jan 20 19:00:18 crc kubenswrapper[4773]: E0120 19:00:18.409664 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4966c538-33c7-4d94-9705-0081ce04e9ef" containerName="ssh-known-hosts-edpm-deployment" Jan 20 19:00:18 crc kubenswrapper[4773]: I0120 19:00:18.409685 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="4966c538-33c7-4d94-9705-0081ce04e9ef" containerName="ssh-known-hosts-edpm-deployment" Jan 20 19:00:18 crc kubenswrapper[4773]: I0120 19:00:18.409901 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="4966c538-33c7-4d94-9705-0081ce04e9ef" containerName="ssh-known-hosts-edpm-deployment" Jan 20 19:00:18 crc kubenswrapper[4773]: I0120 19:00:18.410617 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-khn66" Jan 20 19:00:18 crc kubenswrapper[4773]: I0120 19:00:18.414075 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 20 19:00:18 crc kubenswrapper[4773]: I0120 19:00:18.414323 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 20 19:00:18 crc kubenswrapper[4773]: I0120 19:00:18.414828 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 20 19:00:18 crc kubenswrapper[4773]: I0120 19:00:18.415024 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rzxsv" Jan 20 19:00:18 crc kubenswrapper[4773]: I0120 19:00:18.430302 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-khn66"] Jan 20 19:00:18 crc kubenswrapper[4773]: I0120 19:00:18.516702 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3ef8874c-43f1-43c9-ac7c-0af15c430e89-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-khn66\" (UID: \"3ef8874c-43f1-43c9-ac7c-0af15c430e89\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-khn66" Jan 20 19:00:18 crc kubenswrapper[4773]: I0120 19:00:18.516771 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3ef8874c-43f1-43c9-ac7c-0af15c430e89-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-khn66\" (UID: \"3ef8874c-43f1-43c9-ac7c-0af15c430e89\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-khn66" Jan 20 19:00:18 crc kubenswrapper[4773]: I0120 19:00:18.516832 4773 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lx7q\" (UniqueName: \"kubernetes.io/projected/3ef8874c-43f1-43c9-ac7c-0af15c430e89-kube-api-access-8lx7q\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-khn66\" (UID: \"3ef8874c-43f1-43c9-ac7c-0af15c430e89\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-khn66" Jan 20 19:00:18 crc kubenswrapper[4773]: I0120 19:00:18.618675 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3ef8874c-43f1-43c9-ac7c-0af15c430e89-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-khn66\" (UID: \"3ef8874c-43f1-43c9-ac7c-0af15c430e89\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-khn66" Jan 20 19:00:18 crc kubenswrapper[4773]: I0120 19:00:18.618765 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3ef8874c-43f1-43c9-ac7c-0af15c430e89-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-khn66\" (UID: \"3ef8874c-43f1-43c9-ac7c-0af15c430e89\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-khn66" Jan 20 19:00:18 crc kubenswrapper[4773]: I0120 19:00:18.618883 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8lx7q\" (UniqueName: \"kubernetes.io/projected/3ef8874c-43f1-43c9-ac7c-0af15c430e89-kube-api-access-8lx7q\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-khn66\" (UID: \"3ef8874c-43f1-43c9-ac7c-0af15c430e89\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-khn66" Jan 20 19:00:18 crc kubenswrapper[4773]: I0120 19:00:18.623091 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3ef8874c-43f1-43c9-ac7c-0af15c430e89-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-khn66\" (UID: 
\"3ef8874c-43f1-43c9-ac7c-0af15c430e89\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-khn66" Jan 20 19:00:18 crc kubenswrapper[4773]: I0120 19:00:18.623260 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3ef8874c-43f1-43c9-ac7c-0af15c430e89-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-khn66\" (UID: \"3ef8874c-43f1-43c9-ac7c-0af15c430e89\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-khn66" Jan 20 19:00:18 crc kubenswrapper[4773]: I0120 19:00:18.638560 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8lx7q\" (UniqueName: \"kubernetes.io/projected/3ef8874c-43f1-43c9-ac7c-0af15c430e89-kube-api-access-8lx7q\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-khn66\" (UID: \"3ef8874c-43f1-43c9-ac7c-0af15c430e89\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-khn66" Jan 20 19:00:18 crc kubenswrapper[4773]: I0120 19:00:18.730020 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-khn66" Jan 20 19:00:18 crc kubenswrapper[4773]: I0120 19:00:18.922317 4773 scope.go:117] "RemoveContainer" containerID="dff8a4c86d068e27a71833f5b56e122b543d819294820eb96fea705ee47ddabe" Jan 20 19:00:18 crc kubenswrapper[4773]: I0120 19:00:18.968350 4773 scope.go:117] "RemoveContainer" containerID="3321d4830a85f50c04166b887254333d17cf16fc0f4241d05645223c19fa5071" Jan 20 19:00:18 crc kubenswrapper[4773]: I0120 19:00:18.997364 4773 scope.go:117] "RemoveContainer" containerID="33a44114454454a182e314a103e4daecf90ecb7caed9c7572f0056b58d9567e3" Jan 20 19:00:19 crc kubenswrapper[4773]: I0120 19:00:19.015125 4773 scope.go:117] "RemoveContainer" containerID="bd53904198644fe406dce4ba7d96027169199c6966beb39d8a18ad50565e1374" Jan 20 19:00:19 crc kubenswrapper[4773]: I0120 19:00:19.049561 4773 scope.go:117] "RemoveContainer" containerID="1921c6e2c53d5b75992757a0b24e916241c80d607d6e20c9b6226a61ae867455" Jan 20 19:00:19 crc kubenswrapper[4773]: I0120 19:00:19.082114 4773 scope.go:117] "RemoveContainer" containerID="7358a078ad55ead91e8865f0bf5d80f239dd741b49799df9b4c64f0d2143c92a" Jan 20 19:00:19 crc kubenswrapper[4773]: I0120 19:00:19.097623 4773 scope.go:117] "RemoveContainer" containerID="afa34648f2d59f0f8d5c41b244e65ca128bc231f7b61f9cc13109a9287149c7d" Jan 20 19:00:19 crc kubenswrapper[4773]: I0120 19:00:19.235478 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-khn66"] Jan 20 19:00:19 crc kubenswrapper[4773]: I0120 19:00:19.341602 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-khn66" event={"ID":"3ef8874c-43f1-43c9-ac7c-0af15c430e89","Type":"ContainerStarted","Data":"20c019dd6dfc037936553ca8b8283032064012141a0a26a9b449b3f1e4c13847"} Jan 20 19:00:19 crc kubenswrapper[4773]: I0120 19:00:19.457107 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="2bd3a449-dc14-46ca-8e19-64d0a282483e" path="/var/lib/kubelet/pods/2bd3a449-dc14-46ca-8e19-64d0a282483e/volumes" Jan 20 19:00:19 crc kubenswrapper[4773]: I0120 19:00:19.458032 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47dcb7c9-ffa7-46bc-b695-02aea6e679a1" path="/var/lib/kubelet/pods/47dcb7c9-ffa7-46bc-b695-02aea6e679a1/volumes" Jan 20 19:00:19 crc kubenswrapper[4773]: I0120 19:00:19.458586 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7455911e-a1ad-442b-97b9-362496066bbf" path="/var/lib/kubelet/pods/7455911e-a1ad-442b-97b9-362496066bbf/volumes" Jan 20 19:00:19 crc kubenswrapper[4773]: I0120 19:00:19.459126 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4f47b18-303f-415d-8bf8-c1f7a075b747" path="/var/lib/kubelet/pods/f4f47b18-303f-415d-8bf8-c1f7a075b747/volumes" Jan 20 19:00:20 crc kubenswrapper[4773]: I0120 19:00:20.365651 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-khn66" event={"ID":"3ef8874c-43f1-43c9-ac7c-0af15c430e89","Type":"ContainerStarted","Data":"8e156fca405116f5d486fe65823e78cd8f316ad936e8607162db09c90867f261"} Jan 20 19:00:20 crc kubenswrapper[4773]: I0120 19:00:20.391144 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-khn66" podStartSLOduration=1.90963208 podStartE2EDuration="2.391126191s" podCreationTimestamp="2026-01-20 19:00:18 +0000 UTC" firstStartedPulling="2026-01-20 19:00:19.242421047 +0000 UTC m=+1812.164234071" lastFinishedPulling="2026-01-20 19:00:19.723915158 +0000 UTC m=+1812.645728182" observedRunningTime="2026-01-20 19:00:20.38568994 +0000 UTC m=+1813.307502974" watchObservedRunningTime="2026-01-20 19:00:20.391126191 +0000 UTC m=+1813.312939215" Jan 20 19:00:27 crc kubenswrapper[4773]: I0120 19:00:27.429311 4773 generic.go:334] "Generic (PLEG): container finished" 
podID="3ef8874c-43f1-43c9-ac7c-0af15c430e89" containerID="8e156fca405116f5d486fe65823e78cd8f316ad936e8607162db09c90867f261" exitCode=0 Jan 20 19:00:27 crc kubenswrapper[4773]: I0120 19:00:27.429382 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-khn66" event={"ID":"3ef8874c-43f1-43c9-ac7c-0af15c430e89","Type":"ContainerDied","Data":"8e156fca405116f5d486fe65823e78cd8f316ad936e8607162db09c90867f261"} Jan 20 19:00:28 crc kubenswrapper[4773]: I0120 19:00:28.799687 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-khn66" Jan 20 19:00:28 crc kubenswrapper[4773]: I0120 19:00:28.899289 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3ef8874c-43f1-43c9-ac7c-0af15c430e89-inventory\") pod \"3ef8874c-43f1-43c9-ac7c-0af15c430e89\" (UID: \"3ef8874c-43f1-43c9-ac7c-0af15c430e89\") " Jan 20 19:00:28 crc kubenswrapper[4773]: I0120 19:00:28.899344 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8lx7q\" (UniqueName: \"kubernetes.io/projected/3ef8874c-43f1-43c9-ac7c-0af15c430e89-kube-api-access-8lx7q\") pod \"3ef8874c-43f1-43c9-ac7c-0af15c430e89\" (UID: \"3ef8874c-43f1-43c9-ac7c-0af15c430e89\") " Jan 20 19:00:28 crc kubenswrapper[4773]: I0120 19:00:28.899408 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3ef8874c-43f1-43c9-ac7c-0af15c430e89-ssh-key-openstack-edpm-ipam\") pod \"3ef8874c-43f1-43c9-ac7c-0af15c430e89\" (UID: \"3ef8874c-43f1-43c9-ac7c-0af15c430e89\") " Jan 20 19:00:28 crc kubenswrapper[4773]: I0120 19:00:28.905211 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ef8874c-43f1-43c9-ac7c-0af15c430e89-kube-api-access-8lx7q" 
(OuterVolumeSpecName: "kube-api-access-8lx7q") pod "3ef8874c-43f1-43c9-ac7c-0af15c430e89" (UID: "3ef8874c-43f1-43c9-ac7c-0af15c430e89"). InnerVolumeSpecName "kube-api-access-8lx7q". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 19:00:28 crc kubenswrapper[4773]: I0120 19:00:28.923518 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ef8874c-43f1-43c9-ac7c-0af15c430e89-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "3ef8874c-43f1-43c9-ac7c-0af15c430e89" (UID: "3ef8874c-43f1-43c9-ac7c-0af15c430e89"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:00:28 crc kubenswrapper[4773]: I0120 19:00:28.932070 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ef8874c-43f1-43c9-ac7c-0af15c430e89-inventory" (OuterVolumeSpecName: "inventory") pod "3ef8874c-43f1-43c9-ac7c-0af15c430e89" (UID: "3ef8874c-43f1-43c9-ac7c-0af15c430e89"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:00:29 crc kubenswrapper[4773]: I0120 19:00:29.001571 4773 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3ef8874c-43f1-43c9-ac7c-0af15c430e89-inventory\") on node \"crc\" DevicePath \"\"" Jan 20 19:00:29 crc kubenswrapper[4773]: I0120 19:00:29.001613 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8lx7q\" (UniqueName: \"kubernetes.io/projected/3ef8874c-43f1-43c9-ac7c-0af15c430e89-kube-api-access-8lx7q\") on node \"crc\" DevicePath \"\"" Jan 20 19:00:29 crc kubenswrapper[4773]: I0120 19:00:29.001630 4773 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3ef8874c-43f1-43c9-ac7c-0af15c430e89-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 20 19:00:29 crc kubenswrapper[4773]: I0120 19:00:29.451678 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-khn66" Jan 20 19:00:29 crc kubenswrapper[4773]: I0120 19:00:29.463106 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-khn66" event={"ID":"3ef8874c-43f1-43c9-ac7c-0af15c430e89","Type":"ContainerDied","Data":"20c019dd6dfc037936553ca8b8283032064012141a0a26a9b449b3f1e4c13847"} Jan 20 19:00:29 crc kubenswrapper[4773]: I0120 19:00:29.463158 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="20c019dd6dfc037936553ca8b8283032064012141a0a26a9b449b3f1e4c13847" Jan 20 19:00:29 crc kubenswrapper[4773]: I0120 19:00:29.542864 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-tgqvf"] Jan 20 19:00:29 crc kubenswrapper[4773]: E0120 19:00:29.543438 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ef8874c-43f1-43c9-ac7c-0af15c430e89" 
containerName="run-os-edpm-deployment-openstack-edpm-ipam" Jan 20 19:00:29 crc kubenswrapper[4773]: I0120 19:00:29.543468 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ef8874c-43f1-43c9-ac7c-0af15c430e89" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Jan 20 19:00:29 crc kubenswrapper[4773]: I0120 19:00:29.543688 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ef8874c-43f1-43c9-ac7c-0af15c430e89" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Jan 20 19:00:29 crc kubenswrapper[4773]: I0120 19:00:29.544590 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-tgqvf" Jan 20 19:00:29 crc kubenswrapper[4773]: I0120 19:00:29.551319 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rzxsv" Jan 20 19:00:29 crc kubenswrapper[4773]: I0120 19:00:29.551547 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 20 19:00:29 crc kubenswrapper[4773]: I0120 19:00:29.551732 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 20 19:00:29 crc kubenswrapper[4773]: I0120 19:00:29.551973 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 20 19:00:29 crc kubenswrapper[4773]: I0120 19:00:29.554405 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-tgqvf"] Jan 20 19:00:29 crc kubenswrapper[4773]: I0120 19:00:29.620380 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/10bff0cd-1771-46dc-87e8-a7ce91f520c8-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-tgqvf\" (UID: 
\"10bff0cd-1771-46dc-87e8-a7ce91f520c8\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-tgqvf" Jan 20 19:00:29 crc kubenswrapper[4773]: I0120 19:00:29.620471 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5f7qt\" (UniqueName: \"kubernetes.io/projected/10bff0cd-1771-46dc-87e8-a7ce91f520c8-kube-api-access-5f7qt\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-tgqvf\" (UID: \"10bff0cd-1771-46dc-87e8-a7ce91f520c8\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-tgqvf" Jan 20 19:00:29 crc kubenswrapper[4773]: I0120 19:00:29.620507 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/10bff0cd-1771-46dc-87e8-a7ce91f520c8-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-tgqvf\" (UID: \"10bff0cd-1771-46dc-87e8-a7ce91f520c8\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-tgqvf" Jan 20 19:00:29 crc kubenswrapper[4773]: I0120 19:00:29.722384 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/10bff0cd-1771-46dc-87e8-a7ce91f520c8-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-tgqvf\" (UID: \"10bff0cd-1771-46dc-87e8-a7ce91f520c8\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-tgqvf" Jan 20 19:00:29 crc kubenswrapper[4773]: I0120 19:00:29.722859 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/10bff0cd-1771-46dc-87e8-a7ce91f520c8-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-tgqvf\" (UID: \"10bff0cd-1771-46dc-87e8-a7ce91f520c8\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-tgqvf" Jan 20 19:00:29 crc kubenswrapper[4773]: I0120 19:00:29.722978 4773 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-5f7qt\" (UniqueName: \"kubernetes.io/projected/10bff0cd-1771-46dc-87e8-a7ce91f520c8-kube-api-access-5f7qt\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-tgqvf\" (UID: \"10bff0cd-1771-46dc-87e8-a7ce91f520c8\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-tgqvf" Jan 20 19:00:29 crc kubenswrapper[4773]: I0120 19:00:29.727655 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/10bff0cd-1771-46dc-87e8-a7ce91f520c8-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-tgqvf\" (UID: \"10bff0cd-1771-46dc-87e8-a7ce91f520c8\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-tgqvf" Jan 20 19:00:29 crc kubenswrapper[4773]: I0120 19:00:29.730490 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/10bff0cd-1771-46dc-87e8-a7ce91f520c8-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-tgqvf\" (UID: \"10bff0cd-1771-46dc-87e8-a7ce91f520c8\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-tgqvf" Jan 20 19:00:29 crc kubenswrapper[4773]: I0120 19:00:29.754819 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5f7qt\" (UniqueName: \"kubernetes.io/projected/10bff0cd-1771-46dc-87e8-a7ce91f520c8-kube-api-access-5f7qt\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-tgqvf\" (UID: \"10bff0cd-1771-46dc-87e8-a7ce91f520c8\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-tgqvf" Jan 20 19:00:29 crc kubenswrapper[4773]: I0120 19:00:29.868571 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-tgqvf" Jan 20 19:00:30 crc kubenswrapper[4773]: I0120 19:00:30.402763 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-tgqvf"] Jan 20 19:00:30 crc kubenswrapper[4773]: W0120 19:00:30.408172 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod10bff0cd_1771_46dc_87e8_a7ce91f520c8.slice/crio-8f72cc4cbf570932cadff541edb94a6e8e43e4502a508775c540956a1d668c23 WatchSource:0}: Error finding container 8f72cc4cbf570932cadff541edb94a6e8e43e4502a508775c540956a1d668c23: Status 404 returned error can't find the container with id 8f72cc4cbf570932cadff541edb94a6e8e43e4502a508775c540956a1d668c23 Jan 20 19:00:30 crc kubenswrapper[4773]: I0120 19:00:30.459789 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-tgqvf" event={"ID":"10bff0cd-1771-46dc-87e8-a7ce91f520c8","Type":"ContainerStarted","Data":"8f72cc4cbf570932cadff541edb94a6e8e43e4502a508775c540956a1d668c23"} Jan 20 19:00:31 crc kubenswrapper[4773]: I0120 19:00:31.448080 4773 scope.go:117] "RemoveContainer" containerID="ce9aed1d163b3887a1d1032f9c03b611a4ecaff0809c53905de6852740ee0630" Jan 20 19:00:31 crc kubenswrapper[4773]: E0120 19:00:31.448588 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:00:31 crc kubenswrapper[4773]: I0120 19:00:31.468892 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-tgqvf" event={"ID":"10bff0cd-1771-46dc-87e8-a7ce91f520c8","Type":"ContainerStarted","Data":"737aaa15d5f9c841779418b751493428e37f357fe0fe9835945320528fe4b9df"} Jan 20 19:00:31 crc kubenswrapper[4773]: I0120 19:00:31.506446 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-tgqvf" podStartSLOduration=2.031381184 podStartE2EDuration="2.506420961s" podCreationTimestamp="2026-01-20 19:00:29 +0000 UTC" firstStartedPulling="2026-01-20 19:00:30.412630498 +0000 UTC m=+1823.334443522" lastFinishedPulling="2026-01-20 19:00:30.887670275 +0000 UTC m=+1823.809483299" observedRunningTime="2026-01-20 19:00:31.487686189 +0000 UTC m=+1824.409499253" watchObservedRunningTime="2026-01-20 19:00:31.506420961 +0000 UTC m=+1824.428234015" Jan 20 19:00:41 crc kubenswrapper[4773]: I0120 19:00:41.551419 4773 generic.go:334] "Generic (PLEG): container finished" podID="10bff0cd-1771-46dc-87e8-a7ce91f520c8" containerID="737aaa15d5f9c841779418b751493428e37f357fe0fe9835945320528fe4b9df" exitCode=0 Jan 20 19:00:41 crc kubenswrapper[4773]: I0120 19:00:41.551510 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-tgqvf" event={"ID":"10bff0cd-1771-46dc-87e8-a7ce91f520c8","Type":"ContainerDied","Data":"737aaa15d5f9c841779418b751493428e37f357fe0fe9835945320528fe4b9df"} Jan 20 19:00:42 crc kubenswrapper[4773]: I0120 19:00:42.997324 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-tgqvf" Jan 20 19:00:43 crc kubenswrapper[4773]: I0120 19:00:43.055119 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/10bff0cd-1771-46dc-87e8-a7ce91f520c8-ssh-key-openstack-edpm-ipam\") pod \"10bff0cd-1771-46dc-87e8-a7ce91f520c8\" (UID: \"10bff0cd-1771-46dc-87e8-a7ce91f520c8\") " Jan 20 19:00:43 crc kubenswrapper[4773]: I0120 19:00:43.055233 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/10bff0cd-1771-46dc-87e8-a7ce91f520c8-inventory\") pod \"10bff0cd-1771-46dc-87e8-a7ce91f520c8\" (UID: \"10bff0cd-1771-46dc-87e8-a7ce91f520c8\") " Jan 20 19:00:43 crc kubenswrapper[4773]: I0120 19:00:43.055308 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5f7qt\" (UniqueName: \"kubernetes.io/projected/10bff0cd-1771-46dc-87e8-a7ce91f520c8-kube-api-access-5f7qt\") pod \"10bff0cd-1771-46dc-87e8-a7ce91f520c8\" (UID: \"10bff0cd-1771-46dc-87e8-a7ce91f520c8\") " Jan 20 19:00:43 crc kubenswrapper[4773]: I0120 19:00:43.066010 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10bff0cd-1771-46dc-87e8-a7ce91f520c8-kube-api-access-5f7qt" (OuterVolumeSpecName: "kube-api-access-5f7qt") pod "10bff0cd-1771-46dc-87e8-a7ce91f520c8" (UID: "10bff0cd-1771-46dc-87e8-a7ce91f520c8"). InnerVolumeSpecName "kube-api-access-5f7qt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 19:00:43 crc kubenswrapper[4773]: I0120 19:00:43.082247 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10bff0cd-1771-46dc-87e8-a7ce91f520c8-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "10bff0cd-1771-46dc-87e8-a7ce91f520c8" (UID: "10bff0cd-1771-46dc-87e8-a7ce91f520c8"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:00:43 crc kubenswrapper[4773]: I0120 19:00:43.100045 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10bff0cd-1771-46dc-87e8-a7ce91f520c8-inventory" (OuterVolumeSpecName: "inventory") pod "10bff0cd-1771-46dc-87e8-a7ce91f520c8" (UID: "10bff0cd-1771-46dc-87e8-a7ce91f520c8"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:00:43 crc kubenswrapper[4773]: I0120 19:00:43.157905 4773 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/10bff0cd-1771-46dc-87e8-a7ce91f520c8-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 20 19:00:43 crc kubenswrapper[4773]: I0120 19:00:43.157943 4773 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/10bff0cd-1771-46dc-87e8-a7ce91f520c8-inventory\") on node \"crc\" DevicePath \"\"" Jan 20 19:00:43 crc kubenswrapper[4773]: I0120 19:00:43.157953 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5f7qt\" (UniqueName: \"kubernetes.io/projected/10bff0cd-1771-46dc-87e8-a7ce91f520c8-kube-api-access-5f7qt\") on node \"crc\" DevicePath \"\"" Jan 20 19:00:43 crc kubenswrapper[4773]: I0120 19:00:43.574670 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-tgqvf" Jan 20 19:00:43 crc kubenswrapper[4773]: I0120 19:00:43.574840 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-tgqvf" event={"ID":"10bff0cd-1771-46dc-87e8-a7ce91f520c8","Type":"ContainerDied","Data":"8f72cc4cbf570932cadff541edb94a6e8e43e4502a508775c540956a1d668c23"} Jan 20 19:00:43 crc kubenswrapper[4773]: I0120 19:00:43.575396 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8f72cc4cbf570932cadff541edb94a6e8e43e4502a508775c540956a1d668c23" Jan 20 19:00:46 crc kubenswrapper[4773]: I0120 19:00:46.447272 4773 scope.go:117] "RemoveContainer" containerID="ce9aed1d163b3887a1d1032f9c03b611a4ecaff0809c53905de6852740ee0630" Jan 20 19:00:46 crc kubenswrapper[4773]: E0120 19:00:46.447540 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:00:51 crc kubenswrapper[4773]: I0120 19:00:51.057374 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-gvh4b"] Jan 20 19:00:51 crc kubenswrapper[4773]: I0120 19:00:51.067553 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-gvh4b"] Jan 20 19:00:51 crc kubenswrapper[4773]: I0120 19:00:51.457470 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="414429bc-e43b-42f9-8f49-8bc7c4a0ecf4" path="/var/lib/kubelet/pods/414429bc-e43b-42f9-8f49-8bc7c4a0ecf4/volumes" Jan 20 19:01:00 crc kubenswrapper[4773]: I0120 19:01:00.148620 4773 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29482261-dqww9"] Jan 20 19:01:00 crc kubenswrapper[4773]: E0120 19:01:00.149360 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10bff0cd-1771-46dc-87e8-a7ce91f520c8" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Jan 20 19:01:00 crc kubenswrapper[4773]: I0120 19:01:00.149381 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="10bff0cd-1771-46dc-87e8-a7ce91f520c8" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Jan 20 19:01:00 crc kubenswrapper[4773]: I0120 19:01:00.149618 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="10bff0cd-1771-46dc-87e8-a7ce91f520c8" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Jan 20 19:01:00 crc kubenswrapper[4773]: I0120 19:01:00.150221 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29482261-dqww9" Jan 20 19:01:00 crc kubenswrapper[4773]: I0120 19:01:00.158614 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29482261-dqww9"] Jan 20 19:01:00 crc kubenswrapper[4773]: I0120 19:01:00.256018 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b0951c0-055b-44bd-a686-9a4938af6b4f-combined-ca-bundle\") pod \"keystone-cron-29482261-dqww9\" (UID: \"5b0951c0-055b-44bd-a686-9a4938af6b4f\") " pod="openstack/keystone-cron-29482261-dqww9" Jan 20 19:01:00 crc kubenswrapper[4773]: I0120 19:01:00.256072 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5b0951c0-055b-44bd-a686-9a4938af6b4f-fernet-keys\") pod \"keystone-cron-29482261-dqww9\" (UID: \"5b0951c0-055b-44bd-a686-9a4938af6b4f\") " pod="openstack/keystone-cron-29482261-dqww9" Jan 20 19:01:00 crc kubenswrapper[4773]: I0120 19:01:00.256192 4773 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t22gx\" (UniqueName: \"kubernetes.io/projected/5b0951c0-055b-44bd-a686-9a4938af6b4f-kube-api-access-t22gx\") pod \"keystone-cron-29482261-dqww9\" (UID: \"5b0951c0-055b-44bd-a686-9a4938af6b4f\") " pod="openstack/keystone-cron-29482261-dqww9" Jan 20 19:01:00 crc kubenswrapper[4773]: I0120 19:01:00.256254 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b0951c0-055b-44bd-a686-9a4938af6b4f-config-data\") pod \"keystone-cron-29482261-dqww9\" (UID: \"5b0951c0-055b-44bd-a686-9a4938af6b4f\") " pod="openstack/keystone-cron-29482261-dqww9" Jan 20 19:01:00 crc kubenswrapper[4773]: I0120 19:01:00.358296 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t22gx\" (UniqueName: \"kubernetes.io/projected/5b0951c0-055b-44bd-a686-9a4938af6b4f-kube-api-access-t22gx\") pod \"keystone-cron-29482261-dqww9\" (UID: \"5b0951c0-055b-44bd-a686-9a4938af6b4f\") " pod="openstack/keystone-cron-29482261-dqww9" Jan 20 19:01:00 crc kubenswrapper[4773]: I0120 19:01:00.358405 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b0951c0-055b-44bd-a686-9a4938af6b4f-config-data\") pod \"keystone-cron-29482261-dqww9\" (UID: \"5b0951c0-055b-44bd-a686-9a4938af6b4f\") " pod="openstack/keystone-cron-29482261-dqww9" Jan 20 19:01:00 crc kubenswrapper[4773]: I0120 19:01:00.358500 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b0951c0-055b-44bd-a686-9a4938af6b4f-combined-ca-bundle\") pod \"keystone-cron-29482261-dqww9\" (UID: \"5b0951c0-055b-44bd-a686-9a4938af6b4f\") " pod="openstack/keystone-cron-29482261-dqww9" Jan 20 19:01:00 crc kubenswrapper[4773]: I0120 19:01:00.358533 4773 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5b0951c0-055b-44bd-a686-9a4938af6b4f-fernet-keys\") pod \"keystone-cron-29482261-dqww9\" (UID: \"5b0951c0-055b-44bd-a686-9a4938af6b4f\") " pod="openstack/keystone-cron-29482261-dqww9" Jan 20 19:01:00 crc kubenswrapper[4773]: I0120 19:01:00.365191 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b0951c0-055b-44bd-a686-9a4938af6b4f-config-data\") pod \"keystone-cron-29482261-dqww9\" (UID: \"5b0951c0-055b-44bd-a686-9a4938af6b4f\") " pod="openstack/keystone-cron-29482261-dqww9" Jan 20 19:01:00 crc kubenswrapper[4773]: I0120 19:01:00.365739 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5b0951c0-055b-44bd-a686-9a4938af6b4f-fernet-keys\") pod \"keystone-cron-29482261-dqww9\" (UID: \"5b0951c0-055b-44bd-a686-9a4938af6b4f\") " pod="openstack/keystone-cron-29482261-dqww9" Jan 20 19:01:00 crc kubenswrapper[4773]: I0120 19:01:00.371393 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b0951c0-055b-44bd-a686-9a4938af6b4f-combined-ca-bundle\") pod \"keystone-cron-29482261-dqww9\" (UID: \"5b0951c0-055b-44bd-a686-9a4938af6b4f\") " pod="openstack/keystone-cron-29482261-dqww9" Jan 20 19:01:00 crc kubenswrapper[4773]: I0120 19:01:00.376382 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t22gx\" (UniqueName: \"kubernetes.io/projected/5b0951c0-055b-44bd-a686-9a4938af6b4f-kube-api-access-t22gx\") pod \"keystone-cron-29482261-dqww9\" (UID: \"5b0951c0-055b-44bd-a686-9a4938af6b4f\") " pod="openstack/keystone-cron-29482261-dqww9" Jan 20 19:01:00 crc kubenswrapper[4773]: I0120 19:01:00.483073 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29482261-dqww9" Jan 20 19:01:01 crc kubenswrapper[4773]: I0120 19:01:00.938250 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29482261-dqww9"] Jan 20 19:01:01 crc kubenswrapper[4773]: I0120 19:01:01.450023 4773 scope.go:117] "RemoveContainer" containerID="ce9aed1d163b3887a1d1032f9c03b611a4ecaff0809c53905de6852740ee0630" Jan 20 19:01:01 crc kubenswrapper[4773]: I0120 19:01:01.733853 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29482261-dqww9" event={"ID":"5b0951c0-055b-44bd-a686-9a4938af6b4f","Type":"ContainerStarted","Data":"e6c25d2d7edcc9942250bf6b80ef72165e3d0474f9d6e264bd2c80435f9b0198"} Jan 20 19:01:01 crc kubenswrapper[4773]: I0120 19:01:01.733904 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29482261-dqww9" event={"ID":"5b0951c0-055b-44bd-a686-9a4938af6b4f","Type":"ContainerStarted","Data":"42f78c362929a3d7f96521c65ddfa3ce1ae0af7995a4009264a337437c52809c"} Jan 20 19:01:01 crc kubenswrapper[4773]: I0120 19:01:01.737372 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" event={"ID":"1ddd934f-f012-4083-b5e6-b99711071621","Type":"ContainerStarted","Data":"7ef51181ae4526cf4dd383022f5cd069e75cd47240d82e55ea2f8b59a5c7eef7"} Jan 20 19:01:01 crc kubenswrapper[4773]: I0120 19:01:01.773847 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29482261-dqww9" podStartSLOduration=1.773823658 podStartE2EDuration="1.773823658s" podCreationTimestamp="2026-01-20 19:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 19:01:01.766232776 +0000 UTC m=+1854.688045880" watchObservedRunningTime="2026-01-20 19:01:01.773823658 +0000 UTC m=+1854.695636712" Jan 20 19:01:03 crc kubenswrapper[4773]: I0120 
19:01:03.756790 4773 generic.go:334] "Generic (PLEG): container finished" podID="5b0951c0-055b-44bd-a686-9a4938af6b4f" containerID="e6c25d2d7edcc9942250bf6b80ef72165e3d0474f9d6e264bd2c80435f9b0198" exitCode=0 Jan 20 19:01:03 crc kubenswrapper[4773]: I0120 19:01:03.756894 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29482261-dqww9" event={"ID":"5b0951c0-055b-44bd-a686-9a4938af6b4f","Type":"ContainerDied","Data":"e6c25d2d7edcc9942250bf6b80ef72165e3d0474f9d6e264bd2c80435f9b0198"} Jan 20 19:01:05 crc kubenswrapper[4773]: I0120 19:01:05.086743 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29482261-dqww9" Jan 20 19:01:05 crc kubenswrapper[4773]: I0120 19:01:05.246500 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5b0951c0-055b-44bd-a686-9a4938af6b4f-fernet-keys\") pod \"5b0951c0-055b-44bd-a686-9a4938af6b4f\" (UID: \"5b0951c0-055b-44bd-a686-9a4938af6b4f\") " Jan 20 19:01:05 crc kubenswrapper[4773]: I0120 19:01:05.246602 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t22gx\" (UniqueName: \"kubernetes.io/projected/5b0951c0-055b-44bd-a686-9a4938af6b4f-kube-api-access-t22gx\") pod \"5b0951c0-055b-44bd-a686-9a4938af6b4f\" (UID: \"5b0951c0-055b-44bd-a686-9a4938af6b4f\") " Jan 20 19:01:05 crc kubenswrapper[4773]: I0120 19:01:05.246628 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b0951c0-055b-44bd-a686-9a4938af6b4f-combined-ca-bundle\") pod \"5b0951c0-055b-44bd-a686-9a4938af6b4f\" (UID: \"5b0951c0-055b-44bd-a686-9a4938af6b4f\") " Jan 20 19:01:05 crc kubenswrapper[4773]: I0120 19:01:05.246740 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/5b0951c0-055b-44bd-a686-9a4938af6b4f-config-data\") pod \"5b0951c0-055b-44bd-a686-9a4938af6b4f\" (UID: \"5b0951c0-055b-44bd-a686-9a4938af6b4f\") " Jan 20 19:01:05 crc kubenswrapper[4773]: I0120 19:01:05.252863 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b0951c0-055b-44bd-a686-9a4938af6b4f-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "5b0951c0-055b-44bd-a686-9a4938af6b4f" (UID: "5b0951c0-055b-44bd-a686-9a4938af6b4f"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:01:05 crc kubenswrapper[4773]: I0120 19:01:05.260109 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b0951c0-055b-44bd-a686-9a4938af6b4f-kube-api-access-t22gx" (OuterVolumeSpecName: "kube-api-access-t22gx") pod "5b0951c0-055b-44bd-a686-9a4938af6b4f" (UID: "5b0951c0-055b-44bd-a686-9a4938af6b4f"). InnerVolumeSpecName "kube-api-access-t22gx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 19:01:05 crc kubenswrapper[4773]: I0120 19:01:05.285582 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b0951c0-055b-44bd-a686-9a4938af6b4f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5b0951c0-055b-44bd-a686-9a4938af6b4f" (UID: "5b0951c0-055b-44bd-a686-9a4938af6b4f"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:01:05 crc kubenswrapper[4773]: I0120 19:01:05.312856 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b0951c0-055b-44bd-a686-9a4938af6b4f-config-data" (OuterVolumeSpecName: "config-data") pod "5b0951c0-055b-44bd-a686-9a4938af6b4f" (UID: "5b0951c0-055b-44bd-a686-9a4938af6b4f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:01:05 crc kubenswrapper[4773]: I0120 19:01:05.349198 4773 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/5b0951c0-055b-44bd-a686-9a4938af6b4f-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 20 19:01:05 crc kubenswrapper[4773]: I0120 19:01:05.349375 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t22gx\" (UniqueName: \"kubernetes.io/projected/5b0951c0-055b-44bd-a686-9a4938af6b4f-kube-api-access-t22gx\") on node \"crc\" DevicePath \"\"" Jan 20 19:01:05 crc kubenswrapper[4773]: I0120 19:01:05.349451 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5b0951c0-055b-44bd-a686-9a4938af6b4f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 19:01:05 crc kubenswrapper[4773]: I0120 19:01:05.349548 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5b0951c0-055b-44bd-a686-9a4938af6b4f-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 19:01:05 crc kubenswrapper[4773]: E0120 19:01:05.626995 4773 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5b0951c0_055b_44bd_a686_9a4938af6b4f.slice/crio-42f78c362929a3d7f96521c65ddfa3ce1ae0af7995a4009264a337437c52809c\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5b0951c0_055b_44bd_a686_9a4938af6b4f.slice\": RecentStats: unable to find data in memory cache]" Jan 20 19:01:05 crc kubenswrapper[4773]: I0120 19:01:05.774556 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29482261-dqww9" 
event={"ID":"5b0951c0-055b-44bd-a686-9a4938af6b4f","Type":"ContainerDied","Data":"42f78c362929a3d7f96521c65ddfa3ce1ae0af7995a4009264a337437c52809c"} Jan 20 19:01:05 crc kubenswrapper[4773]: I0120 19:01:05.774593 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="42f78c362929a3d7f96521c65ddfa3ce1ae0af7995a4009264a337437c52809c" Jan 20 19:01:05 crc kubenswrapper[4773]: I0120 19:01:05.774610 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29482261-dqww9" Jan 20 19:01:11 crc kubenswrapper[4773]: I0120 19:01:11.030029 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-p6rjg"] Jan 20 19:01:11 crc kubenswrapper[4773]: I0120 19:01:11.037317 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-p6rjg"] Jan 20 19:01:11 crc kubenswrapper[4773]: I0120 19:01:11.456259 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61de3b4b-bcb7-4521-92e6-af87d03407ee" path="/var/lib/kubelet/pods/61de3b4b-bcb7-4521-92e6-af87d03407ee/volumes" Jan 20 19:01:14 crc kubenswrapper[4773]: I0120 19:01:14.037333 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-qbmt7"] Jan 20 19:01:14 crc kubenswrapper[4773]: I0120 19:01:14.046676 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-qbmt7"] Jan 20 19:01:15 crc kubenswrapper[4773]: I0120 19:01:15.458115 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f9293b5-8288-4a19-b3ac-03d8026dbf06" path="/var/lib/kubelet/pods/9f9293b5-8288-4a19-b3ac-03d8026dbf06/volumes" Jan 20 19:01:19 crc kubenswrapper[4773]: I0120 19:01:19.215793 4773 scope.go:117] "RemoveContainer" containerID="6bc940e28a0f00ff3974f58aa4d5ae733c405db0f07aec2ee2eeb84b82a418e2" Jan 20 19:01:19 crc kubenswrapper[4773]: I0120 19:01:19.250383 4773 scope.go:117] 
"RemoveContainer" containerID="2ccbf39de56ce437fa73601c164599a09e48be8a2f2534b1c75fa3d80b294c65" Jan 20 19:01:19 crc kubenswrapper[4773]: I0120 19:01:19.288995 4773 scope.go:117] "RemoveContainer" containerID="eb1a71418be4c383e084090dcdbe66bfe9ab929c68b106dc0659e9d31cfdbff6" Jan 20 19:01:19 crc kubenswrapper[4773]: I0120 19:01:19.364851 4773 scope.go:117] "RemoveContainer" containerID="8a98a5857f7bd2ab4f59a3057108741eeb668e092184b2f6b6f9df7ea5067a15" Jan 20 19:01:19 crc kubenswrapper[4773]: I0120 19:01:19.413078 4773 scope.go:117] "RemoveContainer" containerID="ef9d86fc8f98618790c525e7960f8bfe056b8e5e834ec2dbd2285ecb0f1d9ce3" Jan 20 19:01:19 crc kubenswrapper[4773]: I0120 19:01:19.435072 4773 scope.go:117] "RemoveContainer" containerID="9a0076ae19fbe6544a21612cf495b50f269fad130f0414b9b8f4c443dba234b3" Jan 20 19:01:19 crc kubenswrapper[4773]: I0120 19:01:19.471127 4773 scope.go:117] "RemoveContainer" containerID="ee4d9edf9c01606465809a2b2e95bca37087aff51d7e53e121d12c29841bd18d" Jan 20 19:01:58 crc kubenswrapper[4773]: I0120 19:01:58.040744 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-b4hvr"] Jan 20 19:01:58 crc kubenswrapper[4773]: I0120 19:01:58.048516 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-b4hvr"] Jan 20 19:01:59 crc kubenswrapper[4773]: I0120 19:01:59.456541 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24c1bc90-8fe0-41b4-a7ba-7e15bc787386" path="/var/lib/kubelet/pods/24c1bc90-8fe0-41b4-a7ba-7e15bc787386/volumes" Jan 20 19:02:19 crc kubenswrapper[4773]: I0120 19:02:19.612461 4773 scope.go:117] "RemoveContainer" containerID="7292aa755477db4cf6213001ef3f22ae8bf8250a6e1509481020c0348c2c81e0" Jan 20 19:03:28 crc kubenswrapper[4773]: I0120 19:03:28.170124 4773 patch_prober.go:28] interesting pod/machine-config-daemon-sq4x7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure 
output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 19:03:28 crc kubenswrapper[4773]: I0120 19:03:28.170792 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 19:03:54 crc kubenswrapper[4773]: I0120 19:03:54.784150 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-z8cll"] Jan 20 19:03:54 crc kubenswrapper[4773]: E0120 19:03:54.785107 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b0951c0-055b-44bd-a686-9a4938af6b4f" containerName="keystone-cron" Jan 20 19:03:54 crc kubenswrapper[4773]: I0120 19:03:54.785125 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b0951c0-055b-44bd-a686-9a4938af6b4f" containerName="keystone-cron" Jan 20 19:03:54 crc kubenswrapper[4773]: I0120 19:03:54.785339 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b0951c0-055b-44bd-a686-9a4938af6b4f" containerName="keystone-cron" Jan 20 19:03:54 crc kubenswrapper[4773]: I0120 19:03:54.786842 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z8cll" Jan 20 19:03:54 crc kubenswrapper[4773]: I0120 19:03:54.800172 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-z8cll"] Jan 20 19:03:54 crc kubenswrapper[4773]: I0120 19:03:54.943211 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/187a0b19-a497-4133-a631-23e2c38c8e90-catalog-content\") pod \"redhat-marketplace-z8cll\" (UID: \"187a0b19-a497-4133-a631-23e2c38c8e90\") " pod="openshift-marketplace/redhat-marketplace-z8cll" Jan 20 19:03:54 crc kubenswrapper[4773]: I0120 19:03:54.943281 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfw8p\" (UniqueName: \"kubernetes.io/projected/187a0b19-a497-4133-a631-23e2c38c8e90-kube-api-access-qfw8p\") pod \"redhat-marketplace-z8cll\" (UID: \"187a0b19-a497-4133-a631-23e2c38c8e90\") " pod="openshift-marketplace/redhat-marketplace-z8cll" Jan 20 19:03:54 crc kubenswrapper[4773]: I0120 19:03:54.943556 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/187a0b19-a497-4133-a631-23e2c38c8e90-utilities\") pod \"redhat-marketplace-z8cll\" (UID: \"187a0b19-a497-4133-a631-23e2c38c8e90\") " pod="openshift-marketplace/redhat-marketplace-z8cll" Jan 20 19:03:55 crc kubenswrapper[4773]: I0120 19:03:55.045193 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/187a0b19-a497-4133-a631-23e2c38c8e90-catalog-content\") pod \"redhat-marketplace-z8cll\" (UID: \"187a0b19-a497-4133-a631-23e2c38c8e90\") " pod="openshift-marketplace/redhat-marketplace-z8cll" Jan 20 19:03:55 crc kubenswrapper[4773]: I0120 19:03:55.045267 4773 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-qfw8p\" (UniqueName: \"kubernetes.io/projected/187a0b19-a497-4133-a631-23e2c38c8e90-kube-api-access-qfw8p\") pod \"redhat-marketplace-z8cll\" (UID: \"187a0b19-a497-4133-a631-23e2c38c8e90\") " pod="openshift-marketplace/redhat-marketplace-z8cll" Jan 20 19:03:55 crc kubenswrapper[4773]: I0120 19:03:55.045307 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/187a0b19-a497-4133-a631-23e2c38c8e90-utilities\") pod \"redhat-marketplace-z8cll\" (UID: \"187a0b19-a497-4133-a631-23e2c38c8e90\") " pod="openshift-marketplace/redhat-marketplace-z8cll" Jan 20 19:03:55 crc kubenswrapper[4773]: I0120 19:03:55.045727 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/187a0b19-a497-4133-a631-23e2c38c8e90-catalog-content\") pod \"redhat-marketplace-z8cll\" (UID: \"187a0b19-a497-4133-a631-23e2c38c8e90\") " pod="openshift-marketplace/redhat-marketplace-z8cll" Jan 20 19:03:55 crc kubenswrapper[4773]: I0120 19:03:55.045791 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/187a0b19-a497-4133-a631-23e2c38c8e90-utilities\") pod \"redhat-marketplace-z8cll\" (UID: \"187a0b19-a497-4133-a631-23e2c38c8e90\") " pod="openshift-marketplace/redhat-marketplace-z8cll" Jan 20 19:03:55 crc kubenswrapper[4773]: I0120 19:03:55.074723 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfw8p\" (UniqueName: \"kubernetes.io/projected/187a0b19-a497-4133-a631-23e2c38c8e90-kube-api-access-qfw8p\") pod \"redhat-marketplace-z8cll\" (UID: \"187a0b19-a497-4133-a631-23e2c38c8e90\") " pod="openshift-marketplace/redhat-marketplace-z8cll" Jan 20 19:03:55 crc kubenswrapper[4773]: I0120 19:03:55.108210 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z8cll" Jan 20 19:03:55 crc kubenswrapper[4773]: I0120 19:03:55.560377 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-z8cll"] Jan 20 19:03:56 crc kubenswrapper[4773]: I0120 19:03:56.223253 4773 generic.go:334] "Generic (PLEG): container finished" podID="187a0b19-a497-4133-a631-23e2c38c8e90" containerID="483c3982c526c142a8f5a1481d4e11663ea81d50f899c877168dcd3282945bcc" exitCode=0 Jan 20 19:03:56 crc kubenswrapper[4773]: I0120 19:03:56.223359 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z8cll" event={"ID":"187a0b19-a497-4133-a631-23e2c38c8e90","Type":"ContainerDied","Data":"483c3982c526c142a8f5a1481d4e11663ea81d50f899c877168dcd3282945bcc"} Jan 20 19:03:56 crc kubenswrapper[4773]: I0120 19:03:56.223617 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z8cll" event={"ID":"187a0b19-a497-4133-a631-23e2c38c8e90","Type":"ContainerStarted","Data":"610302889b4174a8a42476f71a30781764aa2d78797362e8ed51738a7af8d517"} Jan 20 19:03:56 crc kubenswrapper[4773]: I0120 19:03:56.225237 4773 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 20 19:03:57 crc kubenswrapper[4773]: I0120 19:03:57.232994 4773 generic.go:334] "Generic (PLEG): container finished" podID="187a0b19-a497-4133-a631-23e2c38c8e90" containerID="b4b2515385517cccf51b0a026e3e5b232b2b4eed1a3c887124ddd45c9bb57ec5" exitCode=0 Jan 20 19:03:57 crc kubenswrapper[4773]: I0120 19:03:57.233053 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z8cll" event={"ID":"187a0b19-a497-4133-a631-23e2c38c8e90","Type":"ContainerDied","Data":"b4b2515385517cccf51b0a026e3e5b232b2b4eed1a3c887124ddd45c9bb57ec5"} Jan 20 19:03:58 crc kubenswrapper[4773]: I0120 19:03:58.170436 4773 patch_prober.go:28] interesting 
pod/machine-config-daemon-sq4x7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 19:03:58 crc kubenswrapper[4773]: I0120 19:03:58.170777 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 19:03:58 crc kubenswrapper[4773]: I0120 19:03:58.255734 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z8cll" event={"ID":"187a0b19-a497-4133-a631-23e2c38c8e90","Type":"ContainerStarted","Data":"fbfc4e23d0a72ac44b7a83c6bc7d6bed377e90e0215102917ea278e6dcbe0692"} Jan 20 19:03:58 crc kubenswrapper[4773]: I0120 19:03:58.278000 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-z8cll" podStartSLOduration=2.77191214 podStartE2EDuration="4.277983701s" podCreationTimestamp="2026-01-20 19:03:54 +0000 UTC" firstStartedPulling="2026-01-20 19:03:56.225025246 +0000 UTC m=+2029.146838270" lastFinishedPulling="2026-01-20 19:03:57.731096807 +0000 UTC m=+2030.652909831" observedRunningTime="2026-01-20 19:03:58.272466025 +0000 UTC m=+2031.194279059" watchObservedRunningTime="2026-01-20 19:03:58.277983701 +0000 UTC m=+2031.199796725" Jan 20 19:04:05 crc kubenswrapper[4773]: I0120 19:04:05.108351 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-z8cll" Jan 20 19:04:05 crc kubenswrapper[4773]: I0120 19:04:05.109022 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-z8cll" Jan 20 19:04:05 crc 
kubenswrapper[4773]: I0120 19:04:05.151224 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-z8cll" Jan 20 19:04:05 crc kubenswrapper[4773]: I0120 19:04:05.349096 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-z8cll" Jan 20 19:04:05 crc kubenswrapper[4773]: I0120 19:04:05.397295 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-z8cll"] Jan 20 19:04:07 crc kubenswrapper[4773]: I0120 19:04:07.323612 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-z8cll" podUID="187a0b19-a497-4133-a631-23e2c38c8e90" containerName="registry-server" containerID="cri-o://fbfc4e23d0a72ac44b7a83c6bc7d6bed377e90e0215102917ea278e6dcbe0692" gracePeriod=2 Jan 20 19:04:08 crc kubenswrapper[4773]: I0120 19:04:08.007781 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z8cll" Jan 20 19:04:08 crc kubenswrapper[4773]: I0120 19:04:08.185883 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qfw8p\" (UniqueName: \"kubernetes.io/projected/187a0b19-a497-4133-a631-23e2c38c8e90-kube-api-access-qfw8p\") pod \"187a0b19-a497-4133-a631-23e2c38c8e90\" (UID: \"187a0b19-a497-4133-a631-23e2c38c8e90\") " Jan 20 19:04:08 crc kubenswrapper[4773]: I0120 19:04:08.186020 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/187a0b19-a497-4133-a631-23e2c38c8e90-utilities\") pod \"187a0b19-a497-4133-a631-23e2c38c8e90\" (UID: \"187a0b19-a497-4133-a631-23e2c38c8e90\") " Jan 20 19:04:08 crc kubenswrapper[4773]: I0120 19:04:08.186126 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/187a0b19-a497-4133-a631-23e2c38c8e90-catalog-content\") pod \"187a0b19-a497-4133-a631-23e2c38c8e90\" (UID: \"187a0b19-a497-4133-a631-23e2c38c8e90\") " Jan 20 19:04:08 crc kubenswrapper[4773]: I0120 19:04:08.187473 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/187a0b19-a497-4133-a631-23e2c38c8e90-utilities" (OuterVolumeSpecName: "utilities") pod "187a0b19-a497-4133-a631-23e2c38c8e90" (UID: "187a0b19-a497-4133-a631-23e2c38c8e90"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 19:04:08 crc kubenswrapper[4773]: I0120 19:04:08.192972 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/187a0b19-a497-4133-a631-23e2c38c8e90-kube-api-access-qfw8p" (OuterVolumeSpecName: "kube-api-access-qfw8p") pod "187a0b19-a497-4133-a631-23e2c38c8e90" (UID: "187a0b19-a497-4133-a631-23e2c38c8e90"). InnerVolumeSpecName "kube-api-access-qfw8p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 19:04:08 crc kubenswrapper[4773]: I0120 19:04:08.216337 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/187a0b19-a497-4133-a631-23e2c38c8e90-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "187a0b19-a497-4133-a631-23e2c38c8e90" (UID: "187a0b19-a497-4133-a631-23e2c38c8e90"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 19:04:08 crc kubenswrapper[4773]: I0120 19:04:08.288145 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qfw8p\" (UniqueName: \"kubernetes.io/projected/187a0b19-a497-4133-a631-23e2c38c8e90-kube-api-access-qfw8p\") on node \"crc\" DevicePath \"\"" Jan 20 19:04:08 crc kubenswrapper[4773]: I0120 19:04:08.288184 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/187a0b19-a497-4133-a631-23e2c38c8e90-utilities\") on node \"crc\" DevicePath \"\"" Jan 20 19:04:08 crc kubenswrapper[4773]: I0120 19:04:08.288195 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/187a0b19-a497-4133-a631-23e2c38c8e90-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 20 19:04:08 crc kubenswrapper[4773]: I0120 19:04:08.333767 4773 generic.go:334] "Generic (PLEG): container finished" podID="187a0b19-a497-4133-a631-23e2c38c8e90" containerID="fbfc4e23d0a72ac44b7a83c6bc7d6bed377e90e0215102917ea278e6dcbe0692" exitCode=0 Jan 20 19:04:08 crc kubenswrapper[4773]: I0120 19:04:08.333812 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z8cll" event={"ID":"187a0b19-a497-4133-a631-23e2c38c8e90","Type":"ContainerDied","Data":"fbfc4e23d0a72ac44b7a83c6bc7d6bed377e90e0215102917ea278e6dcbe0692"} Jan 20 19:04:08 crc kubenswrapper[4773]: I0120 19:04:08.333830 4773 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z8cll" Jan 20 19:04:08 crc kubenswrapper[4773]: I0120 19:04:08.333848 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z8cll" event={"ID":"187a0b19-a497-4133-a631-23e2c38c8e90","Type":"ContainerDied","Data":"610302889b4174a8a42476f71a30781764aa2d78797362e8ed51738a7af8d517"} Jan 20 19:04:08 crc kubenswrapper[4773]: I0120 19:04:08.333868 4773 scope.go:117] "RemoveContainer" containerID="fbfc4e23d0a72ac44b7a83c6bc7d6bed377e90e0215102917ea278e6dcbe0692" Jan 20 19:04:08 crc kubenswrapper[4773]: I0120 19:04:08.357600 4773 scope.go:117] "RemoveContainer" containerID="b4b2515385517cccf51b0a026e3e5b232b2b4eed1a3c887124ddd45c9bb57ec5" Jan 20 19:04:08 crc kubenswrapper[4773]: I0120 19:04:08.374087 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-z8cll"] Jan 20 19:04:08 crc kubenswrapper[4773]: I0120 19:04:08.381634 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-z8cll"] Jan 20 19:04:08 crc kubenswrapper[4773]: I0120 19:04:08.398130 4773 scope.go:117] "RemoveContainer" containerID="483c3982c526c142a8f5a1481d4e11663ea81d50f899c877168dcd3282945bcc" Jan 20 19:04:08 crc kubenswrapper[4773]: I0120 19:04:08.424333 4773 scope.go:117] "RemoveContainer" containerID="fbfc4e23d0a72ac44b7a83c6bc7d6bed377e90e0215102917ea278e6dcbe0692" Jan 20 19:04:08 crc kubenswrapper[4773]: E0120 19:04:08.424783 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fbfc4e23d0a72ac44b7a83c6bc7d6bed377e90e0215102917ea278e6dcbe0692\": container with ID starting with fbfc4e23d0a72ac44b7a83c6bc7d6bed377e90e0215102917ea278e6dcbe0692 not found: ID does not exist" containerID="fbfc4e23d0a72ac44b7a83c6bc7d6bed377e90e0215102917ea278e6dcbe0692" Jan 20 19:04:08 crc kubenswrapper[4773]: I0120 19:04:08.424820 4773 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fbfc4e23d0a72ac44b7a83c6bc7d6bed377e90e0215102917ea278e6dcbe0692"} err="failed to get container status \"fbfc4e23d0a72ac44b7a83c6bc7d6bed377e90e0215102917ea278e6dcbe0692\": rpc error: code = NotFound desc = could not find container \"fbfc4e23d0a72ac44b7a83c6bc7d6bed377e90e0215102917ea278e6dcbe0692\": container with ID starting with fbfc4e23d0a72ac44b7a83c6bc7d6bed377e90e0215102917ea278e6dcbe0692 not found: ID does not exist" Jan 20 19:04:08 crc kubenswrapper[4773]: I0120 19:04:08.424848 4773 scope.go:117] "RemoveContainer" containerID="b4b2515385517cccf51b0a026e3e5b232b2b4eed1a3c887124ddd45c9bb57ec5" Jan 20 19:04:08 crc kubenswrapper[4773]: E0120 19:04:08.425273 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b4b2515385517cccf51b0a026e3e5b232b2b4eed1a3c887124ddd45c9bb57ec5\": container with ID starting with b4b2515385517cccf51b0a026e3e5b232b2b4eed1a3c887124ddd45c9bb57ec5 not found: ID does not exist" containerID="b4b2515385517cccf51b0a026e3e5b232b2b4eed1a3c887124ddd45c9bb57ec5" Jan 20 19:04:08 crc kubenswrapper[4773]: I0120 19:04:08.425306 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b4b2515385517cccf51b0a026e3e5b232b2b4eed1a3c887124ddd45c9bb57ec5"} err="failed to get container status \"b4b2515385517cccf51b0a026e3e5b232b2b4eed1a3c887124ddd45c9bb57ec5\": rpc error: code = NotFound desc = could not find container \"b4b2515385517cccf51b0a026e3e5b232b2b4eed1a3c887124ddd45c9bb57ec5\": container with ID starting with b4b2515385517cccf51b0a026e3e5b232b2b4eed1a3c887124ddd45c9bb57ec5 not found: ID does not exist" Jan 20 19:04:08 crc kubenswrapper[4773]: I0120 19:04:08.425324 4773 scope.go:117] "RemoveContainer" containerID="483c3982c526c142a8f5a1481d4e11663ea81d50f899c877168dcd3282945bcc" Jan 20 19:04:08 crc kubenswrapper[4773]: E0120 
19:04:08.425584 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"483c3982c526c142a8f5a1481d4e11663ea81d50f899c877168dcd3282945bcc\": container with ID starting with 483c3982c526c142a8f5a1481d4e11663ea81d50f899c877168dcd3282945bcc not found: ID does not exist" containerID="483c3982c526c142a8f5a1481d4e11663ea81d50f899c877168dcd3282945bcc" Jan 20 19:04:08 crc kubenswrapper[4773]: I0120 19:04:08.425615 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"483c3982c526c142a8f5a1481d4e11663ea81d50f899c877168dcd3282945bcc"} err="failed to get container status \"483c3982c526c142a8f5a1481d4e11663ea81d50f899c877168dcd3282945bcc\": rpc error: code = NotFound desc = could not find container \"483c3982c526c142a8f5a1481d4e11663ea81d50f899c877168dcd3282945bcc\": container with ID starting with 483c3982c526c142a8f5a1481d4e11663ea81d50f899c877168dcd3282945bcc not found: ID does not exist" Jan 20 19:04:09 crc kubenswrapper[4773]: I0120 19:04:09.457658 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="187a0b19-a497-4133-a631-23e2c38c8e90" path="/var/lib/kubelet/pods/187a0b19-a497-4133-a631-23e2c38c8e90/volumes" Jan 20 19:04:12 crc kubenswrapper[4773]: I0120 19:04:12.070380 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-qw5jk"] Jan 20 19:04:12 crc kubenswrapper[4773]: E0120 19:04:12.071317 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="187a0b19-a497-4133-a631-23e2c38c8e90" containerName="extract-content" Jan 20 19:04:12 crc kubenswrapper[4773]: I0120 19:04:12.071334 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="187a0b19-a497-4133-a631-23e2c38c8e90" containerName="extract-content" Jan 20 19:04:12 crc kubenswrapper[4773]: E0120 19:04:12.071353 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="187a0b19-a497-4133-a631-23e2c38c8e90" 
containerName="extract-utilities" Jan 20 19:04:12 crc kubenswrapper[4773]: I0120 19:04:12.071360 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="187a0b19-a497-4133-a631-23e2c38c8e90" containerName="extract-utilities" Jan 20 19:04:12 crc kubenswrapper[4773]: E0120 19:04:12.071367 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="187a0b19-a497-4133-a631-23e2c38c8e90" containerName="registry-server" Jan 20 19:04:12 crc kubenswrapper[4773]: I0120 19:04:12.071373 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="187a0b19-a497-4133-a631-23e2c38c8e90" containerName="registry-server" Jan 20 19:04:12 crc kubenswrapper[4773]: I0120 19:04:12.071543 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="187a0b19-a497-4133-a631-23e2c38c8e90" containerName="registry-server" Jan 20 19:04:12 crc kubenswrapper[4773]: I0120 19:04:12.073172 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qw5jk" Jan 20 19:04:12 crc kubenswrapper[4773]: I0120 19:04:12.080350 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qw5jk"] Jan 20 19:04:12 crc kubenswrapper[4773]: I0120 19:04:12.150211 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b623f23a-f663-4807-903e-2633ba066f8a-utilities\") pod \"redhat-operators-qw5jk\" (UID: \"b623f23a-f663-4807-903e-2633ba066f8a\") " pod="openshift-marketplace/redhat-operators-qw5jk" Jan 20 19:04:12 crc kubenswrapper[4773]: I0120 19:04:12.150312 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8p6px\" (UniqueName: \"kubernetes.io/projected/b623f23a-f663-4807-903e-2633ba066f8a-kube-api-access-8p6px\") pod \"redhat-operators-qw5jk\" (UID: \"b623f23a-f663-4807-903e-2633ba066f8a\") " 
pod="openshift-marketplace/redhat-operators-qw5jk" Jan 20 19:04:12 crc kubenswrapper[4773]: I0120 19:04:12.150367 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b623f23a-f663-4807-903e-2633ba066f8a-catalog-content\") pod \"redhat-operators-qw5jk\" (UID: \"b623f23a-f663-4807-903e-2633ba066f8a\") " pod="openshift-marketplace/redhat-operators-qw5jk" Jan 20 19:04:12 crc kubenswrapper[4773]: I0120 19:04:12.252329 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b623f23a-f663-4807-903e-2633ba066f8a-catalog-content\") pod \"redhat-operators-qw5jk\" (UID: \"b623f23a-f663-4807-903e-2633ba066f8a\") " pod="openshift-marketplace/redhat-operators-qw5jk" Jan 20 19:04:12 crc kubenswrapper[4773]: I0120 19:04:12.252651 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b623f23a-f663-4807-903e-2633ba066f8a-utilities\") pod \"redhat-operators-qw5jk\" (UID: \"b623f23a-f663-4807-903e-2633ba066f8a\") " pod="openshift-marketplace/redhat-operators-qw5jk" Jan 20 19:04:12 crc kubenswrapper[4773]: I0120 19:04:12.252690 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8p6px\" (UniqueName: \"kubernetes.io/projected/b623f23a-f663-4807-903e-2633ba066f8a-kube-api-access-8p6px\") pod \"redhat-operators-qw5jk\" (UID: \"b623f23a-f663-4807-903e-2633ba066f8a\") " pod="openshift-marketplace/redhat-operators-qw5jk" Jan 20 19:04:12 crc kubenswrapper[4773]: I0120 19:04:12.253135 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b623f23a-f663-4807-903e-2633ba066f8a-catalog-content\") pod \"redhat-operators-qw5jk\" (UID: \"b623f23a-f663-4807-903e-2633ba066f8a\") " 
pod="openshift-marketplace/redhat-operators-qw5jk" Jan 20 19:04:12 crc kubenswrapper[4773]: I0120 19:04:12.253268 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b623f23a-f663-4807-903e-2633ba066f8a-utilities\") pod \"redhat-operators-qw5jk\" (UID: \"b623f23a-f663-4807-903e-2633ba066f8a\") " pod="openshift-marketplace/redhat-operators-qw5jk" Jan 20 19:04:12 crc kubenswrapper[4773]: I0120 19:04:12.278142 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8p6px\" (UniqueName: \"kubernetes.io/projected/b623f23a-f663-4807-903e-2633ba066f8a-kube-api-access-8p6px\") pod \"redhat-operators-qw5jk\" (UID: \"b623f23a-f663-4807-903e-2633ba066f8a\") " pod="openshift-marketplace/redhat-operators-qw5jk" Jan 20 19:04:12 crc kubenswrapper[4773]: I0120 19:04:12.442765 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qw5jk" Jan 20 19:04:12 crc kubenswrapper[4773]: I0120 19:04:12.883867 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qw5jk"] Jan 20 19:04:13 crc kubenswrapper[4773]: I0120 19:04:13.374285 4773 generic.go:334] "Generic (PLEG): container finished" podID="b623f23a-f663-4807-903e-2633ba066f8a" containerID="cdf4c07c84cc7ad7c54dba568c02850065479a944dbe0e2822973daa5339c161" exitCode=0 Jan 20 19:04:13 crc kubenswrapper[4773]: I0120 19:04:13.374394 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qw5jk" event={"ID":"b623f23a-f663-4807-903e-2633ba066f8a","Type":"ContainerDied","Data":"cdf4c07c84cc7ad7c54dba568c02850065479a944dbe0e2822973daa5339c161"} Jan 20 19:04:13 crc kubenswrapper[4773]: I0120 19:04:13.375123 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qw5jk" 
event={"ID":"b623f23a-f663-4807-903e-2633ba066f8a","Type":"ContainerStarted","Data":"988b02aa46e97c905d50ba849c77f22209883cc3c9563e2d9c3d142f12351377"} Jan 20 19:04:14 crc kubenswrapper[4773]: I0120 19:04:14.385173 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qw5jk" event={"ID":"b623f23a-f663-4807-903e-2633ba066f8a","Type":"ContainerStarted","Data":"98a31d5329aa53bf731b998fc6aac7e21d0dd902631ce2a486ea0ca94058919e"} Jan 20 19:04:15 crc kubenswrapper[4773]: I0120 19:04:15.395028 4773 generic.go:334] "Generic (PLEG): container finished" podID="b623f23a-f663-4807-903e-2633ba066f8a" containerID="98a31d5329aa53bf731b998fc6aac7e21d0dd902631ce2a486ea0ca94058919e" exitCode=0 Jan 20 19:04:15 crc kubenswrapper[4773]: I0120 19:04:15.395072 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qw5jk" event={"ID":"b623f23a-f663-4807-903e-2633ba066f8a","Type":"ContainerDied","Data":"98a31d5329aa53bf731b998fc6aac7e21d0dd902631ce2a486ea0ca94058919e"} Jan 20 19:04:16 crc kubenswrapper[4773]: I0120 19:04:16.405737 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qw5jk" event={"ID":"b623f23a-f663-4807-903e-2633ba066f8a","Type":"ContainerStarted","Data":"6f751af477f323eaeaf4cfade84b8b921f606dd9d3a941a82b22f91f883d8f1f"} Jan 20 19:04:16 crc kubenswrapper[4773]: I0120 19:04:16.430069 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-qw5jk" podStartSLOduration=1.683221605 podStartE2EDuration="4.43004898s" podCreationTimestamp="2026-01-20 19:04:12 +0000 UTC" firstStartedPulling="2026-01-20 19:04:13.376442444 +0000 UTC m=+2046.298255468" lastFinishedPulling="2026-01-20 19:04:16.123269819 +0000 UTC m=+2049.045082843" observedRunningTime="2026-01-20 19:04:16.421833013 +0000 UTC m=+2049.343646057" watchObservedRunningTime="2026-01-20 19:04:16.43004898 +0000 UTC m=+2049.351862004" 
Jan 20 19:04:22 crc kubenswrapper[4773]: I0120 19:04:22.443969 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qw5jk" Jan 20 19:04:22 crc kubenswrapper[4773]: I0120 19:04:22.445108 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-qw5jk" Jan 20 19:04:22 crc kubenswrapper[4773]: I0120 19:04:22.500632 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qw5jk" Jan 20 19:04:22 crc kubenswrapper[4773]: I0120 19:04:22.590228 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-qw5jk" Jan 20 19:04:22 crc kubenswrapper[4773]: I0120 19:04:22.739747 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qw5jk"] Jan 20 19:04:24 crc kubenswrapper[4773]: I0120 19:04:24.473142 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-qw5jk" podUID="b623f23a-f663-4807-903e-2633ba066f8a" containerName="registry-server" containerID="cri-o://6f751af477f323eaeaf4cfade84b8b921f606dd9d3a941a82b22f91f883d8f1f" gracePeriod=2 Jan 20 19:04:25 crc kubenswrapper[4773]: I0120 19:04:25.018917 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qw5jk" Jan 20 19:04:25 crc kubenswrapper[4773]: I0120 19:04:25.092337 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b623f23a-f663-4807-903e-2633ba066f8a-catalog-content\") pod \"b623f23a-f663-4807-903e-2633ba066f8a\" (UID: \"b623f23a-f663-4807-903e-2633ba066f8a\") " Jan 20 19:04:25 crc kubenswrapper[4773]: I0120 19:04:25.092788 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8p6px\" (UniqueName: \"kubernetes.io/projected/b623f23a-f663-4807-903e-2633ba066f8a-kube-api-access-8p6px\") pod \"b623f23a-f663-4807-903e-2633ba066f8a\" (UID: \"b623f23a-f663-4807-903e-2633ba066f8a\") " Jan 20 19:04:25 crc kubenswrapper[4773]: I0120 19:04:25.093107 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b623f23a-f663-4807-903e-2633ba066f8a-utilities\") pod \"b623f23a-f663-4807-903e-2633ba066f8a\" (UID: \"b623f23a-f663-4807-903e-2633ba066f8a\") " Jan 20 19:04:25 crc kubenswrapper[4773]: I0120 19:04:25.093911 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b623f23a-f663-4807-903e-2633ba066f8a-utilities" (OuterVolumeSpecName: "utilities") pod "b623f23a-f663-4807-903e-2633ba066f8a" (UID: "b623f23a-f663-4807-903e-2633ba066f8a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 19:04:25 crc kubenswrapper[4773]: I0120 19:04:25.112043 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b623f23a-f663-4807-903e-2633ba066f8a-kube-api-access-8p6px" (OuterVolumeSpecName: "kube-api-access-8p6px") pod "b623f23a-f663-4807-903e-2633ba066f8a" (UID: "b623f23a-f663-4807-903e-2633ba066f8a"). InnerVolumeSpecName "kube-api-access-8p6px". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 19:04:25 crc kubenswrapper[4773]: I0120 19:04:25.194798 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8p6px\" (UniqueName: \"kubernetes.io/projected/b623f23a-f663-4807-903e-2633ba066f8a-kube-api-access-8p6px\") on node \"crc\" DevicePath \"\"" Jan 20 19:04:25 crc kubenswrapper[4773]: I0120 19:04:25.194843 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b623f23a-f663-4807-903e-2633ba066f8a-utilities\") on node \"crc\" DevicePath \"\"" Jan 20 19:04:25 crc kubenswrapper[4773]: I0120 19:04:25.481965 4773 generic.go:334] "Generic (PLEG): container finished" podID="b623f23a-f663-4807-903e-2633ba066f8a" containerID="6f751af477f323eaeaf4cfade84b8b921f606dd9d3a941a82b22f91f883d8f1f" exitCode=0 Jan 20 19:04:25 crc kubenswrapper[4773]: I0120 19:04:25.482006 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qw5jk" event={"ID":"b623f23a-f663-4807-903e-2633ba066f8a","Type":"ContainerDied","Data":"6f751af477f323eaeaf4cfade84b8b921f606dd9d3a941a82b22f91f883d8f1f"} Jan 20 19:04:25 crc kubenswrapper[4773]: I0120 19:04:25.482036 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qw5jk" event={"ID":"b623f23a-f663-4807-903e-2633ba066f8a","Type":"ContainerDied","Data":"988b02aa46e97c905d50ba849c77f22209883cc3c9563e2d9c3d142f12351377"} Jan 20 19:04:25 crc kubenswrapper[4773]: I0120 19:04:25.482053 4773 scope.go:117] "RemoveContainer" containerID="6f751af477f323eaeaf4cfade84b8b921f606dd9d3a941a82b22f91f883d8f1f" Jan 20 19:04:25 crc kubenswrapper[4773]: I0120 19:04:25.482102 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qw5jk" Jan 20 19:04:25 crc kubenswrapper[4773]: I0120 19:04:25.500105 4773 scope.go:117] "RemoveContainer" containerID="98a31d5329aa53bf731b998fc6aac7e21d0dd902631ce2a486ea0ca94058919e" Jan 20 19:04:25 crc kubenswrapper[4773]: I0120 19:04:25.534419 4773 scope.go:117] "RemoveContainer" containerID="cdf4c07c84cc7ad7c54dba568c02850065479a944dbe0e2822973daa5339c161" Jan 20 19:04:25 crc kubenswrapper[4773]: I0120 19:04:25.566140 4773 scope.go:117] "RemoveContainer" containerID="6f751af477f323eaeaf4cfade84b8b921f606dd9d3a941a82b22f91f883d8f1f" Jan 20 19:04:25 crc kubenswrapper[4773]: E0120 19:04:25.566517 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f751af477f323eaeaf4cfade84b8b921f606dd9d3a941a82b22f91f883d8f1f\": container with ID starting with 6f751af477f323eaeaf4cfade84b8b921f606dd9d3a941a82b22f91f883d8f1f not found: ID does not exist" containerID="6f751af477f323eaeaf4cfade84b8b921f606dd9d3a941a82b22f91f883d8f1f" Jan 20 19:04:25 crc kubenswrapper[4773]: I0120 19:04:25.566550 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f751af477f323eaeaf4cfade84b8b921f606dd9d3a941a82b22f91f883d8f1f"} err="failed to get container status \"6f751af477f323eaeaf4cfade84b8b921f606dd9d3a941a82b22f91f883d8f1f\": rpc error: code = NotFound desc = could not find container \"6f751af477f323eaeaf4cfade84b8b921f606dd9d3a941a82b22f91f883d8f1f\": container with ID starting with 6f751af477f323eaeaf4cfade84b8b921f606dd9d3a941a82b22f91f883d8f1f not found: ID does not exist" Jan 20 19:04:25 crc kubenswrapper[4773]: I0120 19:04:25.566569 4773 scope.go:117] "RemoveContainer" containerID="98a31d5329aa53bf731b998fc6aac7e21d0dd902631ce2a486ea0ca94058919e" Jan 20 19:04:25 crc kubenswrapper[4773]: E0120 19:04:25.567006 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"98a31d5329aa53bf731b998fc6aac7e21d0dd902631ce2a486ea0ca94058919e\": container with ID starting with 98a31d5329aa53bf731b998fc6aac7e21d0dd902631ce2a486ea0ca94058919e not found: ID does not exist" containerID="98a31d5329aa53bf731b998fc6aac7e21d0dd902631ce2a486ea0ca94058919e" Jan 20 19:04:25 crc kubenswrapper[4773]: I0120 19:04:25.567091 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98a31d5329aa53bf731b998fc6aac7e21d0dd902631ce2a486ea0ca94058919e"} err="failed to get container status \"98a31d5329aa53bf731b998fc6aac7e21d0dd902631ce2a486ea0ca94058919e\": rpc error: code = NotFound desc = could not find container \"98a31d5329aa53bf731b998fc6aac7e21d0dd902631ce2a486ea0ca94058919e\": container with ID starting with 98a31d5329aa53bf731b998fc6aac7e21d0dd902631ce2a486ea0ca94058919e not found: ID does not exist" Jan 20 19:04:25 crc kubenswrapper[4773]: I0120 19:04:25.567158 4773 scope.go:117] "RemoveContainer" containerID="cdf4c07c84cc7ad7c54dba568c02850065479a944dbe0e2822973daa5339c161" Jan 20 19:04:25 crc kubenswrapper[4773]: E0120 19:04:25.567663 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cdf4c07c84cc7ad7c54dba568c02850065479a944dbe0e2822973daa5339c161\": container with ID starting with cdf4c07c84cc7ad7c54dba568c02850065479a944dbe0e2822973daa5339c161 not found: ID does not exist" containerID="cdf4c07c84cc7ad7c54dba568c02850065479a944dbe0e2822973daa5339c161" Jan 20 19:04:25 crc kubenswrapper[4773]: I0120 19:04:25.567736 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cdf4c07c84cc7ad7c54dba568c02850065479a944dbe0e2822973daa5339c161"} err="failed to get container status \"cdf4c07c84cc7ad7c54dba568c02850065479a944dbe0e2822973daa5339c161\": rpc error: code = NotFound desc = could not find container 
\"cdf4c07c84cc7ad7c54dba568c02850065479a944dbe0e2822973daa5339c161\": container with ID starting with cdf4c07c84cc7ad7c54dba568c02850065479a944dbe0e2822973daa5339c161 not found: ID does not exist" Jan 20 19:04:26 crc kubenswrapper[4773]: I0120 19:04:26.405508 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b623f23a-f663-4807-903e-2633ba066f8a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b623f23a-f663-4807-903e-2633ba066f8a" (UID: "b623f23a-f663-4807-903e-2633ba066f8a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 19:04:26 crc kubenswrapper[4773]: I0120 19:04:26.418398 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b623f23a-f663-4807-903e-2633ba066f8a-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 20 19:04:26 crc kubenswrapper[4773]: I0120 19:04:26.722875 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qw5jk"] Jan 20 19:04:26 crc kubenswrapper[4773]: I0120 19:04:26.734290 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-qw5jk"] Jan 20 19:04:27 crc kubenswrapper[4773]: I0120 19:04:27.457845 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b623f23a-f663-4807-903e-2633ba066f8a" path="/var/lib/kubelet/pods/b623f23a-f663-4807-903e-2633ba066f8a/volumes" Jan 20 19:04:28 crc kubenswrapper[4773]: I0120 19:04:28.170539 4773 patch_prober.go:28] interesting pod/machine-config-daemon-sq4x7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 19:04:28 crc kubenswrapper[4773]: I0120 19:04:28.170603 4773 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 19:04:28 crc kubenswrapper[4773]: I0120 19:04:28.170648 4773 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" Jan 20 19:04:28 crc kubenswrapper[4773]: I0120 19:04:28.171422 4773 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7ef51181ae4526cf4dd383022f5cd069e75cd47240d82e55ea2f8b59a5c7eef7"} pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 20 19:04:28 crc kubenswrapper[4773]: I0120 19:04:28.171478 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" containerID="cri-o://7ef51181ae4526cf4dd383022f5cd069e75cd47240d82e55ea2f8b59a5c7eef7" gracePeriod=600 Jan 20 19:04:29 crc kubenswrapper[4773]: I0120 19:04:29.518085 4773 generic.go:334] "Generic (PLEG): container finished" podID="1ddd934f-f012-4083-b5e6-b99711071621" containerID="7ef51181ae4526cf4dd383022f5cd069e75cd47240d82e55ea2f8b59a5c7eef7" exitCode=0 Jan 20 19:04:29 crc kubenswrapper[4773]: I0120 19:04:29.518149 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" event={"ID":"1ddd934f-f012-4083-b5e6-b99711071621","Type":"ContainerDied","Data":"7ef51181ae4526cf4dd383022f5cd069e75cd47240d82e55ea2f8b59a5c7eef7"} Jan 20 19:04:29 crc kubenswrapper[4773]: I0120 19:04:29.520268 4773 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" event={"ID":"1ddd934f-f012-4083-b5e6-b99711071621","Type":"ContainerStarted","Data":"a7dbd0a912d949d6c4eebc0124401dbd7713c051c7fc8f0eefcca46927202be1"} Jan 20 19:04:29 crc kubenswrapper[4773]: I0120 19:04:29.520348 4773 scope.go:117] "RemoveContainer" containerID="ce9aed1d163b3887a1d1032f9c03b611a4ecaff0809c53905de6852740ee0630" Jan 20 19:04:58 crc kubenswrapper[4773]: E0120 19:04:58.659378 4773 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.39:37860->38.102.83.39:34695: write tcp 38.102.83.39:37860->38.102.83.39:34695: write: broken pipe Jan 20 19:05:00 crc kubenswrapper[4773]: I0120 19:05:00.721489 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-lcnwg"] Jan 20 19:05:00 crc kubenswrapper[4773]: E0120 19:05:00.722061 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b623f23a-f663-4807-903e-2633ba066f8a" containerName="registry-server" Jan 20 19:05:00 crc kubenswrapper[4773]: I0120 19:05:00.722073 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="b623f23a-f663-4807-903e-2633ba066f8a" containerName="registry-server" Jan 20 19:05:00 crc kubenswrapper[4773]: E0120 19:05:00.722091 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b623f23a-f663-4807-903e-2633ba066f8a" containerName="extract-content" Jan 20 19:05:00 crc kubenswrapper[4773]: I0120 19:05:00.722097 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="b623f23a-f663-4807-903e-2633ba066f8a" containerName="extract-content" Jan 20 19:05:00 crc kubenswrapper[4773]: E0120 19:05:00.722116 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b623f23a-f663-4807-903e-2633ba066f8a" containerName="extract-utilities" Jan 20 19:05:00 crc kubenswrapper[4773]: I0120 19:05:00.722122 4773 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="b623f23a-f663-4807-903e-2633ba066f8a" containerName="extract-utilities" Jan 20 19:05:00 crc kubenswrapper[4773]: I0120 19:05:00.722283 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="b623f23a-f663-4807-903e-2633ba066f8a" containerName="registry-server" Jan 20 19:05:00 crc kubenswrapper[4773]: I0120 19:05:00.723493 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lcnwg" Jan 20 19:05:00 crc kubenswrapper[4773]: I0120 19:05:00.731795 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lcnwg"] Jan 20 19:05:00 crc kubenswrapper[4773]: I0120 19:05:00.753964 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7f96\" (UniqueName: \"kubernetes.io/projected/461d1293-e47b-49e3-a6eb-bef90ad29792-kube-api-access-t7f96\") pod \"certified-operators-lcnwg\" (UID: \"461d1293-e47b-49e3-a6eb-bef90ad29792\") " pod="openshift-marketplace/certified-operators-lcnwg" Jan 20 19:05:00 crc kubenswrapper[4773]: I0120 19:05:00.754599 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/461d1293-e47b-49e3-a6eb-bef90ad29792-catalog-content\") pod \"certified-operators-lcnwg\" (UID: \"461d1293-e47b-49e3-a6eb-bef90ad29792\") " pod="openshift-marketplace/certified-operators-lcnwg" Jan 20 19:05:00 crc kubenswrapper[4773]: I0120 19:05:00.754639 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/461d1293-e47b-49e3-a6eb-bef90ad29792-utilities\") pod \"certified-operators-lcnwg\" (UID: \"461d1293-e47b-49e3-a6eb-bef90ad29792\") " pod="openshift-marketplace/certified-operators-lcnwg" Jan 20 19:05:00 crc kubenswrapper[4773]: I0120 19:05:00.856366 4773 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/461d1293-e47b-49e3-a6eb-bef90ad29792-utilities\") pod \"certified-operators-lcnwg\" (UID: \"461d1293-e47b-49e3-a6eb-bef90ad29792\") " pod="openshift-marketplace/certified-operators-lcnwg" Jan 20 19:05:00 crc kubenswrapper[4773]: I0120 19:05:00.856845 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7f96\" (UniqueName: \"kubernetes.io/projected/461d1293-e47b-49e3-a6eb-bef90ad29792-kube-api-access-t7f96\") pod \"certified-operators-lcnwg\" (UID: \"461d1293-e47b-49e3-a6eb-bef90ad29792\") " pod="openshift-marketplace/certified-operators-lcnwg" Jan 20 19:05:00 crc kubenswrapper[4773]: I0120 19:05:00.856887 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/461d1293-e47b-49e3-a6eb-bef90ad29792-catalog-content\") pod \"certified-operators-lcnwg\" (UID: \"461d1293-e47b-49e3-a6eb-bef90ad29792\") " pod="openshift-marketplace/certified-operators-lcnwg" Jan 20 19:05:00 crc kubenswrapper[4773]: I0120 19:05:00.856987 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/461d1293-e47b-49e3-a6eb-bef90ad29792-utilities\") pod \"certified-operators-lcnwg\" (UID: \"461d1293-e47b-49e3-a6eb-bef90ad29792\") " pod="openshift-marketplace/certified-operators-lcnwg" Jan 20 19:05:00 crc kubenswrapper[4773]: I0120 19:05:00.857393 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/461d1293-e47b-49e3-a6eb-bef90ad29792-catalog-content\") pod \"certified-operators-lcnwg\" (UID: \"461d1293-e47b-49e3-a6eb-bef90ad29792\") " pod="openshift-marketplace/certified-operators-lcnwg" Jan 20 19:05:00 crc kubenswrapper[4773]: I0120 19:05:00.876825 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-t7f96\" (UniqueName: \"kubernetes.io/projected/461d1293-e47b-49e3-a6eb-bef90ad29792-kube-api-access-t7f96\") pod \"certified-operators-lcnwg\" (UID: \"461d1293-e47b-49e3-a6eb-bef90ad29792\") " pod="openshift-marketplace/certified-operators-lcnwg" Jan 20 19:05:01 crc kubenswrapper[4773]: I0120 19:05:01.056973 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lcnwg" Jan 20 19:05:01 crc kubenswrapper[4773]: I0120 19:05:01.575318 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-lcnwg"] Jan 20 19:05:01 crc kubenswrapper[4773]: W0120 19:05:01.600828 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod461d1293_e47b_49e3_a6eb_bef90ad29792.slice/crio-c560847c004ebe8cff1a891fc8db5c8ada8ba7a9bd36df8c174dbd6db9fb49ec WatchSource:0}: Error finding container c560847c004ebe8cff1a891fc8db5c8ada8ba7a9bd36df8c174dbd6db9fb49ec: Status 404 returned error can't find the container with id c560847c004ebe8cff1a891fc8db5c8ada8ba7a9bd36df8c174dbd6db9fb49ec Jan 20 19:05:01 crc kubenswrapper[4773]: I0120 19:05:01.783465 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lcnwg" event={"ID":"461d1293-e47b-49e3-a6eb-bef90ad29792","Type":"ContainerStarted","Data":"c0c9efbcbb29f11ffeeaa654ad3501e849067a00d9ab4179c8e180d669a6f855"} Jan 20 19:05:01 crc kubenswrapper[4773]: I0120 19:05:01.783792 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lcnwg" event={"ID":"461d1293-e47b-49e3-a6eb-bef90ad29792","Type":"ContainerStarted","Data":"c560847c004ebe8cff1a891fc8db5c8ada8ba7a9bd36df8c174dbd6db9fb49ec"} Jan 20 19:05:02 crc kubenswrapper[4773]: I0120 19:05:02.794120 4773 generic.go:334] "Generic (PLEG): container finished" podID="461d1293-e47b-49e3-a6eb-bef90ad29792" 
containerID="c0c9efbcbb29f11ffeeaa654ad3501e849067a00d9ab4179c8e180d669a6f855" exitCode=0 Jan 20 19:05:02 crc kubenswrapper[4773]: I0120 19:05:02.794160 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lcnwg" event={"ID":"461d1293-e47b-49e3-a6eb-bef90ad29792","Type":"ContainerDied","Data":"c0c9efbcbb29f11ffeeaa654ad3501e849067a00d9ab4179c8e180d669a6f855"} Jan 20 19:05:03 crc kubenswrapper[4773]: I0120 19:05:03.805663 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lcnwg" event={"ID":"461d1293-e47b-49e3-a6eb-bef90ad29792","Type":"ContainerStarted","Data":"0a402e8bd328459e72644f24e39bfcc9c03c3ce27700efa62cd94ee9cf817f47"} Jan 20 19:05:04 crc kubenswrapper[4773]: I0120 19:05:04.827362 4773 generic.go:334] "Generic (PLEG): container finished" podID="461d1293-e47b-49e3-a6eb-bef90ad29792" containerID="0a402e8bd328459e72644f24e39bfcc9c03c3ce27700efa62cd94ee9cf817f47" exitCode=0 Jan 20 19:05:04 crc kubenswrapper[4773]: I0120 19:05:04.827472 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lcnwg" event={"ID":"461d1293-e47b-49e3-a6eb-bef90ad29792","Type":"ContainerDied","Data":"0a402e8bd328459e72644f24e39bfcc9c03c3ce27700efa62cd94ee9cf817f47"} Jan 20 19:05:05 crc kubenswrapper[4773]: I0120 19:05:05.841672 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lcnwg" event={"ID":"461d1293-e47b-49e3-a6eb-bef90ad29792","Type":"ContainerStarted","Data":"5d71369ebc3f5f2f6c05ea456ea3c21f83b8dbbbebeac0abb1320a0e68e49627"} Jan 20 19:05:11 crc kubenswrapper[4773]: I0120 19:05:11.057377 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-lcnwg" Jan 20 19:05:11 crc kubenswrapper[4773]: I0120 19:05:11.058150 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/certified-operators-lcnwg" Jan 20 19:05:11 crc kubenswrapper[4773]: I0120 19:05:11.214194 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-lcnwg" Jan 20 19:05:11 crc kubenswrapper[4773]: I0120 19:05:11.236717 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-lcnwg" podStartSLOduration=8.673459784 podStartE2EDuration="11.236703083s" podCreationTimestamp="2026-01-20 19:05:00 +0000 UTC" firstStartedPulling="2026-01-20 19:05:02.796147026 +0000 UTC m=+2095.717960050" lastFinishedPulling="2026-01-20 19:05:05.359390325 +0000 UTC m=+2098.281203349" observedRunningTime="2026-01-20 19:05:05.860595191 +0000 UTC m=+2098.782408235" watchObservedRunningTime="2026-01-20 19:05:11.236703083 +0000 UTC m=+2104.158516107" Jan 20 19:05:11 crc kubenswrapper[4773]: I0120 19:05:11.943556 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-lcnwg" Jan 20 19:05:13 crc kubenswrapper[4773]: I0120 19:05:13.289065 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lcnwg"] Jan 20 19:05:13 crc kubenswrapper[4773]: I0120 19:05:13.913239 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-lcnwg" podUID="461d1293-e47b-49e3-a6eb-bef90ad29792" containerName="registry-server" containerID="cri-o://5d71369ebc3f5f2f6c05ea456ea3c21f83b8dbbbebeac0abb1320a0e68e49627" gracePeriod=2 Jan 20 19:05:14 crc kubenswrapper[4773]: I0120 19:05:14.606371 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-lcnwg" Jan 20 19:05:14 crc kubenswrapper[4773]: I0120 19:05:14.745046 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t7f96\" (UniqueName: \"kubernetes.io/projected/461d1293-e47b-49e3-a6eb-bef90ad29792-kube-api-access-t7f96\") pod \"461d1293-e47b-49e3-a6eb-bef90ad29792\" (UID: \"461d1293-e47b-49e3-a6eb-bef90ad29792\") " Jan 20 19:05:14 crc kubenswrapper[4773]: I0120 19:05:14.745142 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/461d1293-e47b-49e3-a6eb-bef90ad29792-catalog-content\") pod \"461d1293-e47b-49e3-a6eb-bef90ad29792\" (UID: \"461d1293-e47b-49e3-a6eb-bef90ad29792\") " Jan 20 19:05:14 crc kubenswrapper[4773]: I0120 19:05:14.745230 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/461d1293-e47b-49e3-a6eb-bef90ad29792-utilities\") pod \"461d1293-e47b-49e3-a6eb-bef90ad29792\" (UID: \"461d1293-e47b-49e3-a6eb-bef90ad29792\") " Jan 20 19:05:14 crc kubenswrapper[4773]: I0120 19:05:14.746191 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/461d1293-e47b-49e3-a6eb-bef90ad29792-utilities" (OuterVolumeSpecName: "utilities") pod "461d1293-e47b-49e3-a6eb-bef90ad29792" (UID: "461d1293-e47b-49e3-a6eb-bef90ad29792"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 19:05:14 crc kubenswrapper[4773]: I0120 19:05:14.752238 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/461d1293-e47b-49e3-a6eb-bef90ad29792-kube-api-access-t7f96" (OuterVolumeSpecName: "kube-api-access-t7f96") pod "461d1293-e47b-49e3-a6eb-bef90ad29792" (UID: "461d1293-e47b-49e3-a6eb-bef90ad29792"). InnerVolumeSpecName "kube-api-access-t7f96". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 19:05:14 crc kubenswrapper[4773]: I0120 19:05:14.790862 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/461d1293-e47b-49e3-a6eb-bef90ad29792-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "461d1293-e47b-49e3-a6eb-bef90ad29792" (UID: "461d1293-e47b-49e3-a6eb-bef90ad29792"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 19:05:14 crc kubenswrapper[4773]: I0120 19:05:14.847242 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t7f96\" (UniqueName: \"kubernetes.io/projected/461d1293-e47b-49e3-a6eb-bef90ad29792-kube-api-access-t7f96\") on node \"crc\" DevicePath \"\"" Jan 20 19:05:14 crc kubenswrapper[4773]: I0120 19:05:14.847278 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/461d1293-e47b-49e3-a6eb-bef90ad29792-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 20 19:05:14 crc kubenswrapper[4773]: I0120 19:05:14.847287 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/461d1293-e47b-49e3-a6eb-bef90ad29792-utilities\") on node \"crc\" DevicePath \"\"" Jan 20 19:05:14 crc kubenswrapper[4773]: I0120 19:05:14.922797 4773 generic.go:334] "Generic (PLEG): container finished" podID="461d1293-e47b-49e3-a6eb-bef90ad29792" containerID="5d71369ebc3f5f2f6c05ea456ea3c21f83b8dbbbebeac0abb1320a0e68e49627" exitCode=0 Jan 20 19:05:14 crc kubenswrapper[4773]: I0120 19:05:14.922842 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lcnwg" event={"ID":"461d1293-e47b-49e3-a6eb-bef90ad29792","Type":"ContainerDied","Data":"5d71369ebc3f5f2f6c05ea456ea3c21f83b8dbbbebeac0abb1320a0e68e49627"} Jan 20 19:05:14 crc kubenswrapper[4773]: I0120 19:05:14.922865 4773 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/certified-operators-lcnwg" Jan 20 19:05:14 crc kubenswrapper[4773]: I0120 19:05:14.922886 4773 scope.go:117] "RemoveContainer" containerID="5d71369ebc3f5f2f6c05ea456ea3c21f83b8dbbbebeac0abb1320a0e68e49627" Jan 20 19:05:14 crc kubenswrapper[4773]: I0120 19:05:14.922873 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-lcnwg" event={"ID":"461d1293-e47b-49e3-a6eb-bef90ad29792","Type":"ContainerDied","Data":"c560847c004ebe8cff1a891fc8db5c8ada8ba7a9bd36df8c174dbd6db9fb49ec"} Jan 20 19:05:14 crc kubenswrapper[4773]: I0120 19:05:14.942114 4773 scope.go:117] "RemoveContainer" containerID="0a402e8bd328459e72644f24e39bfcc9c03c3ce27700efa62cd94ee9cf817f47" Jan 20 19:05:14 crc kubenswrapper[4773]: I0120 19:05:14.966074 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-lcnwg"] Jan 20 19:05:14 crc kubenswrapper[4773]: I0120 19:05:14.969585 4773 scope.go:117] "RemoveContainer" containerID="c0c9efbcbb29f11ffeeaa654ad3501e849067a00d9ab4179c8e180d669a6f855" Jan 20 19:05:14 crc kubenswrapper[4773]: I0120 19:05:14.972719 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-lcnwg"] Jan 20 19:05:15 crc kubenswrapper[4773]: I0120 19:05:15.002713 4773 scope.go:117] "RemoveContainer" containerID="5d71369ebc3f5f2f6c05ea456ea3c21f83b8dbbbebeac0abb1320a0e68e49627" Jan 20 19:05:15 crc kubenswrapper[4773]: E0120 19:05:15.003191 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d71369ebc3f5f2f6c05ea456ea3c21f83b8dbbbebeac0abb1320a0e68e49627\": container with ID starting with 5d71369ebc3f5f2f6c05ea456ea3c21f83b8dbbbebeac0abb1320a0e68e49627 not found: ID does not exist" containerID="5d71369ebc3f5f2f6c05ea456ea3c21f83b8dbbbebeac0abb1320a0e68e49627" Jan 20 19:05:15 crc kubenswrapper[4773]: I0120 19:05:15.003231 
4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d71369ebc3f5f2f6c05ea456ea3c21f83b8dbbbebeac0abb1320a0e68e49627"} err="failed to get container status \"5d71369ebc3f5f2f6c05ea456ea3c21f83b8dbbbebeac0abb1320a0e68e49627\": rpc error: code = NotFound desc = could not find container \"5d71369ebc3f5f2f6c05ea456ea3c21f83b8dbbbebeac0abb1320a0e68e49627\": container with ID starting with 5d71369ebc3f5f2f6c05ea456ea3c21f83b8dbbbebeac0abb1320a0e68e49627 not found: ID does not exist" Jan 20 19:05:15 crc kubenswrapper[4773]: I0120 19:05:15.003256 4773 scope.go:117] "RemoveContainer" containerID="0a402e8bd328459e72644f24e39bfcc9c03c3ce27700efa62cd94ee9cf817f47" Jan 20 19:05:15 crc kubenswrapper[4773]: E0120 19:05:15.003614 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a402e8bd328459e72644f24e39bfcc9c03c3ce27700efa62cd94ee9cf817f47\": container with ID starting with 0a402e8bd328459e72644f24e39bfcc9c03c3ce27700efa62cd94ee9cf817f47 not found: ID does not exist" containerID="0a402e8bd328459e72644f24e39bfcc9c03c3ce27700efa62cd94ee9cf817f47" Jan 20 19:05:15 crc kubenswrapper[4773]: I0120 19:05:15.003644 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a402e8bd328459e72644f24e39bfcc9c03c3ce27700efa62cd94ee9cf817f47"} err="failed to get container status \"0a402e8bd328459e72644f24e39bfcc9c03c3ce27700efa62cd94ee9cf817f47\": rpc error: code = NotFound desc = could not find container \"0a402e8bd328459e72644f24e39bfcc9c03c3ce27700efa62cd94ee9cf817f47\": container with ID starting with 0a402e8bd328459e72644f24e39bfcc9c03c3ce27700efa62cd94ee9cf817f47 not found: ID does not exist" Jan 20 19:05:15 crc kubenswrapper[4773]: I0120 19:05:15.003666 4773 scope.go:117] "RemoveContainer" containerID="c0c9efbcbb29f11ffeeaa654ad3501e849067a00d9ab4179c8e180d669a6f855" Jan 20 19:05:15 crc kubenswrapper[4773]: E0120 
19:05:15.004016 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0c9efbcbb29f11ffeeaa654ad3501e849067a00d9ab4179c8e180d669a6f855\": container with ID starting with c0c9efbcbb29f11ffeeaa654ad3501e849067a00d9ab4179c8e180d669a6f855 not found: ID does not exist" containerID="c0c9efbcbb29f11ffeeaa654ad3501e849067a00d9ab4179c8e180d669a6f855" Jan 20 19:05:15 crc kubenswrapper[4773]: I0120 19:05:15.004038 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0c9efbcbb29f11ffeeaa654ad3501e849067a00d9ab4179c8e180d669a6f855"} err="failed to get container status \"c0c9efbcbb29f11ffeeaa654ad3501e849067a00d9ab4179c8e180d669a6f855\": rpc error: code = NotFound desc = could not find container \"c0c9efbcbb29f11ffeeaa654ad3501e849067a00d9ab4179c8e180d669a6f855\": container with ID starting with c0c9efbcbb29f11ffeeaa654ad3501e849067a00d9ab4179c8e180d669a6f855 not found: ID does not exist" Jan 20 19:05:15 crc kubenswrapper[4773]: I0120 19:05:15.465313 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="461d1293-e47b-49e3-a6eb-bef90ad29792" path="/var/lib/kubelet/pods/461d1293-e47b-49e3-a6eb-bef90ad29792/volumes" Jan 20 19:05:23 crc kubenswrapper[4773]: I0120 19:05:23.501353 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-khn66"] Jan 20 19:05:23 crc kubenswrapper[4773]: I0120 19:05:23.508478 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fqth8"] Jan 20 19:05:23 crc kubenswrapper[4773]: I0120 19:05:23.517246 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-tgqvf"] Jan 20 19:05:23 crc kubenswrapper[4773]: I0120 19:05:23.524721 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-jhxwd"] Jan 20 
19:05:23 crc kubenswrapper[4773]: I0120 19:05:23.531337 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-87xkt"] Jan 20 19:05:23 crc kubenswrapper[4773]: I0120 19:05:23.538871 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-khn66"] Jan 20 19:05:23 crc kubenswrapper[4773]: I0120 19:05:23.545765 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-jhxwd"] Jan 20 19:05:23 crc kubenswrapper[4773]: I0120 19:05:23.557048 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-tgqvf"] Jan 20 19:05:23 crc kubenswrapper[4773]: I0120 19:05:23.564318 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-fqth8"] Jan 20 19:05:23 crc kubenswrapper[4773]: I0120 19:05:23.586911 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-87xkt"] Jan 20 19:05:23 crc kubenswrapper[4773]: I0120 19:05:23.593415 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-t89z8"] Jan 20 19:05:23 crc kubenswrapper[4773]: I0120 19:05:23.601526 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-q7gtj"] Jan 20 19:05:23 crc kubenswrapper[4773]: I0120 19:05:23.607715 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-t89z8"] Jan 20 19:05:23 crc kubenswrapper[4773]: I0120 19:05:23.614046 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vgq6v"] Jan 20 19:05:23 crc kubenswrapper[4773]: I0120 19:05:23.620280 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-q7gtj"] Jan 20 19:05:23 crc kubenswrapper[4773]: I0120 19:05:23.626963 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-dzlzw"] Jan 20 19:05:23 crc kubenswrapper[4773]: I0120 19:05:23.633344 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-vgq6v"] Jan 20 19:05:23 crc kubenswrapper[4773]: I0120 19:05:23.639260 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-dzlzw"] Jan 20 19:05:23 crc kubenswrapper[4773]: I0120 19:05:23.645703 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7cmjw"] Jan 20 19:05:23 crc kubenswrapper[4773]: I0120 19:05:23.651475 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-7cmjw"] Jan 20 19:05:25 crc kubenswrapper[4773]: I0120 19:05:25.457221 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="02289c77-b6e5-4419-8dc4-597648db0e01" path="/var/lib/kubelet/pods/02289c77-b6e5-4419-8dc4-597648db0e01/volumes" Jan 20 19:05:25 crc kubenswrapper[4773]: I0120 19:05:25.457892 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0dd5218f-c5ee-4e0b-83bb-ab17d1887596" path="/var/lib/kubelet/pods/0dd5218f-c5ee-4e0b-83bb-ab17d1887596/volumes" Jan 20 19:05:25 crc kubenswrapper[4773]: I0120 19:05:25.458562 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10bff0cd-1771-46dc-87e8-a7ce91f520c8" path="/var/lib/kubelet/pods/10bff0cd-1771-46dc-87e8-a7ce91f520c8/volumes" Jan 20 19:05:25 crc kubenswrapper[4773]: I0120 19:05:25.459236 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ef8874c-43f1-43c9-ac7c-0af15c430e89" 
path="/var/lib/kubelet/pods/3ef8874c-43f1-43c9-ac7c-0af15c430e89/volumes" Jan 20 19:05:25 crc kubenswrapper[4773]: I0120 19:05:25.460458 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4966c538-33c7-4d94-9705-0081ce04e9ef" path="/var/lib/kubelet/pods/4966c538-33c7-4d94-9705-0081ce04e9ef/volumes" Jan 20 19:05:25 crc kubenswrapper[4773]: I0120 19:05:25.461247 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4be003e8-2c0f-45c8-944d-b126c8cbd1b0" path="/var/lib/kubelet/pods/4be003e8-2c0f-45c8-944d-b126c8cbd1b0/volumes" Jan 20 19:05:25 crc kubenswrapper[4773]: I0120 19:05:25.462055 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f64745d-73ee-4219-b71f-b08d15f94f68" path="/var/lib/kubelet/pods/5f64745d-73ee-4219-b71f-b08d15f94f68/volumes" Jan 20 19:05:25 crc kubenswrapper[4773]: I0120 19:05:25.463375 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6fa7427c-5ab8-4d6d-b81e-999ba155ae7f" path="/var/lib/kubelet/pods/6fa7427c-5ab8-4d6d-b81e-999ba155ae7f/volumes" Jan 20 19:05:25 crc kubenswrapper[4773]: I0120 19:05:25.464061 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f7fa4e8-571e-47fe-9e86-e83acb77eb77" path="/var/lib/kubelet/pods/8f7fa4e8-571e-47fe-9e86-e83acb77eb77/volumes" Jan 20 19:05:25 crc kubenswrapper[4773]: I0120 19:05:25.464729 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cc27f5b0-829a-4c6b-851b-eb6e4cdd53bf" path="/var/lib/kubelet/pods/cc27f5b0-829a-4c6b-851b-eb6e4cdd53bf/volumes" Jan 20 19:05:29 crc kubenswrapper[4773]: I0120 19:05:29.492011 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-sgr8f"] Jan 20 19:05:29 crc kubenswrapper[4773]: E0120 19:05:29.492945 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="461d1293-e47b-49e3-a6eb-bef90ad29792" containerName="extract-content" Jan 20 19:05:29 crc 
kubenswrapper[4773]: I0120 19:05:29.492959 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="461d1293-e47b-49e3-a6eb-bef90ad29792" containerName="extract-content" Jan 20 19:05:29 crc kubenswrapper[4773]: E0120 19:05:29.492985 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="461d1293-e47b-49e3-a6eb-bef90ad29792" containerName="extract-utilities" Jan 20 19:05:29 crc kubenswrapper[4773]: I0120 19:05:29.492992 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="461d1293-e47b-49e3-a6eb-bef90ad29792" containerName="extract-utilities" Jan 20 19:05:29 crc kubenswrapper[4773]: E0120 19:05:29.493010 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="461d1293-e47b-49e3-a6eb-bef90ad29792" containerName="registry-server" Jan 20 19:05:29 crc kubenswrapper[4773]: I0120 19:05:29.493018 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="461d1293-e47b-49e3-a6eb-bef90ad29792" containerName="registry-server" Jan 20 19:05:29 crc kubenswrapper[4773]: I0120 19:05:29.493177 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="461d1293-e47b-49e3-a6eb-bef90ad29792" containerName="registry-server" Jan 20 19:05:29 crc kubenswrapper[4773]: I0120 19:05:29.493719 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-sgr8f" Jan 20 19:05:29 crc kubenswrapper[4773]: I0120 19:05:29.497269 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Jan 20 19:05:29 crc kubenswrapper[4773]: I0120 19:05:29.497391 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rzxsv" Jan 20 19:05:29 crc kubenswrapper[4773]: I0120 19:05:29.497505 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 20 19:05:29 crc kubenswrapper[4773]: I0120 19:05:29.497648 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 20 19:05:29 crc kubenswrapper[4773]: I0120 19:05:29.497776 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 20 19:05:29 crc kubenswrapper[4773]: I0120 19:05:29.516282 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-sgr8f"] Jan 20 19:05:29 crc kubenswrapper[4773]: I0120 19:05:29.626895 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjxgl\" (UniqueName: \"kubernetes.io/projected/2ce9c199-1f55-4aea-82f7-5df21339c927-kube-api-access-mjxgl\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-sgr8f\" (UID: \"2ce9c199-1f55-4aea-82f7-5df21339c927\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-sgr8f" Jan 20 19:05:29 crc kubenswrapper[4773]: I0120 19:05:29.627127 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2ce9c199-1f55-4aea-82f7-5df21339c927-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-sgr8f\" (UID: 
\"2ce9c199-1f55-4aea-82f7-5df21339c927\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-sgr8f" Jan 20 19:05:29 crc kubenswrapper[4773]: I0120 19:05:29.627164 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ce9c199-1f55-4aea-82f7-5df21339c927-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-sgr8f\" (UID: \"2ce9c199-1f55-4aea-82f7-5df21339c927\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-sgr8f" Jan 20 19:05:29 crc kubenswrapper[4773]: I0120 19:05:29.627295 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2ce9c199-1f55-4aea-82f7-5df21339c927-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-sgr8f\" (UID: \"2ce9c199-1f55-4aea-82f7-5df21339c927\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-sgr8f" Jan 20 19:05:29 crc kubenswrapper[4773]: I0120 19:05:29.627329 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2ce9c199-1f55-4aea-82f7-5df21339c927-ceph\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-sgr8f\" (UID: \"2ce9c199-1f55-4aea-82f7-5df21339c927\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-sgr8f" Jan 20 19:05:29 crc kubenswrapper[4773]: I0120 19:05:29.729567 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2ce9c199-1f55-4aea-82f7-5df21339c927-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-sgr8f\" (UID: \"2ce9c199-1f55-4aea-82f7-5df21339c927\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-sgr8f" Jan 20 19:05:29 crc kubenswrapper[4773]: I0120 19:05:29.729646 4773 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ce9c199-1f55-4aea-82f7-5df21339c927-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-sgr8f\" (UID: \"2ce9c199-1f55-4aea-82f7-5df21339c927\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-sgr8f" Jan 20 19:05:29 crc kubenswrapper[4773]: I0120 19:05:29.729920 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2ce9c199-1f55-4aea-82f7-5df21339c927-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-sgr8f\" (UID: \"2ce9c199-1f55-4aea-82f7-5df21339c927\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-sgr8f" Jan 20 19:05:29 crc kubenswrapper[4773]: I0120 19:05:29.730034 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2ce9c199-1f55-4aea-82f7-5df21339c927-ceph\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-sgr8f\" (UID: \"2ce9c199-1f55-4aea-82f7-5df21339c927\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-sgr8f" Jan 20 19:05:29 crc kubenswrapper[4773]: I0120 19:05:29.730243 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjxgl\" (UniqueName: \"kubernetes.io/projected/2ce9c199-1f55-4aea-82f7-5df21339c927-kube-api-access-mjxgl\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-sgr8f\" (UID: \"2ce9c199-1f55-4aea-82f7-5df21339c927\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-sgr8f" Jan 20 19:05:29 crc kubenswrapper[4773]: I0120 19:05:29.736775 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2ce9c199-1f55-4aea-82f7-5df21339c927-ssh-key-openstack-edpm-ipam\") pod 
\"repo-setup-edpm-deployment-openstack-edpm-ipam-sgr8f\" (UID: \"2ce9c199-1f55-4aea-82f7-5df21339c927\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-sgr8f" Jan 20 19:05:29 crc kubenswrapper[4773]: I0120 19:05:29.737295 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2ce9c199-1f55-4aea-82f7-5df21339c927-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-sgr8f\" (UID: \"2ce9c199-1f55-4aea-82f7-5df21339c927\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-sgr8f" Jan 20 19:05:29 crc kubenswrapper[4773]: I0120 19:05:29.737367 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2ce9c199-1f55-4aea-82f7-5df21339c927-ceph\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-sgr8f\" (UID: \"2ce9c199-1f55-4aea-82f7-5df21339c927\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-sgr8f" Jan 20 19:05:29 crc kubenswrapper[4773]: I0120 19:05:29.738373 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ce9c199-1f55-4aea-82f7-5df21339c927-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-sgr8f\" (UID: \"2ce9c199-1f55-4aea-82f7-5df21339c927\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-sgr8f" Jan 20 19:05:29 crc kubenswrapper[4773]: I0120 19:05:29.752660 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjxgl\" (UniqueName: \"kubernetes.io/projected/2ce9c199-1f55-4aea-82f7-5df21339c927-kube-api-access-mjxgl\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-sgr8f\" (UID: \"2ce9c199-1f55-4aea-82f7-5df21339c927\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-sgr8f" Jan 20 19:05:29 crc kubenswrapper[4773]: I0120 19:05:29.829648 4773 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-sgr8f" Jan 20 19:05:30 crc kubenswrapper[4773]: I0120 19:05:30.376534 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-sgr8f"] Jan 20 19:05:31 crc kubenswrapper[4773]: I0120 19:05:31.048017 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-sgr8f" event={"ID":"2ce9c199-1f55-4aea-82f7-5df21339c927","Type":"ContainerStarted","Data":"372659c8842895c482d67b7bdb9f52a4ec7c84574d3e91a70ac602c3de73c56d"} Jan 20 19:05:32 crc kubenswrapper[4773]: I0120 19:05:32.060638 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-sgr8f" event={"ID":"2ce9c199-1f55-4aea-82f7-5df21339c927","Type":"ContainerStarted","Data":"80f8a58df81e876e3749166f9c41816150f83154bab6dc495dc5176c4bc6877b"} Jan 20 19:05:32 crc kubenswrapper[4773]: I0120 19:05:32.076899 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-sgr8f" podStartSLOduration=2.627683545 podStartE2EDuration="3.076851939s" podCreationTimestamp="2026-01-20 19:05:29 +0000 UTC" firstStartedPulling="2026-01-20 19:05:30.383873475 +0000 UTC m=+2123.305686499" lastFinishedPulling="2026-01-20 19:05:30.833041869 +0000 UTC m=+2123.754854893" observedRunningTime="2026-01-20 19:05:32.07438654 +0000 UTC m=+2124.996199584" watchObservedRunningTime="2026-01-20 19:05:32.076851939 +0000 UTC m=+2124.998664973" Jan 20 19:05:44 crc kubenswrapper[4773]: I0120 19:05:44.158365 4773 generic.go:334] "Generic (PLEG): container finished" podID="2ce9c199-1f55-4aea-82f7-5df21339c927" containerID="80f8a58df81e876e3749166f9c41816150f83154bab6dc495dc5176c4bc6877b" exitCode=0 Jan 20 19:05:44 crc kubenswrapper[4773]: I0120 19:05:44.158443 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-sgr8f" event={"ID":"2ce9c199-1f55-4aea-82f7-5df21339c927","Type":"ContainerDied","Data":"80f8a58df81e876e3749166f9c41816150f83154bab6dc495dc5176c4bc6877b"} Jan 20 19:05:45 crc kubenswrapper[4773]: I0120 19:05:45.568725 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-sgr8f" Jan 20 19:05:45 crc kubenswrapper[4773]: I0120 19:05:45.744636 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2ce9c199-1f55-4aea-82f7-5df21339c927-ssh-key-openstack-edpm-ipam\") pod \"2ce9c199-1f55-4aea-82f7-5df21339c927\" (UID: \"2ce9c199-1f55-4aea-82f7-5df21339c927\") " Jan 20 19:05:45 crc kubenswrapper[4773]: I0120 19:05:45.744885 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2ce9c199-1f55-4aea-82f7-5df21339c927-ceph\") pod \"2ce9c199-1f55-4aea-82f7-5df21339c927\" (UID: \"2ce9c199-1f55-4aea-82f7-5df21339c927\") " Jan 20 19:05:45 crc kubenswrapper[4773]: I0120 19:05:45.744966 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mjxgl\" (UniqueName: \"kubernetes.io/projected/2ce9c199-1f55-4aea-82f7-5df21339c927-kube-api-access-mjxgl\") pod \"2ce9c199-1f55-4aea-82f7-5df21339c927\" (UID: \"2ce9c199-1f55-4aea-82f7-5df21339c927\") " Jan 20 19:05:45 crc kubenswrapper[4773]: I0120 19:05:45.745046 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2ce9c199-1f55-4aea-82f7-5df21339c927-inventory\") pod \"2ce9c199-1f55-4aea-82f7-5df21339c927\" (UID: \"2ce9c199-1f55-4aea-82f7-5df21339c927\") " Jan 20 19:05:45 crc kubenswrapper[4773]: I0120 19:05:45.745081 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ce9c199-1f55-4aea-82f7-5df21339c927-repo-setup-combined-ca-bundle\") pod \"2ce9c199-1f55-4aea-82f7-5df21339c927\" (UID: \"2ce9c199-1f55-4aea-82f7-5df21339c927\") " Jan 20 19:05:45 crc kubenswrapper[4773]: I0120 19:05:45.751308 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ce9c199-1f55-4aea-82f7-5df21339c927-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "2ce9c199-1f55-4aea-82f7-5df21339c927" (UID: "2ce9c199-1f55-4aea-82f7-5df21339c927"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:05:45 crc kubenswrapper[4773]: I0120 19:05:45.751807 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ce9c199-1f55-4aea-82f7-5df21339c927-kube-api-access-mjxgl" (OuterVolumeSpecName: "kube-api-access-mjxgl") pod "2ce9c199-1f55-4aea-82f7-5df21339c927" (UID: "2ce9c199-1f55-4aea-82f7-5df21339c927"). InnerVolumeSpecName "kube-api-access-mjxgl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 19:05:45 crc kubenswrapper[4773]: I0120 19:05:45.758109 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ce9c199-1f55-4aea-82f7-5df21339c927-ceph" (OuterVolumeSpecName: "ceph") pod "2ce9c199-1f55-4aea-82f7-5df21339c927" (UID: "2ce9c199-1f55-4aea-82f7-5df21339c927"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:05:45 crc kubenswrapper[4773]: I0120 19:05:45.774350 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ce9c199-1f55-4aea-82f7-5df21339c927-inventory" (OuterVolumeSpecName: "inventory") pod "2ce9c199-1f55-4aea-82f7-5df21339c927" (UID: "2ce9c199-1f55-4aea-82f7-5df21339c927"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:05:45 crc kubenswrapper[4773]: I0120 19:05:45.780168 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ce9c199-1f55-4aea-82f7-5df21339c927-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "2ce9c199-1f55-4aea-82f7-5df21339c927" (UID: "2ce9c199-1f55-4aea-82f7-5df21339c927"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:05:45 crc kubenswrapper[4773]: I0120 19:05:45.849301 4773 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2ce9c199-1f55-4aea-82f7-5df21339c927-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 20 19:05:45 crc kubenswrapper[4773]: I0120 19:05:45.849342 4773 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2ce9c199-1f55-4aea-82f7-5df21339c927-ceph\") on node \"crc\" DevicePath \"\"" Jan 20 19:05:45 crc kubenswrapper[4773]: I0120 19:05:45.849394 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mjxgl\" (UniqueName: \"kubernetes.io/projected/2ce9c199-1f55-4aea-82f7-5df21339c927-kube-api-access-mjxgl\") on node \"crc\" DevicePath \"\"" Jan 20 19:05:45 crc kubenswrapper[4773]: I0120 19:05:45.849409 4773 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2ce9c199-1f55-4aea-82f7-5df21339c927-inventory\") on node \"crc\" DevicePath \"\"" Jan 20 19:05:45 crc kubenswrapper[4773]: I0120 19:05:45.849421 4773 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ce9c199-1f55-4aea-82f7-5df21339c927-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 19:05:46 crc kubenswrapper[4773]: I0120 19:05:46.186083 4773 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-sgr8f" event={"ID":"2ce9c199-1f55-4aea-82f7-5df21339c927","Type":"ContainerDied","Data":"372659c8842895c482d67b7bdb9f52a4ec7c84574d3e91a70ac602c3de73c56d"} Jan 20 19:05:46 crc kubenswrapper[4773]: I0120 19:05:46.186131 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="372659c8842895c482d67b7bdb9f52a4ec7c84574d3e91a70ac602c3de73c56d" Jan 20 19:05:46 crc kubenswrapper[4773]: I0120 19:05:46.186194 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-sgr8f" Jan 20 19:05:46 crc kubenswrapper[4773]: I0120 19:05:46.261413 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9xjh8"] Jan 20 19:05:46 crc kubenswrapper[4773]: E0120 19:05:46.262094 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ce9c199-1f55-4aea-82f7-5df21339c927" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Jan 20 19:05:46 crc kubenswrapper[4773]: I0120 19:05:46.262120 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ce9c199-1f55-4aea-82f7-5df21339c927" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Jan 20 19:05:46 crc kubenswrapper[4773]: I0120 19:05:46.262473 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ce9c199-1f55-4aea-82f7-5df21339c927" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Jan 20 19:05:46 crc kubenswrapper[4773]: I0120 19:05:46.263385 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9xjh8" Jan 20 19:05:46 crc kubenswrapper[4773]: I0120 19:05:46.265881 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 20 19:05:46 crc kubenswrapper[4773]: I0120 19:05:46.266089 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Jan 20 19:05:46 crc kubenswrapper[4773]: I0120 19:05:46.266372 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 20 19:05:46 crc kubenswrapper[4773]: I0120 19:05:46.268890 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rzxsv" Jan 20 19:05:46 crc kubenswrapper[4773]: I0120 19:05:46.268982 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 20 19:05:46 crc kubenswrapper[4773]: I0120 19:05:46.275859 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9xjh8"] Jan 20 19:05:46 crc kubenswrapper[4773]: I0120 19:05:46.358974 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/586f1b07-ae25-4acf-8a65-92377c4db234-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-9xjh8\" (UID: \"586f1b07-ae25-4acf-8a65-92377c4db234\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9xjh8" Jan 20 19:05:46 crc kubenswrapper[4773]: I0120 19:05:46.359018 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/586f1b07-ae25-4acf-8a65-92377c4db234-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-9xjh8\" (UID: 
\"586f1b07-ae25-4acf-8a65-92377c4db234\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9xjh8" Jan 20 19:05:46 crc kubenswrapper[4773]: I0120 19:05:46.359083 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/586f1b07-ae25-4acf-8a65-92377c4db234-ceph\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-9xjh8\" (UID: \"586f1b07-ae25-4acf-8a65-92377c4db234\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9xjh8" Jan 20 19:05:46 crc kubenswrapper[4773]: I0120 19:05:46.359113 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/586f1b07-ae25-4acf-8a65-92377c4db234-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-9xjh8\" (UID: \"586f1b07-ae25-4acf-8a65-92377c4db234\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9xjh8" Jan 20 19:05:46 crc kubenswrapper[4773]: I0120 19:05:46.359237 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4rdv\" (UniqueName: \"kubernetes.io/projected/586f1b07-ae25-4acf-8a65-92377c4db234-kube-api-access-z4rdv\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-9xjh8\" (UID: \"586f1b07-ae25-4acf-8a65-92377c4db234\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9xjh8" Jan 20 19:05:46 crc kubenswrapper[4773]: I0120 19:05:46.460441 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4rdv\" (UniqueName: \"kubernetes.io/projected/586f1b07-ae25-4acf-8a65-92377c4db234-kube-api-access-z4rdv\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-9xjh8\" (UID: \"586f1b07-ae25-4acf-8a65-92377c4db234\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9xjh8" Jan 20 19:05:46 crc kubenswrapper[4773]: I0120 
19:05:46.460524 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/586f1b07-ae25-4acf-8a65-92377c4db234-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-9xjh8\" (UID: \"586f1b07-ae25-4acf-8a65-92377c4db234\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9xjh8" Jan 20 19:05:46 crc kubenswrapper[4773]: I0120 19:05:46.460546 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/586f1b07-ae25-4acf-8a65-92377c4db234-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-9xjh8\" (UID: \"586f1b07-ae25-4acf-8a65-92377c4db234\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9xjh8" Jan 20 19:05:46 crc kubenswrapper[4773]: I0120 19:05:46.460606 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/586f1b07-ae25-4acf-8a65-92377c4db234-ceph\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-9xjh8\" (UID: \"586f1b07-ae25-4acf-8a65-92377c4db234\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9xjh8" Jan 20 19:05:46 crc kubenswrapper[4773]: I0120 19:05:46.460637 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/586f1b07-ae25-4acf-8a65-92377c4db234-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-9xjh8\" (UID: \"586f1b07-ae25-4acf-8a65-92377c4db234\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9xjh8" Jan 20 19:05:46 crc kubenswrapper[4773]: I0120 19:05:46.464337 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/586f1b07-ae25-4acf-8a65-92377c4db234-bootstrap-combined-ca-bundle\") pod 
\"bootstrap-edpm-deployment-openstack-edpm-ipam-9xjh8\" (UID: \"586f1b07-ae25-4acf-8a65-92377c4db234\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9xjh8" Jan 20 19:05:46 crc kubenswrapper[4773]: I0120 19:05:46.473451 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/586f1b07-ae25-4acf-8a65-92377c4db234-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-9xjh8\" (UID: \"586f1b07-ae25-4acf-8a65-92377c4db234\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9xjh8" Jan 20 19:05:46 crc kubenswrapper[4773]: I0120 19:05:46.473550 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/586f1b07-ae25-4acf-8a65-92377c4db234-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-9xjh8\" (UID: \"586f1b07-ae25-4acf-8a65-92377c4db234\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9xjh8" Jan 20 19:05:46 crc kubenswrapper[4773]: I0120 19:05:46.473498 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/586f1b07-ae25-4acf-8a65-92377c4db234-ceph\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-9xjh8\" (UID: \"586f1b07-ae25-4acf-8a65-92377c4db234\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9xjh8" Jan 20 19:05:46 crc kubenswrapper[4773]: I0120 19:05:46.476214 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4rdv\" (UniqueName: \"kubernetes.io/projected/586f1b07-ae25-4acf-8a65-92377c4db234-kube-api-access-z4rdv\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-9xjh8\" (UID: \"586f1b07-ae25-4acf-8a65-92377c4db234\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9xjh8" Jan 20 19:05:46 crc kubenswrapper[4773]: I0120 19:05:46.595768 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9xjh8" Jan 20 19:05:47 crc kubenswrapper[4773]: I0120 19:05:47.095818 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9xjh8"] Jan 20 19:05:47 crc kubenswrapper[4773]: I0120 19:05:47.194038 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9xjh8" event={"ID":"586f1b07-ae25-4acf-8a65-92377c4db234","Type":"ContainerStarted","Data":"f8aadc01018737dddc771e307839242b89fa962c335d9987e2632713a8b55216"} Jan 20 19:05:48 crc kubenswrapper[4773]: I0120 19:05:48.203549 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9xjh8" event={"ID":"586f1b07-ae25-4acf-8a65-92377c4db234","Type":"ContainerStarted","Data":"788cab7679ab106e9f67ece33e92617bd54b57dc7823e0929c8c86dc34e0eb87"} Jan 20 19:05:48 crc kubenswrapper[4773]: I0120 19:05:48.230164 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9xjh8" podStartSLOduration=1.742245094 podStartE2EDuration="2.230143202s" podCreationTimestamp="2026-01-20 19:05:46 +0000 UTC" firstStartedPulling="2026-01-20 19:05:47.104429073 +0000 UTC m=+2140.026242107" lastFinishedPulling="2026-01-20 19:05:47.592327191 +0000 UTC m=+2140.514140215" observedRunningTime="2026-01-20 19:05:48.224052623 +0000 UTC m=+2141.145865647" watchObservedRunningTime="2026-01-20 19:05:48.230143202 +0000 UTC m=+2141.151956246" Jan 20 19:06:19 crc kubenswrapper[4773]: I0120 19:06:19.773691 4773 scope.go:117] "RemoveContainer" containerID="275b7cfb99fe82d1a980bcea4ca6dc36e2c577eb26c2063aabdcc1bc630d378d" Jan 20 19:06:19 crc kubenswrapper[4773]: I0120 19:06:19.825781 4773 scope.go:117] "RemoveContainer" containerID="a3d1b2b32625905740058e821f111f168782e4d219803ce3813dd2b812563a2d" Jan 20 19:06:19 crc 
kubenswrapper[4773]: I0120 19:06:19.867120 4773 scope.go:117] "RemoveContainer" containerID="c8f7b97714b45d74f4b227ab1c61259d6940c0dff27c49881ec4be2e3c0092a6" Jan 20 19:06:19 crc kubenswrapper[4773]: I0120 19:06:19.909349 4773 scope.go:117] "RemoveContainer" containerID="8e156fca405116f5d486fe65823e78cd8f316ad936e8607162db09c90867f261" Jan 20 19:06:19 crc kubenswrapper[4773]: I0120 19:06:19.937697 4773 scope.go:117] "RemoveContainer" containerID="214ad8a794229d4fa466b745ad6c2c5b25a4a6f01698090bc0fd588f1017d53b" Jan 20 19:06:19 crc kubenswrapper[4773]: I0120 19:06:19.976053 4773 scope.go:117] "RemoveContainer" containerID="d2a63070e513c4ff5a517b705dbd2671d5a58b1f4c876fcd619b5653b8ff1cdb" Jan 20 19:06:20 crc kubenswrapper[4773]: I0120 19:06:20.016496 4773 scope.go:117] "RemoveContainer" containerID="93a7eebb7e51a278bd173d39037d247ae953e7e362f3c91fbf6d9626e5cde308" Jan 20 19:06:20 crc kubenswrapper[4773]: I0120 19:06:20.054194 4773 scope.go:117] "RemoveContainer" containerID="ed1b733b25627aeb8a223630419832be2d7010197f1a7fbda2c35be60b1cd9f1" Jan 20 19:06:20 crc kubenswrapper[4773]: I0120 19:06:20.121368 4773 scope.go:117] "RemoveContainer" containerID="55661dbd2f275de065e0d3e995f3bc1225e2c652531b5ee1235ad9050f245384" Jan 20 19:06:28 crc kubenswrapper[4773]: I0120 19:06:28.169672 4773 patch_prober.go:28] interesting pod/machine-config-daemon-sq4x7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 19:06:28 crc kubenswrapper[4773]: I0120 19:06:28.171114 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 19:06:58 crc 
kubenswrapper[4773]: I0120 19:06:58.170372 4773 patch_prober.go:28] interesting pod/machine-config-daemon-sq4x7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 19:06:58 crc kubenswrapper[4773]: I0120 19:06:58.171015 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 19:07:20 crc kubenswrapper[4773]: I0120 19:07:20.272518 4773 scope.go:117] "RemoveContainer" containerID="737aaa15d5f9c841779418b751493428e37f357fe0fe9835945320528fe4b9df" Jan 20 19:07:21 crc kubenswrapper[4773]: I0120 19:07:21.586798 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-lnf6n"] Jan 20 19:07:21 crc kubenswrapper[4773]: I0120 19:07:21.599952 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lnf6n"] Jan 20 19:07:21 crc kubenswrapper[4773]: I0120 19:07:21.600034 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lnf6n" Jan 20 19:07:21 crc kubenswrapper[4773]: I0120 19:07:21.705842 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vd8ll\" (UniqueName: \"kubernetes.io/projected/c39f1bba-5f32-4ba9-a7a5-2cacaf0b89a6-kube-api-access-vd8ll\") pod \"community-operators-lnf6n\" (UID: \"c39f1bba-5f32-4ba9-a7a5-2cacaf0b89a6\") " pod="openshift-marketplace/community-operators-lnf6n" Jan 20 19:07:21 crc kubenswrapper[4773]: I0120 19:07:21.706142 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c39f1bba-5f32-4ba9-a7a5-2cacaf0b89a6-utilities\") pod \"community-operators-lnf6n\" (UID: \"c39f1bba-5f32-4ba9-a7a5-2cacaf0b89a6\") " pod="openshift-marketplace/community-operators-lnf6n" Jan 20 19:07:21 crc kubenswrapper[4773]: I0120 19:07:21.706339 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c39f1bba-5f32-4ba9-a7a5-2cacaf0b89a6-catalog-content\") pod \"community-operators-lnf6n\" (UID: \"c39f1bba-5f32-4ba9-a7a5-2cacaf0b89a6\") " pod="openshift-marketplace/community-operators-lnf6n" Jan 20 19:07:21 crc kubenswrapper[4773]: I0120 19:07:21.808218 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c39f1bba-5f32-4ba9-a7a5-2cacaf0b89a6-catalog-content\") pod \"community-operators-lnf6n\" (UID: \"c39f1bba-5f32-4ba9-a7a5-2cacaf0b89a6\") " pod="openshift-marketplace/community-operators-lnf6n" Jan 20 19:07:21 crc kubenswrapper[4773]: I0120 19:07:21.808315 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vd8ll\" (UniqueName: \"kubernetes.io/projected/c39f1bba-5f32-4ba9-a7a5-2cacaf0b89a6-kube-api-access-vd8ll\") pod 
\"community-operators-lnf6n\" (UID: \"c39f1bba-5f32-4ba9-a7a5-2cacaf0b89a6\") " pod="openshift-marketplace/community-operators-lnf6n" Jan 20 19:07:21 crc kubenswrapper[4773]: I0120 19:07:21.808339 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c39f1bba-5f32-4ba9-a7a5-2cacaf0b89a6-utilities\") pod \"community-operators-lnf6n\" (UID: \"c39f1bba-5f32-4ba9-a7a5-2cacaf0b89a6\") " pod="openshift-marketplace/community-operators-lnf6n" Jan 20 19:07:21 crc kubenswrapper[4773]: I0120 19:07:21.808854 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c39f1bba-5f32-4ba9-a7a5-2cacaf0b89a6-catalog-content\") pod \"community-operators-lnf6n\" (UID: \"c39f1bba-5f32-4ba9-a7a5-2cacaf0b89a6\") " pod="openshift-marketplace/community-operators-lnf6n" Jan 20 19:07:21 crc kubenswrapper[4773]: I0120 19:07:21.808878 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c39f1bba-5f32-4ba9-a7a5-2cacaf0b89a6-utilities\") pod \"community-operators-lnf6n\" (UID: \"c39f1bba-5f32-4ba9-a7a5-2cacaf0b89a6\") " pod="openshift-marketplace/community-operators-lnf6n" Jan 20 19:07:21 crc kubenswrapper[4773]: I0120 19:07:21.828130 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vd8ll\" (UniqueName: \"kubernetes.io/projected/c39f1bba-5f32-4ba9-a7a5-2cacaf0b89a6-kube-api-access-vd8ll\") pod \"community-operators-lnf6n\" (UID: \"c39f1bba-5f32-4ba9-a7a5-2cacaf0b89a6\") " pod="openshift-marketplace/community-operators-lnf6n" Jan 20 19:07:21 crc kubenswrapper[4773]: I0120 19:07:21.925665 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lnf6n" Jan 20 19:07:22 crc kubenswrapper[4773]: I0120 19:07:22.459686 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lnf6n"] Jan 20 19:07:22 crc kubenswrapper[4773]: I0120 19:07:22.953466 4773 generic.go:334] "Generic (PLEG): container finished" podID="c39f1bba-5f32-4ba9-a7a5-2cacaf0b89a6" containerID="25b86ebf8db6229a69f09e654ab100e437dde7ed70aaa669f7cc8e3639a20c37" exitCode=0 Jan 20 19:07:22 crc kubenswrapper[4773]: I0120 19:07:22.953560 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lnf6n" event={"ID":"c39f1bba-5f32-4ba9-a7a5-2cacaf0b89a6","Type":"ContainerDied","Data":"25b86ebf8db6229a69f09e654ab100e437dde7ed70aaa669f7cc8e3639a20c37"} Jan 20 19:07:22 crc kubenswrapper[4773]: I0120 19:07:22.953809 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lnf6n" event={"ID":"c39f1bba-5f32-4ba9-a7a5-2cacaf0b89a6","Type":"ContainerStarted","Data":"2b426e4d44dd06eedea970e43035e827898b4744cce96b3f32658e4da9d10aa5"} Jan 20 19:07:23 crc kubenswrapper[4773]: I0120 19:07:23.962698 4773 generic.go:334] "Generic (PLEG): container finished" podID="c39f1bba-5f32-4ba9-a7a5-2cacaf0b89a6" containerID="27cb2fd4550ab340b9204ac2b73b096dde4130e7740124836ee5a1737a850ae5" exitCode=0 Jan 20 19:07:23 crc kubenswrapper[4773]: I0120 19:07:23.962786 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lnf6n" event={"ID":"c39f1bba-5f32-4ba9-a7a5-2cacaf0b89a6","Type":"ContainerDied","Data":"27cb2fd4550ab340b9204ac2b73b096dde4130e7740124836ee5a1737a850ae5"} Jan 20 19:07:24 crc kubenswrapper[4773]: I0120 19:07:24.972172 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lnf6n" 
event={"ID":"c39f1bba-5f32-4ba9-a7a5-2cacaf0b89a6","Type":"ContainerStarted","Data":"44fb6a259730fcc96a908e63359f902f239ef78464d1d23097c3a877aca833b5"} Jan 20 19:07:24 crc kubenswrapper[4773]: I0120 19:07:24.994871 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-lnf6n" podStartSLOduration=2.386128461 podStartE2EDuration="3.994846616s" podCreationTimestamp="2026-01-20 19:07:21 +0000 UTC" firstStartedPulling="2026-01-20 19:07:22.955973376 +0000 UTC m=+2235.877786400" lastFinishedPulling="2026-01-20 19:07:24.564691531 +0000 UTC m=+2237.486504555" observedRunningTime="2026-01-20 19:07:24.987542426 +0000 UTC m=+2237.909355470" watchObservedRunningTime="2026-01-20 19:07:24.994846616 +0000 UTC m=+2237.916659640" Jan 20 19:07:26 crc kubenswrapper[4773]: I0120 19:07:26.986242 4773 generic.go:334] "Generic (PLEG): container finished" podID="586f1b07-ae25-4acf-8a65-92377c4db234" containerID="788cab7679ab106e9f67ece33e92617bd54b57dc7823e0929c8c86dc34e0eb87" exitCode=0 Jan 20 19:07:26 crc kubenswrapper[4773]: I0120 19:07:26.986333 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9xjh8" event={"ID":"586f1b07-ae25-4acf-8a65-92377c4db234","Type":"ContainerDied","Data":"788cab7679ab106e9f67ece33e92617bd54b57dc7823e0929c8c86dc34e0eb87"} Jan 20 19:07:28 crc kubenswrapper[4773]: I0120 19:07:28.196308 4773 patch_prober.go:28] interesting pod/machine-config-daemon-sq4x7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 19:07:28 crc kubenswrapper[4773]: I0120 19:07:28.196391 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 19:07:28 crc kubenswrapper[4773]: I0120 19:07:28.196441 4773 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" Jan 20 19:07:28 crc kubenswrapper[4773]: I0120 19:07:28.197484 4773 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a7dbd0a912d949d6c4eebc0124401dbd7713c051c7fc8f0eefcca46927202be1"} pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 20 19:07:28 crc kubenswrapper[4773]: I0120 19:07:28.197545 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" containerID="cri-o://a7dbd0a912d949d6c4eebc0124401dbd7713c051c7fc8f0eefcca46927202be1" gracePeriod=600 Jan 20 19:07:28 crc kubenswrapper[4773]: E0120 19:07:28.322776 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:07:28 crc kubenswrapper[4773]: I0120 19:07:28.405441 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9xjh8" Jan 20 19:07:28 crc kubenswrapper[4773]: I0120 19:07:28.537965 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/586f1b07-ae25-4acf-8a65-92377c4db234-ssh-key-openstack-edpm-ipam\") pod \"586f1b07-ae25-4acf-8a65-92377c4db234\" (UID: \"586f1b07-ae25-4acf-8a65-92377c4db234\") " Jan 20 19:07:28 crc kubenswrapper[4773]: I0120 19:07:28.538076 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z4rdv\" (UniqueName: \"kubernetes.io/projected/586f1b07-ae25-4acf-8a65-92377c4db234-kube-api-access-z4rdv\") pod \"586f1b07-ae25-4acf-8a65-92377c4db234\" (UID: \"586f1b07-ae25-4acf-8a65-92377c4db234\") " Jan 20 19:07:28 crc kubenswrapper[4773]: I0120 19:07:28.538130 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/586f1b07-ae25-4acf-8a65-92377c4db234-inventory\") pod \"586f1b07-ae25-4acf-8a65-92377c4db234\" (UID: \"586f1b07-ae25-4acf-8a65-92377c4db234\") " Jan 20 19:07:28 crc kubenswrapper[4773]: I0120 19:07:28.538257 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/586f1b07-ae25-4acf-8a65-92377c4db234-bootstrap-combined-ca-bundle\") pod \"586f1b07-ae25-4acf-8a65-92377c4db234\" (UID: \"586f1b07-ae25-4acf-8a65-92377c4db234\") " Jan 20 19:07:28 crc kubenswrapper[4773]: I0120 19:07:28.538307 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/586f1b07-ae25-4acf-8a65-92377c4db234-ceph\") pod \"586f1b07-ae25-4acf-8a65-92377c4db234\" (UID: \"586f1b07-ae25-4acf-8a65-92377c4db234\") " Jan 20 19:07:28 crc kubenswrapper[4773]: I0120 19:07:28.543685 4773 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/586f1b07-ae25-4acf-8a65-92377c4db234-ceph" (OuterVolumeSpecName: "ceph") pod "586f1b07-ae25-4acf-8a65-92377c4db234" (UID: "586f1b07-ae25-4acf-8a65-92377c4db234"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:07:28 crc kubenswrapper[4773]: I0120 19:07:28.543796 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/586f1b07-ae25-4acf-8a65-92377c4db234-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "586f1b07-ae25-4acf-8a65-92377c4db234" (UID: "586f1b07-ae25-4acf-8a65-92377c4db234"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:07:28 crc kubenswrapper[4773]: I0120 19:07:28.544069 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/586f1b07-ae25-4acf-8a65-92377c4db234-kube-api-access-z4rdv" (OuterVolumeSpecName: "kube-api-access-z4rdv") pod "586f1b07-ae25-4acf-8a65-92377c4db234" (UID: "586f1b07-ae25-4acf-8a65-92377c4db234"). InnerVolumeSpecName "kube-api-access-z4rdv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 19:07:28 crc kubenswrapper[4773]: I0120 19:07:28.562687 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/586f1b07-ae25-4acf-8a65-92377c4db234-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "586f1b07-ae25-4acf-8a65-92377c4db234" (UID: "586f1b07-ae25-4acf-8a65-92377c4db234"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:07:28 crc kubenswrapper[4773]: I0120 19:07:28.563280 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/586f1b07-ae25-4acf-8a65-92377c4db234-inventory" (OuterVolumeSpecName: "inventory") pod "586f1b07-ae25-4acf-8a65-92377c4db234" (UID: "586f1b07-ae25-4acf-8a65-92377c4db234"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:07:28 crc kubenswrapper[4773]: I0120 19:07:28.640087 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z4rdv\" (UniqueName: \"kubernetes.io/projected/586f1b07-ae25-4acf-8a65-92377c4db234-kube-api-access-z4rdv\") on node \"crc\" DevicePath \"\"" Jan 20 19:07:28 crc kubenswrapper[4773]: I0120 19:07:28.640115 4773 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/586f1b07-ae25-4acf-8a65-92377c4db234-inventory\") on node \"crc\" DevicePath \"\"" Jan 20 19:07:28 crc kubenswrapper[4773]: I0120 19:07:28.640125 4773 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/586f1b07-ae25-4acf-8a65-92377c4db234-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 19:07:28 crc kubenswrapper[4773]: I0120 19:07:28.640134 4773 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/586f1b07-ae25-4acf-8a65-92377c4db234-ceph\") on node \"crc\" DevicePath \"\"" Jan 20 19:07:28 crc kubenswrapper[4773]: I0120 19:07:28.640142 4773 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/586f1b07-ae25-4acf-8a65-92377c4db234-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 20 19:07:29 crc kubenswrapper[4773]: I0120 19:07:29.010008 4773 generic.go:334] "Generic (PLEG): container finished" 
podID="1ddd934f-f012-4083-b5e6-b99711071621" containerID="a7dbd0a912d949d6c4eebc0124401dbd7713c051c7fc8f0eefcca46927202be1" exitCode=0 Jan 20 19:07:29 crc kubenswrapper[4773]: I0120 19:07:29.010070 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" event={"ID":"1ddd934f-f012-4083-b5e6-b99711071621","Type":"ContainerDied","Data":"a7dbd0a912d949d6c4eebc0124401dbd7713c051c7fc8f0eefcca46927202be1"} Jan 20 19:07:29 crc kubenswrapper[4773]: I0120 19:07:29.010379 4773 scope.go:117] "RemoveContainer" containerID="7ef51181ae4526cf4dd383022f5cd069e75cd47240d82e55ea2f8b59a5c7eef7" Jan 20 19:07:29 crc kubenswrapper[4773]: I0120 19:07:29.011159 4773 scope.go:117] "RemoveContainer" containerID="a7dbd0a912d949d6c4eebc0124401dbd7713c051c7fc8f0eefcca46927202be1" Jan 20 19:07:29 crc kubenswrapper[4773]: E0120 19:07:29.011593 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:07:29 crc kubenswrapper[4773]: I0120 19:07:29.013656 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9xjh8" event={"ID":"586f1b07-ae25-4acf-8a65-92377c4db234","Type":"ContainerDied","Data":"f8aadc01018737dddc771e307839242b89fa962c335d9987e2632713a8b55216"} Jan 20 19:07:29 crc kubenswrapper[4773]: I0120 19:07:29.013694 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f8aadc01018737dddc771e307839242b89fa962c335d9987e2632713a8b55216" Jan 20 19:07:29 crc kubenswrapper[4773]: I0120 19:07:29.015055 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-9xjh8" Jan 20 19:07:29 crc kubenswrapper[4773]: I0120 19:07:29.107387 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5t9qm"] Jan 20 19:07:29 crc kubenswrapper[4773]: E0120 19:07:29.107880 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="586f1b07-ae25-4acf-8a65-92377c4db234" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Jan 20 19:07:29 crc kubenswrapper[4773]: I0120 19:07:29.107904 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="586f1b07-ae25-4acf-8a65-92377c4db234" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Jan 20 19:07:29 crc kubenswrapper[4773]: I0120 19:07:29.108143 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="586f1b07-ae25-4acf-8a65-92377c4db234" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Jan 20 19:07:29 crc kubenswrapper[4773]: I0120 19:07:29.108946 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5t9qm" Jan 20 19:07:29 crc kubenswrapper[4773]: I0120 19:07:29.113488 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Jan 20 19:07:29 crc kubenswrapper[4773]: I0120 19:07:29.113745 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 20 19:07:29 crc kubenswrapper[4773]: I0120 19:07:29.113880 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rzxsv" Jan 20 19:07:29 crc kubenswrapper[4773]: I0120 19:07:29.114171 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 20 19:07:29 crc kubenswrapper[4773]: I0120 19:07:29.116591 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 20 19:07:29 crc kubenswrapper[4773]: I0120 19:07:29.121863 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5t9qm"] Jan 20 19:07:29 crc kubenswrapper[4773]: I0120 19:07:29.147585 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a290d892-d26b-4f1c-b4a0-9778e6b58c7b-ceph\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-5t9qm\" (UID: \"a290d892-d26b-4f1c-b4a0-9778e6b58c7b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5t9qm" Jan 20 19:07:29 crc kubenswrapper[4773]: I0120 19:07:29.147639 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a290d892-d26b-4f1c-b4a0-9778e6b58c7b-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-5t9qm\" (UID: 
\"a290d892-d26b-4f1c-b4a0-9778e6b58c7b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5t9qm" Jan 20 19:07:29 crc kubenswrapper[4773]: I0120 19:07:29.147666 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a290d892-d26b-4f1c-b4a0-9778e6b58c7b-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-5t9qm\" (UID: \"a290d892-d26b-4f1c-b4a0-9778e6b58c7b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5t9qm" Jan 20 19:07:29 crc kubenswrapper[4773]: I0120 19:07:29.147685 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htg4g\" (UniqueName: \"kubernetes.io/projected/a290d892-d26b-4f1c-b4a0-9778e6b58c7b-kube-api-access-htg4g\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-5t9qm\" (UID: \"a290d892-d26b-4f1c-b4a0-9778e6b58c7b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5t9qm" Jan 20 19:07:29 crc kubenswrapper[4773]: I0120 19:07:29.249608 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a290d892-d26b-4f1c-b4a0-9778e6b58c7b-ceph\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-5t9qm\" (UID: \"a290d892-d26b-4f1c-b4a0-9778e6b58c7b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5t9qm" Jan 20 19:07:29 crc kubenswrapper[4773]: I0120 19:07:29.249683 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a290d892-d26b-4f1c-b4a0-9778e6b58c7b-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-5t9qm\" (UID: \"a290d892-d26b-4f1c-b4a0-9778e6b58c7b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5t9qm" Jan 20 19:07:29 crc 
kubenswrapper[4773]: I0120 19:07:29.249711 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a290d892-d26b-4f1c-b4a0-9778e6b58c7b-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-5t9qm\" (UID: \"a290d892-d26b-4f1c-b4a0-9778e6b58c7b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5t9qm" Jan 20 19:07:29 crc kubenswrapper[4773]: I0120 19:07:29.249733 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htg4g\" (UniqueName: \"kubernetes.io/projected/a290d892-d26b-4f1c-b4a0-9778e6b58c7b-kube-api-access-htg4g\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-5t9qm\" (UID: \"a290d892-d26b-4f1c-b4a0-9778e6b58c7b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5t9qm" Jan 20 19:07:29 crc kubenswrapper[4773]: I0120 19:07:29.254146 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a290d892-d26b-4f1c-b4a0-9778e6b58c7b-ceph\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-5t9qm\" (UID: \"a290d892-d26b-4f1c-b4a0-9778e6b58c7b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5t9qm" Jan 20 19:07:29 crc kubenswrapper[4773]: I0120 19:07:29.254358 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a290d892-d26b-4f1c-b4a0-9778e6b58c7b-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-5t9qm\" (UID: \"a290d892-d26b-4f1c-b4a0-9778e6b58c7b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5t9qm" Jan 20 19:07:29 crc kubenswrapper[4773]: I0120 19:07:29.255499 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/a290d892-d26b-4f1c-b4a0-9778e6b58c7b-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-5t9qm\" (UID: \"a290d892-d26b-4f1c-b4a0-9778e6b58c7b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5t9qm" Jan 20 19:07:29 crc kubenswrapper[4773]: I0120 19:07:29.264595 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htg4g\" (UniqueName: \"kubernetes.io/projected/a290d892-d26b-4f1c-b4a0-9778e6b58c7b-kube-api-access-htg4g\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-5t9qm\" (UID: \"a290d892-d26b-4f1c-b4a0-9778e6b58c7b\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5t9qm" Jan 20 19:07:29 crc kubenswrapper[4773]: I0120 19:07:29.438517 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5t9qm" Jan 20 19:07:29 crc kubenswrapper[4773]: I0120 19:07:29.951849 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5t9qm"] Jan 20 19:07:30 crc kubenswrapper[4773]: I0120 19:07:30.026435 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5t9qm" event={"ID":"a290d892-d26b-4f1c-b4a0-9778e6b58c7b","Type":"ContainerStarted","Data":"eb23a023b2ee7db79ca1f3f19c6a08cc980f5ebaa1760abf62846d8e16bae762"} Jan 20 19:07:31 crc kubenswrapper[4773]: I0120 19:07:31.037304 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5t9qm" event={"ID":"a290d892-d26b-4f1c-b4a0-9778e6b58c7b","Type":"ContainerStarted","Data":"2443e916717c02c1bd3b3f4d20d0235dee65b1d8f082a086a4c2b9b4fd541514"} Jan 20 19:07:31 crc kubenswrapper[4773]: I0120 19:07:31.061597 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5t9qm" podStartSLOduration=1.547008983 podStartE2EDuration="2.061575554s" podCreationTimestamp="2026-01-20 19:07:29 +0000 UTC" firstStartedPulling="2026-01-20 19:07:29.963334169 +0000 UTC m=+2242.885147223" lastFinishedPulling="2026-01-20 19:07:30.47790077 +0000 UTC m=+2243.399713794" observedRunningTime="2026-01-20 19:07:31.055829524 +0000 UTC m=+2243.977642588" watchObservedRunningTime="2026-01-20 19:07:31.061575554 +0000 UTC m=+2243.983388588" Jan 20 19:07:31 crc kubenswrapper[4773]: I0120 19:07:31.926531 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-lnf6n" Jan 20 19:07:31 crc kubenswrapper[4773]: I0120 19:07:31.926807 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-lnf6n" Jan 20 19:07:31 crc kubenswrapper[4773]: I0120 19:07:31.969464 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-lnf6n" Jan 20 19:07:32 crc kubenswrapper[4773]: I0120 19:07:32.086638 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-lnf6n" Jan 20 19:07:32 crc kubenswrapper[4773]: I0120 19:07:32.209545 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lnf6n"] Jan 20 19:07:34 crc kubenswrapper[4773]: I0120 19:07:34.069392 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-lnf6n" podUID="c39f1bba-5f32-4ba9-a7a5-2cacaf0b89a6" containerName="registry-server" containerID="cri-o://44fb6a259730fcc96a908e63359f902f239ef78464d1d23097c3a877aca833b5" gracePeriod=2 Jan 20 19:07:34 crc kubenswrapper[4773]: I0120 19:07:34.488787 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lnf6n" Jan 20 19:07:34 crc kubenswrapper[4773]: I0120 19:07:34.571147 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vd8ll\" (UniqueName: \"kubernetes.io/projected/c39f1bba-5f32-4ba9-a7a5-2cacaf0b89a6-kube-api-access-vd8ll\") pod \"c39f1bba-5f32-4ba9-a7a5-2cacaf0b89a6\" (UID: \"c39f1bba-5f32-4ba9-a7a5-2cacaf0b89a6\") " Jan 20 19:07:34 crc kubenswrapper[4773]: I0120 19:07:34.571318 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c39f1bba-5f32-4ba9-a7a5-2cacaf0b89a6-utilities\") pod \"c39f1bba-5f32-4ba9-a7a5-2cacaf0b89a6\" (UID: \"c39f1bba-5f32-4ba9-a7a5-2cacaf0b89a6\") " Jan 20 19:07:34 crc kubenswrapper[4773]: I0120 19:07:34.571536 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c39f1bba-5f32-4ba9-a7a5-2cacaf0b89a6-catalog-content\") pod \"c39f1bba-5f32-4ba9-a7a5-2cacaf0b89a6\" (UID: \"c39f1bba-5f32-4ba9-a7a5-2cacaf0b89a6\") " Jan 20 19:07:34 crc kubenswrapper[4773]: I0120 19:07:34.575627 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c39f1bba-5f32-4ba9-a7a5-2cacaf0b89a6-utilities" (OuterVolumeSpecName: "utilities") pod "c39f1bba-5f32-4ba9-a7a5-2cacaf0b89a6" (UID: "c39f1bba-5f32-4ba9-a7a5-2cacaf0b89a6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 19:07:34 crc kubenswrapper[4773]: I0120 19:07:34.576683 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c39f1bba-5f32-4ba9-a7a5-2cacaf0b89a6-kube-api-access-vd8ll" (OuterVolumeSpecName: "kube-api-access-vd8ll") pod "c39f1bba-5f32-4ba9-a7a5-2cacaf0b89a6" (UID: "c39f1bba-5f32-4ba9-a7a5-2cacaf0b89a6"). InnerVolumeSpecName "kube-api-access-vd8ll". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 19:07:34 crc kubenswrapper[4773]: I0120 19:07:34.658865 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c39f1bba-5f32-4ba9-a7a5-2cacaf0b89a6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c39f1bba-5f32-4ba9-a7a5-2cacaf0b89a6" (UID: "c39f1bba-5f32-4ba9-a7a5-2cacaf0b89a6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 19:07:34 crc kubenswrapper[4773]: I0120 19:07:34.673903 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vd8ll\" (UniqueName: \"kubernetes.io/projected/c39f1bba-5f32-4ba9-a7a5-2cacaf0b89a6-kube-api-access-vd8ll\") on node \"crc\" DevicePath \"\"" Jan 20 19:07:34 crc kubenswrapper[4773]: I0120 19:07:34.673962 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c39f1bba-5f32-4ba9-a7a5-2cacaf0b89a6-utilities\") on node \"crc\" DevicePath \"\"" Jan 20 19:07:34 crc kubenswrapper[4773]: I0120 19:07:34.673972 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c39f1bba-5f32-4ba9-a7a5-2cacaf0b89a6-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 20 19:07:35 crc kubenswrapper[4773]: I0120 19:07:35.079858 4773 generic.go:334] "Generic (PLEG): container finished" podID="c39f1bba-5f32-4ba9-a7a5-2cacaf0b89a6" containerID="44fb6a259730fcc96a908e63359f902f239ef78464d1d23097c3a877aca833b5" exitCode=0 Jan 20 19:07:35 crc kubenswrapper[4773]: I0120 19:07:35.079898 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lnf6n" event={"ID":"c39f1bba-5f32-4ba9-a7a5-2cacaf0b89a6","Type":"ContainerDied","Data":"44fb6a259730fcc96a908e63359f902f239ef78464d1d23097c3a877aca833b5"} Jan 20 19:07:35 crc kubenswrapper[4773]: I0120 19:07:35.079923 4773 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-lnf6n" event={"ID":"c39f1bba-5f32-4ba9-a7a5-2cacaf0b89a6","Type":"ContainerDied","Data":"2b426e4d44dd06eedea970e43035e827898b4744cce96b3f32658e4da9d10aa5"} Jan 20 19:07:35 crc kubenswrapper[4773]: I0120 19:07:35.079942 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lnf6n" Jan 20 19:07:35 crc kubenswrapper[4773]: I0120 19:07:35.079958 4773 scope.go:117] "RemoveContainer" containerID="44fb6a259730fcc96a908e63359f902f239ef78464d1d23097c3a877aca833b5" Jan 20 19:07:35 crc kubenswrapper[4773]: I0120 19:07:35.104366 4773 scope.go:117] "RemoveContainer" containerID="27cb2fd4550ab340b9204ac2b73b096dde4130e7740124836ee5a1737a850ae5" Jan 20 19:07:35 crc kubenswrapper[4773]: I0120 19:07:35.113365 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-lnf6n"] Jan 20 19:07:35 crc kubenswrapper[4773]: I0120 19:07:35.121357 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-lnf6n"] Jan 20 19:07:35 crc kubenswrapper[4773]: I0120 19:07:35.137515 4773 scope.go:117] "RemoveContainer" containerID="25b86ebf8db6229a69f09e654ab100e437dde7ed70aaa669f7cc8e3639a20c37" Jan 20 19:07:35 crc kubenswrapper[4773]: I0120 19:07:35.164572 4773 scope.go:117] "RemoveContainer" containerID="44fb6a259730fcc96a908e63359f902f239ef78464d1d23097c3a877aca833b5" Jan 20 19:07:35 crc kubenswrapper[4773]: E0120 19:07:35.165080 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44fb6a259730fcc96a908e63359f902f239ef78464d1d23097c3a877aca833b5\": container with ID starting with 44fb6a259730fcc96a908e63359f902f239ef78464d1d23097c3a877aca833b5 not found: ID does not exist" containerID="44fb6a259730fcc96a908e63359f902f239ef78464d1d23097c3a877aca833b5" Jan 20 19:07:35 crc kubenswrapper[4773]: I0120 
19:07:35.165123 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44fb6a259730fcc96a908e63359f902f239ef78464d1d23097c3a877aca833b5"} err="failed to get container status \"44fb6a259730fcc96a908e63359f902f239ef78464d1d23097c3a877aca833b5\": rpc error: code = NotFound desc = could not find container \"44fb6a259730fcc96a908e63359f902f239ef78464d1d23097c3a877aca833b5\": container with ID starting with 44fb6a259730fcc96a908e63359f902f239ef78464d1d23097c3a877aca833b5 not found: ID does not exist" Jan 20 19:07:35 crc kubenswrapper[4773]: I0120 19:07:35.165171 4773 scope.go:117] "RemoveContainer" containerID="27cb2fd4550ab340b9204ac2b73b096dde4130e7740124836ee5a1737a850ae5" Jan 20 19:07:35 crc kubenswrapper[4773]: E0120 19:07:35.165485 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27cb2fd4550ab340b9204ac2b73b096dde4130e7740124836ee5a1737a850ae5\": container with ID starting with 27cb2fd4550ab340b9204ac2b73b096dde4130e7740124836ee5a1737a850ae5 not found: ID does not exist" containerID="27cb2fd4550ab340b9204ac2b73b096dde4130e7740124836ee5a1737a850ae5" Jan 20 19:07:35 crc kubenswrapper[4773]: I0120 19:07:35.165525 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27cb2fd4550ab340b9204ac2b73b096dde4130e7740124836ee5a1737a850ae5"} err="failed to get container status \"27cb2fd4550ab340b9204ac2b73b096dde4130e7740124836ee5a1737a850ae5\": rpc error: code = NotFound desc = could not find container \"27cb2fd4550ab340b9204ac2b73b096dde4130e7740124836ee5a1737a850ae5\": container with ID starting with 27cb2fd4550ab340b9204ac2b73b096dde4130e7740124836ee5a1737a850ae5 not found: ID does not exist" Jan 20 19:07:35 crc kubenswrapper[4773]: I0120 19:07:35.165549 4773 scope.go:117] "RemoveContainer" containerID="25b86ebf8db6229a69f09e654ab100e437dde7ed70aaa669f7cc8e3639a20c37" Jan 20 19:07:35 crc 
kubenswrapper[4773]: E0120 19:07:35.165761 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25b86ebf8db6229a69f09e654ab100e437dde7ed70aaa669f7cc8e3639a20c37\": container with ID starting with 25b86ebf8db6229a69f09e654ab100e437dde7ed70aaa669f7cc8e3639a20c37 not found: ID does not exist" containerID="25b86ebf8db6229a69f09e654ab100e437dde7ed70aaa669f7cc8e3639a20c37" Jan 20 19:07:35 crc kubenswrapper[4773]: I0120 19:07:35.165801 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25b86ebf8db6229a69f09e654ab100e437dde7ed70aaa669f7cc8e3639a20c37"} err="failed to get container status \"25b86ebf8db6229a69f09e654ab100e437dde7ed70aaa669f7cc8e3639a20c37\": rpc error: code = NotFound desc = could not find container \"25b86ebf8db6229a69f09e654ab100e437dde7ed70aaa669f7cc8e3639a20c37\": container with ID starting with 25b86ebf8db6229a69f09e654ab100e437dde7ed70aaa669f7cc8e3639a20c37 not found: ID does not exist" Jan 20 19:07:35 crc kubenswrapper[4773]: I0120 19:07:35.461645 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c39f1bba-5f32-4ba9-a7a5-2cacaf0b89a6" path="/var/lib/kubelet/pods/c39f1bba-5f32-4ba9-a7a5-2cacaf0b89a6/volumes" Jan 20 19:07:42 crc kubenswrapper[4773]: I0120 19:07:42.449669 4773 scope.go:117] "RemoveContainer" containerID="a7dbd0a912d949d6c4eebc0124401dbd7713c051c7fc8f0eefcca46927202be1" Jan 20 19:07:42 crc kubenswrapper[4773]: E0120 19:07:42.450492 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:07:54 crc 
kubenswrapper[4773]: I0120 19:07:54.447918 4773 scope.go:117] "RemoveContainer" containerID="a7dbd0a912d949d6c4eebc0124401dbd7713c051c7fc8f0eefcca46927202be1" Jan 20 19:07:54 crc kubenswrapper[4773]: E0120 19:07:54.450237 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:07:55 crc kubenswrapper[4773]: I0120 19:07:55.276842 4773 generic.go:334] "Generic (PLEG): container finished" podID="a290d892-d26b-4f1c-b4a0-9778e6b58c7b" containerID="2443e916717c02c1bd3b3f4d20d0235dee65b1d8f082a086a4c2b9b4fd541514" exitCode=0 Jan 20 19:07:55 crc kubenswrapper[4773]: I0120 19:07:55.276884 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5t9qm" event={"ID":"a290d892-d26b-4f1c-b4a0-9778e6b58c7b","Type":"ContainerDied","Data":"2443e916717c02c1bd3b3f4d20d0235dee65b1d8f082a086a4c2b9b4fd541514"} Jan 20 19:07:56 crc kubenswrapper[4773]: I0120 19:07:56.689355 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5t9qm" Jan 20 19:07:56 crc kubenswrapper[4773]: I0120 19:07:56.781970 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a290d892-d26b-4f1c-b4a0-9778e6b58c7b-ssh-key-openstack-edpm-ipam\") pod \"a290d892-d26b-4f1c-b4a0-9778e6b58c7b\" (UID: \"a290d892-d26b-4f1c-b4a0-9778e6b58c7b\") " Jan 20 19:07:56 crc kubenswrapper[4773]: I0120 19:07:56.782046 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htg4g\" (UniqueName: \"kubernetes.io/projected/a290d892-d26b-4f1c-b4a0-9778e6b58c7b-kube-api-access-htg4g\") pod \"a290d892-d26b-4f1c-b4a0-9778e6b58c7b\" (UID: \"a290d892-d26b-4f1c-b4a0-9778e6b58c7b\") " Jan 20 19:07:56 crc kubenswrapper[4773]: I0120 19:07:56.782096 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a290d892-d26b-4f1c-b4a0-9778e6b58c7b-ceph\") pod \"a290d892-d26b-4f1c-b4a0-9778e6b58c7b\" (UID: \"a290d892-d26b-4f1c-b4a0-9778e6b58c7b\") " Jan 20 19:07:56 crc kubenswrapper[4773]: I0120 19:07:56.782232 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a290d892-d26b-4f1c-b4a0-9778e6b58c7b-inventory\") pod \"a290d892-d26b-4f1c-b4a0-9778e6b58c7b\" (UID: \"a290d892-d26b-4f1c-b4a0-9778e6b58c7b\") " Jan 20 19:07:56 crc kubenswrapper[4773]: I0120 19:07:56.787766 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a290d892-d26b-4f1c-b4a0-9778e6b58c7b-kube-api-access-htg4g" (OuterVolumeSpecName: "kube-api-access-htg4g") pod "a290d892-d26b-4f1c-b4a0-9778e6b58c7b" (UID: "a290d892-d26b-4f1c-b4a0-9778e6b58c7b"). InnerVolumeSpecName "kube-api-access-htg4g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 19:07:56 crc kubenswrapper[4773]: I0120 19:07:56.788069 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a290d892-d26b-4f1c-b4a0-9778e6b58c7b-ceph" (OuterVolumeSpecName: "ceph") pod "a290d892-d26b-4f1c-b4a0-9778e6b58c7b" (UID: "a290d892-d26b-4f1c-b4a0-9778e6b58c7b"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:07:56 crc kubenswrapper[4773]: I0120 19:07:56.808513 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a290d892-d26b-4f1c-b4a0-9778e6b58c7b-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "a290d892-d26b-4f1c-b4a0-9778e6b58c7b" (UID: "a290d892-d26b-4f1c-b4a0-9778e6b58c7b"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:07:56 crc kubenswrapper[4773]: I0120 19:07:56.811069 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a290d892-d26b-4f1c-b4a0-9778e6b58c7b-inventory" (OuterVolumeSpecName: "inventory") pod "a290d892-d26b-4f1c-b4a0-9778e6b58c7b" (UID: "a290d892-d26b-4f1c-b4a0-9778e6b58c7b"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:07:56 crc kubenswrapper[4773]: I0120 19:07:56.883777 4773 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a290d892-d26b-4f1c-b4a0-9778e6b58c7b-ceph\") on node \"crc\" DevicePath \"\"" Jan 20 19:07:56 crc kubenswrapper[4773]: I0120 19:07:56.883806 4773 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a290d892-d26b-4f1c-b4a0-9778e6b58c7b-inventory\") on node \"crc\" DevicePath \"\"" Jan 20 19:07:56 crc kubenswrapper[4773]: I0120 19:07:56.883820 4773 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a290d892-d26b-4f1c-b4a0-9778e6b58c7b-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 20 19:07:56 crc kubenswrapper[4773]: I0120 19:07:56.883833 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htg4g\" (UniqueName: \"kubernetes.io/projected/a290d892-d26b-4f1c-b4a0-9778e6b58c7b-kube-api-access-htg4g\") on node \"crc\" DevicePath \"\"" Jan 20 19:07:57 crc kubenswrapper[4773]: I0120 19:07:57.310572 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5t9qm" event={"ID":"a290d892-d26b-4f1c-b4a0-9778e6b58c7b","Type":"ContainerDied","Data":"eb23a023b2ee7db79ca1f3f19c6a08cc980f5ebaa1760abf62846d8e16bae762"} Jan 20 19:07:57 crc kubenswrapper[4773]: I0120 19:07:57.310642 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eb23a023b2ee7db79ca1f3f19c6a08cc980f5ebaa1760abf62846d8e16bae762" Jan 20 19:07:57 crc kubenswrapper[4773]: I0120 19:07:57.310733 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-5t9qm" Jan 20 19:07:57 crc kubenswrapper[4773]: I0120 19:07:57.382255 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-k7bxl"] Jan 20 19:07:57 crc kubenswrapper[4773]: E0120 19:07:57.383255 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c39f1bba-5f32-4ba9-a7a5-2cacaf0b89a6" containerName="registry-server" Jan 20 19:07:57 crc kubenswrapper[4773]: I0120 19:07:57.383279 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="c39f1bba-5f32-4ba9-a7a5-2cacaf0b89a6" containerName="registry-server" Jan 20 19:07:57 crc kubenswrapper[4773]: E0120 19:07:57.383299 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c39f1bba-5f32-4ba9-a7a5-2cacaf0b89a6" containerName="extract-content" Jan 20 19:07:57 crc kubenswrapper[4773]: I0120 19:07:57.383306 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="c39f1bba-5f32-4ba9-a7a5-2cacaf0b89a6" containerName="extract-content" Jan 20 19:07:57 crc kubenswrapper[4773]: E0120 19:07:57.383324 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a290d892-d26b-4f1c-b4a0-9778e6b58c7b" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Jan 20 19:07:57 crc kubenswrapper[4773]: I0120 19:07:57.383336 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="a290d892-d26b-4f1c-b4a0-9778e6b58c7b" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Jan 20 19:07:57 crc kubenswrapper[4773]: E0120 19:07:57.383351 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c39f1bba-5f32-4ba9-a7a5-2cacaf0b89a6" containerName="extract-utilities" Jan 20 19:07:57 crc kubenswrapper[4773]: I0120 19:07:57.383359 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="c39f1bba-5f32-4ba9-a7a5-2cacaf0b89a6" containerName="extract-utilities" Jan 20 19:07:57 crc 
kubenswrapper[4773]: I0120 19:07:57.383611 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="a290d892-d26b-4f1c-b4a0-9778e6b58c7b" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Jan 20 19:07:57 crc kubenswrapper[4773]: I0120 19:07:57.383634 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="c39f1bba-5f32-4ba9-a7a5-2cacaf0b89a6" containerName="registry-server" Jan 20 19:07:57 crc kubenswrapper[4773]: I0120 19:07:57.384438 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-k7bxl" Jan 20 19:07:57 crc kubenswrapper[4773]: I0120 19:07:57.389679 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 20 19:07:57 crc kubenswrapper[4773]: I0120 19:07:57.390024 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rzxsv" Jan 20 19:07:57 crc kubenswrapper[4773]: I0120 19:07:57.390170 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Jan 20 19:07:57 crc kubenswrapper[4773]: I0120 19:07:57.393238 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 20 19:07:57 crc kubenswrapper[4773]: I0120 19:07:57.393378 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 20 19:07:57 crc kubenswrapper[4773]: I0120 19:07:57.398542 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-k7bxl"] Jan 20 19:07:57 crc kubenswrapper[4773]: I0120 19:07:57.501297 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9114481d-74c0-4af1-9bed-3f592f2c102f-inventory\") pod 
\"validate-network-edpm-deployment-openstack-edpm-ipam-k7bxl\" (UID: \"9114481d-74c0-4af1-9bed-3f592f2c102f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-k7bxl" Jan 20 19:07:57 crc kubenswrapper[4773]: I0120 19:07:57.501376 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9114481d-74c0-4af1-9bed-3f592f2c102f-ceph\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-k7bxl\" (UID: \"9114481d-74c0-4af1-9bed-3f592f2c102f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-k7bxl" Jan 20 19:07:57 crc kubenswrapper[4773]: I0120 19:07:57.501395 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59kzd\" (UniqueName: \"kubernetes.io/projected/9114481d-74c0-4af1-9bed-3f592f2c102f-kube-api-access-59kzd\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-k7bxl\" (UID: \"9114481d-74c0-4af1-9bed-3f592f2c102f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-k7bxl" Jan 20 19:07:57 crc kubenswrapper[4773]: I0120 19:07:57.501481 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9114481d-74c0-4af1-9bed-3f592f2c102f-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-k7bxl\" (UID: \"9114481d-74c0-4af1-9bed-3f592f2c102f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-k7bxl" Jan 20 19:07:57 crc kubenswrapper[4773]: I0120 19:07:57.603647 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59kzd\" (UniqueName: \"kubernetes.io/projected/9114481d-74c0-4af1-9bed-3f592f2c102f-kube-api-access-59kzd\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-k7bxl\" (UID: \"9114481d-74c0-4af1-9bed-3f592f2c102f\") " 
pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-k7bxl" Jan 20 19:07:57 crc kubenswrapper[4773]: I0120 19:07:57.604102 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9114481d-74c0-4af1-9bed-3f592f2c102f-ceph\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-k7bxl\" (UID: \"9114481d-74c0-4af1-9bed-3f592f2c102f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-k7bxl" Jan 20 19:07:57 crc kubenswrapper[4773]: I0120 19:07:57.604467 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9114481d-74c0-4af1-9bed-3f592f2c102f-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-k7bxl\" (UID: \"9114481d-74c0-4af1-9bed-3f592f2c102f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-k7bxl" Jan 20 19:07:57 crc kubenswrapper[4773]: I0120 19:07:57.604800 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9114481d-74c0-4af1-9bed-3f592f2c102f-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-k7bxl\" (UID: \"9114481d-74c0-4af1-9bed-3f592f2c102f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-k7bxl" Jan 20 19:07:57 crc kubenswrapper[4773]: I0120 19:07:57.609840 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9114481d-74c0-4af1-9bed-3f592f2c102f-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-k7bxl\" (UID: \"9114481d-74c0-4af1-9bed-3f592f2c102f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-k7bxl" Jan 20 19:07:57 crc kubenswrapper[4773]: I0120 19:07:57.610075 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9114481d-74c0-4af1-9bed-3f592f2c102f-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-k7bxl\" (UID: \"9114481d-74c0-4af1-9bed-3f592f2c102f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-k7bxl" Jan 20 19:07:57 crc kubenswrapper[4773]: I0120 19:07:57.611881 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9114481d-74c0-4af1-9bed-3f592f2c102f-ceph\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-k7bxl\" (UID: \"9114481d-74c0-4af1-9bed-3f592f2c102f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-k7bxl" Jan 20 19:07:57 crc kubenswrapper[4773]: I0120 19:07:57.622905 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59kzd\" (UniqueName: \"kubernetes.io/projected/9114481d-74c0-4af1-9bed-3f592f2c102f-kube-api-access-59kzd\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-k7bxl\" (UID: \"9114481d-74c0-4af1-9bed-3f592f2c102f\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-k7bxl" Jan 20 19:07:57 crc kubenswrapper[4773]: I0120 19:07:57.703697 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-k7bxl"
Jan 20 19:07:58 crc kubenswrapper[4773]: I0120 19:07:58.216295 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-k7bxl"]
Jan 20 19:07:58 crc kubenswrapper[4773]: I0120 19:07:58.319515 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-k7bxl" event={"ID":"9114481d-74c0-4af1-9bed-3f592f2c102f","Type":"ContainerStarted","Data":"84ccae1731492f36f27032bf4fdcd471087a7fba072bc56caf5eecbcef82796f"}
Jan 20 19:07:59 crc kubenswrapper[4773]: I0120 19:07:59.332427 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-k7bxl" event={"ID":"9114481d-74c0-4af1-9bed-3f592f2c102f","Type":"ContainerStarted","Data":"19321b41aa7d9fc5a04b9a1c384913e6b0d9ca3c0b752d366cdf04cfc8e54db0"}
Jan 20 19:07:59 crc kubenswrapper[4773]: I0120 19:07:59.357068 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-k7bxl" podStartSLOduration=1.792651108 podStartE2EDuration="2.35704771s" podCreationTimestamp="2026-01-20 19:07:57 +0000 UTC" firstStartedPulling="2026-01-20 19:07:58.22107237 +0000 UTC m=+2271.142885394" lastFinishedPulling="2026-01-20 19:07:58.785468972 +0000 UTC m=+2271.707281996" observedRunningTime="2026-01-20 19:07:59.349145236 +0000 UTC m=+2272.270958290" watchObservedRunningTime="2026-01-20 19:07:59.35704771 +0000 UTC m=+2272.278860734"
Jan 20 19:08:04 crc kubenswrapper[4773]: I0120 19:08:04.372311 4773 generic.go:334] "Generic (PLEG): container finished" podID="9114481d-74c0-4af1-9bed-3f592f2c102f" containerID="19321b41aa7d9fc5a04b9a1c384913e6b0d9ca3c0b752d366cdf04cfc8e54db0" exitCode=0
Jan 20 19:08:04 crc kubenswrapper[4773]: I0120 19:08:04.372426 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-k7bxl" event={"ID":"9114481d-74c0-4af1-9bed-3f592f2c102f","Type":"ContainerDied","Data":"19321b41aa7d9fc5a04b9a1c384913e6b0d9ca3c0b752d366cdf04cfc8e54db0"}
Jan 20 19:08:05 crc kubenswrapper[4773]: I0120 19:08:05.774560 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-k7bxl"
Jan 20 19:08:05 crc kubenswrapper[4773]: I0120 19:08:05.958904 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9114481d-74c0-4af1-9bed-3f592f2c102f-ssh-key-openstack-edpm-ipam\") pod \"9114481d-74c0-4af1-9bed-3f592f2c102f\" (UID: \"9114481d-74c0-4af1-9bed-3f592f2c102f\") "
Jan 20 19:08:05 crc kubenswrapper[4773]: I0120 19:08:05.959063 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-59kzd\" (UniqueName: \"kubernetes.io/projected/9114481d-74c0-4af1-9bed-3f592f2c102f-kube-api-access-59kzd\") pod \"9114481d-74c0-4af1-9bed-3f592f2c102f\" (UID: \"9114481d-74c0-4af1-9bed-3f592f2c102f\") "
Jan 20 19:08:05 crc kubenswrapper[4773]: I0120 19:08:05.959116 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9114481d-74c0-4af1-9bed-3f592f2c102f-ceph\") pod \"9114481d-74c0-4af1-9bed-3f592f2c102f\" (UID: \"9114481d-74c0-4af1-9bed-3f592f2c102f\") "
Jan 20 19:08:05 crc kubenswrapper[4773]: I0120 19:08:05.959207 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9114481d-74c0-4af1-9bed-3f592f2c102f-inventory\") pod \"9114481d-74c0-4af1-9bed-3f592f2c102f\" (UID: \"9114481d-74c0-4af1-9bed-3f592f2c102f\") "
Jan 20 19:08:05 crc kubenswrapper[4773]: I0120 19:08:05.964657 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9114481d-74c0-4af1-9bed-3f592f2c102f-ceph" (OuterVolumeSpecName: "ceph") pod "9114481d-74c0-4af1-9bed-3f592f2c102f" (UID: "9114481d-74c0-4af1-9bed-3f592f2c102f"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 20 19:08:05 crc kubenswrapper[4773]: I0120 19:08:05.966425 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9114481d-74c0-4af1-9bed-3f592f2c102f-kube-api-access-59kzd" (OuterVolumeSpecName: "kube-api-access-59kzd") pod "9114481d-74c0-4af1-9bed-3f592f2c102f" (UID: "9114481d-74c0-4af1-9bed-3f592f2c102f"). InnerVolumeSpecName "kube-api-access-59kzd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 20 19:08:05 crc kubenswrapper[4773]: I0120 19:08:05.984618 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9114481d-74c0-4af1-9bed-3f592f2c102f-inventory" (OuterVolumeSpecName: "inventory") pod "9114481d-74c0-4af1-9bed-3f592f2c102f" (UID: "9114481d-74c0-4af1-9bed-3f592f2c102f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 20 19:08:06 crc kubenswrapper[4773]: I0120 19:08:06.009328 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9114481d-74c0-4af1-9bed-3f592f2c102f-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "9114481d-74c0-4af1-9bed-3f592f2c102f" (UID: "9114481d-74c0-4af1-9bed-3f592f2c102f"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 20 19:08:06 crc kubenswrapper[4773]: I0120 19:08:06.061315 4773 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/9114481d-74c0-4af1-9bed-3f592f2c102f-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Jan 20 19:08:06 crc kubenswrapper[4773]: I0120 19:08:06.061360 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-59kzd\" (UniqueName: \"kubernetes.io/projected/9114481d-74c0-4af1-9bed-3f592f2c102f-kube-api-access-59kzd\") on node \"crc\" DevicePath \"\""
Jan 20 19:08:06 crc kubenswrapper[4773]: I0120 19:08:06.061373 4773 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/9114481d-74c0-4af1-9bed-3f592f2c102f-ceph\") on node \"crc\" DevicePath \"\""
Jan 20 19:08:06 crc kubenswrapper[4773]: I0120 19:08:06.061384 4773 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/9114481d-74c0-4af1-9bed-3f592f2c102f-inventory\") on node \"crc\" DevicePath \"\""
Jan 20 19:08:06 crc kubenswrapper[4773]: I0120 19:08:06.388303 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-k7bxl" event={"ID":"9114481d-74c0-4af1-9bed-3f592f2c102f","Type":"ContainerDied","Data":"84ccae1731492f36f27032bf4fdcd471087a7fba072bc56caf5eecbcef82796f"}
Jan 20 19:08:06 crc kubenswrapper[4773]: I0120 19:08:06.388339 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="84ccae1731492f36f27032bf4fdcd471087a7fba072bc56caf5eecbcef82796f"
Jan 20 19:08:06 crc kubenswrapper[4773]: I0120 19:08:06.388653 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-k7bxl"
Jan 20 19:08:06 crc kubenswrapper[4773]: I0120 19:08:06.466172 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-h7lxl"]
Jan 20 19:08:06 crc kubenswrapper[4773]: E0120 19:08:06.466587 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9114481d-74c0-4af1-9bed-3f592f2c102f" containerName="validate-network-edpm-deployment-openstack-edpm-ipam"
Jan 20 19:08:06 crc kubenswrapper[4773]: I0120 19:08:06.466613 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="9114481d-74c0-4af1-9bed-3f592f2c102f" containerName="validate-network-edpm-deployment-openstack-edpm-ipam"
Jan 20 19:08:06 crc kubenswrapper[4773]: I0120 19:08:06.466847 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="9114481d-74c0-4af1-9bed-3f592f2c102f" containerName="validate-network-edpm-deployment-openstack-edpm-ipam"
Jan 20 19:08:06 crc kubenswrapper[4773]: I0120 19:08:06.474439 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-h7lxl"
Jan 20 19:08:06 crc kubenswrapper[4773]: I0120 19:08:06.477375 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files"
Jan 20 19:08:06 crc kubenswrapper[4773]: I0120 19:08:06.477823 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Jan 20 19:08:06 crc kubenswrapper[4773]: I0120 19:08:06.478294 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Jan 20 19:08:06 crc kubenswrapper[4773]: I0120 19:08:06.479054 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Jan 20 19:08:06 crc kubenswrapper[4773]: I0120 19:08:06.479409 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rzxsv"
Jan 20 19:08:06 crc kubenswrapper[4773]: I0120 19:08:06.479626 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-h7lxl"]
Jan 20 19:08:06 crc kubenswrapper[4773]: I0120 19:08:06.570628 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/00dc0471-09f0-4cdf-a237-aba1d232cf04-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-h7lxl\" (UID: \"00dc0471-09f0-4cdf-a237-aba1d232cf04\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-h7lxl"
Jan 20 19:08:06 crc kubenswrapper[4773]: I0120 19:08:06.570789 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/00dc0471-09f0-4cdf-a237-aba1d232cf04-ceph\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-h7lxl\" (UID: \"00dc0471-09f0-4cdf-a237-aba1d232cf04\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-h7lxl"
Jan 20 19:08:06 crc kubenswrapper[4773]: I0120 19:08:06.571154 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/00dc0471-09f0-4cdf-a237-aba1d232cf04-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-h7lxl\" (UID: \"00dc0471-09f0-4cdf-a237-aba1d232cf04\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-h7lxl"
Jan 20 19:08:06 crc kubenswrapper[4773]: I0120 19:08:06.571238 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7c8z4\" (UniqueName: \"kubernetes.io/projected/00dc0471-09f0-4cdf-a237-aba1d232cf04-kube-api-access-7c8z4\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-h7lxl\" (UID: \"00dc0471-09f0-4cdf-a237-aba1d232cf04\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-h7lxl"
Jan 20 19:08:06 crc kubenswrapper[4773]: I0120 19:08:06.673581 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/00dc0471-09f0-4cdf-a237-aba1d232cf04-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-h7lxl\" (UID: \"00dc0471-09f0-4cdf-a237-aba1d232cf04\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-h7lxl"
Jan 20 19:08:06 crc kubenswrapper[4773]: I0120 19:08:06.674028 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7c8z4\" (UniqueName: \"kubernetes.io/projected/00dc0471-09f0-4cdf-a237-aba1d232cf04-kube-api-access-7c8z4\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-h7lxl\" (UID: \"00dc0471-09f0-4cdf-a237-aba1d232cf04\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-h7lxl"
Jan 20 19:08:06 crc kubenswrapper[4773]: I0120 19:08:06.674483 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/00dc0471-09f0-4cdf-a237-aba1d232cf04-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-h7lxl\" (UID: \"00dc0471-09f0-4cdf-a237-aba1d232cf04\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-h7lxl"
Jan 20 19:08:06 crc kubenswrapper[4773]: I0120 19:08:06.676526 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/00dc0471-09f0-4cdf-a237-aba1d232cf04-ceph\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-h7lxl\" (UID: \"00dc0471-09f0-4cdf-a237-aba1d232cf04\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-h7lxl"
Jan 20 19:08:06 crc kubenswrapper[4773]: I0120 19:08:06.677562 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/00dc0471-09f0-4cdf-a237-aba1d232cf04-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-h7lxl\" (UID: \"00dc0471-09f0-4cdf-a237-aba1d232cf04\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-h7lxl"
Jan 20 19:08:06 crc kubenswrapper[4773]: I0120 19:08:06.679360 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/00dc0471-09f0-4cdf-a237-aba1d232cf04-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-h7lxl\" (UID: \"00dc0471-09f0-4cdf-a237-aba1d232cf04\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-h7lxl"
Jan 20 19:08:06 crc kubenswrapper[4773]: I0120 19:08:06.681395 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/00dc0471-09f0-4cdf-a237-aba1d232cf04-ceph\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-h7lxl\" (UID: \"00dc0471-09f0-4cdf-a237-aba1d232cf04\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-h7lxl"
Jan 20 19:08:06 crc kubenswrapper[4773]: I0120 19:08:06.692427 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7c8z4\" (UniqueName: \"kubernetes.io/projected/00dc0471-09f0-4cdf-a237-aba1d232cf04-kube-api-access-7c8z4\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-h7lxl\" (UID: \"00dc0471-09f0-4cdf-a237-aba1d232cf04\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-h7lxl"
Jan 20 19:08:06 crc kubenswrapper[4773]: I0120 19:08:06.795705 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-h7lxl"
Jan 20 19:08:07 crc kubenswrapper[4773]: I0120 19:08:07.316246 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-h7lxl"]
Jan 20 19:08:07 crc kubenswrapper[4773]: I0120 19:08:07.396496 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-h7lxl" event={"ID":"00dc0471-09f0-4cdf-a237-aba1d232cf04","Type":"ContainerStarted","Data":"6667a757ef94bda49015d31ad7f83f3febee632cb2bdbfa4b633256b03dab1fe"}
Jan 20 19:08:07 crc kubenswrapper[4773]: I0120 19:08:07.455198 4773 scope.go:117] "RemoveContainer" containerID="a7dbd0a912d949d6c4eebc0124401dbd7713c051c7fc8f0eefcca46927202be1"
Jan 20 19:08:07 crc kubenswrapper[4773]: E0120 19:08:07.455436 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621"
Jan 20 19:08:07 crc kubenswrapper[4773]: I0120 19:08:07.833519 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Jan 20 19:08:08 crc kubenswrapper[4773]: I0120 19:08:08.405745 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-h7lxl" event={"ID":"00dc0471-09f0-4cdf-a237-aba1d232cf04","Type":"ContainerStarted","Data":"016b7161255906b165381005e4b64f481adda489350b3e8c81d4fc7dd7ae683a"}
Jan 20 19:08:08 crc kubenswrapper[4773]: I0120 19:08:08.429386 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-h7lxl" podStartSLOduration=1.927866182 podStartE2EDuration="2.429360363s" podCreationTimestamp="2026-01-20 19:08:06 +0000 UTC" firstStartedPulling="2026-01-20 19:08:07.329379006 +0000 UTC m=+2280.251192030" lastFinishedPulling="2026-01-20 19:08:07.830873187 +0000 UTC m=+2280.752686211" observedRunningTime="2026-01-20 19:08:08.422574146 +0000 UTC m=+2281.344387260" watchObservedRunningTime="2026-01-20 19:08:08.429360363 +0000 UTC m=+2281.351173397"
Jan 20 19:08:20 crc kubenswrapper[4773]: I0120 19:08:20.447636 4773 scope.go:117] "RemoveContainer" containerID="a7dbd0a912d949d6c4eebc0124401dbd7713c051c7fc8f0eefcca46927202be1"
Jan 20 19:08:20 crc kubenswrapper[4773]: E0120 19:08:20.448777 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621"
Jan 20 19:08:33 crc kubenswrapper[4773]: I0120 19:08:33.447164 4773 scope.go:117] "RemoveContainer" containerID="a7dbd0a912d949d6c4eebc0124401dbd7713c051c7fc8f0eefcca46927202be1"
Jan 20 19:08:33 crc kubenswrapper[4773]: E0120 19:08:33.449341 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621"
Jan 20 19:08:43 crc kubenswrapper[4773]: I0120 19:08:43.703927 4773 generic.go:334] "Generic (PLEG): container finished" podID="00dc0471-09f0-4cdf-a237-aba1d232cf04" containerID="016b7161255906b165381005e4b64f481adda489350b3e8c81d4fc7dd7ae683a" exitCode=0
Jan 20 19:08:43 crc kubenswrapper[4773]: I0120 19:08:43.703978 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-h7lxl" event={"ID":"00dc0471-09f0-4cdf-a237-aba1d232cf04","Type":"ContainerDied","Data":"016b7161255906b165381005e4b64f481adda489350b3e8c81d4fc7dd7ae683a"}
Jan 20 19:08:44 crc kubenswrapper[4773]: I0120 19:08:44.446643 4773 scope.go:117] "RemoveContainer" containerID="a7dbd0a912d949d6c4eebc0124401dbd7713c051c7fc8f0eefcca46927202be1"
Jan 20 19:08:44 crc kubenswrapper[4773]: E0120 19:08:44.446926 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621"
Jan 20 19:08:45 crc kubenswrapper[4773]: I0120 19:08:45.148463 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-h7lxl"
Jan 20 19:08:45 crc kubenswrapper[4773]: I0120 19:08:45.268686 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/00dc0471-09f0-4cdf-a237-aba1d232cf04-ceph\") pod \"00dc0471-09f0-4cdf-a237-aba1d232cf04\" (UID: \"00dc0471-09f0-4cdf-a237-aba1d232cf04\") "
Jan 20 19:08:45 crc kubenswrapper[4773]: I0120 19:08:45.268776 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/00dc0471-09f0-4cdf-a237-aba1d232cf04-ssh-key-openstack-edpm-ipam\") pod \"00dc0471-09f0-4cdf-a237-aba1d232cf04\" (UID: \"00dc0471-09f0-4cdf-a237-aba1d232cf04\") "
Jan 20 19:08:45 crc kubenswrapper[4773]: I0120 19:08:45.268811 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/00dc0471-09f0-4cdf-a237-aba1d232cf04-inventory\") pod \"00dc0471-09f0-4cdf-a237-aba1d232cf04\" (UID: \"00dc0471-09f0-4cdf-a237-aba1d232cf04\") "
Jan 20 19:08:45 crc kubenswrapper[4773]: I0120 19:08:45.269130 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c8z4\" (UniqueName: \"kubernetes.io/projected/00dc0471-09f0-4cdf-a237-aba1d232cf04-kube-api-access-7c8z4\") pod \"00dc0471-09f0-4cdf-a237-aba1d232cf04\" (UID: \"00dc0471-09f0-4cdf-a237-aba1d232cf04\") "
Jan 20 19:08:45 crc kubenswrapper[4773]: I0120 19:08:45.275259 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00dc0471-09f0-4cdf-a237-aba1d232cf04-ceph" (OuterVolumeSpecName: "ceph") pod "00dc0471-09f0-4cdf-a237-aba1d232cf04" (UID: "00dc0471-09f0-4cdf-a237-aba1d232cf04"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 20 19:08:45 crc kubenswrapper[4773]: I0120 19:08:45.275710 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00dc0471-09f0-4cdf-a237-aba1d232cf04-kube-api-access-7c8z4" (OuterVolumeSpecName: "kube-api-access-7c8z4") pod "00dc0471-09f0-4cdf-a237-aba1d232cf04" (UID: "00dc0471-09f0-4cdf-a237-aba1d232cf04"). InnerVolumeSpecName "kube-api-access-7c8z4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 20 19:08:45 crc kubenswrapper[4773]: I0120 19:08:45.302847 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00dc0471-09f0-4cdf-a237-aba1d232cf04-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "00dc0471-09f0-4cdf-a237-aba1d232cf04" (UID: "00dc0471-09f0-4cdf-a237-aba1d232cf04"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 20 19:08:45 crc kubenswrapper[4773]: I0120 19:08:45.334539 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00dc0471-09f0-4cdf-a237-aba1d232cf04-inventory" (OuterVolumeSpecName: "inventory") pod "00dc0471-09f0-4cdf-a237-aba1d232cf04" (UID: "00dc0471-09f0-4cdf-a237-aba1d232cf04"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 20 19:08:45 crc kubenswrapper[4773]: I0120 19:08:45.372065 4773 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/00dc0471-09f0-4cdf-a237-aba1d232cf04-ceph\") on node \"crc\" DevicePath \"\""
Jan 20 19:08:45 crc kubenswrapper[4773]: I0120 19:08:45.372150 4773 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/00dc0471-09f0-4cdf-a237-aba1d232cf04-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Jan 20 19:08:45 crc kubenswrapper[4773]: I0120 19:08:45.372182 4773 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/00dc0471-09f0-4cdf-a237-aba1d232cf04-inventory\") on node \"crc\" DevicePath \"\""
Jan 20 19:08:45 crc kubenswrapper[4773]: I0120 19:08:45.372206 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c8z4\" (UniqueName: \"kubernetes.io/projected/00dc0471-09f0-4cdf-a237-aba1d232cf04-kube-api-access-7c8z4\") on node \"crc\" DevicePath \"\""
Jan 20 19:08:45 crc kubenswrapper[4773]: I0120 19:08:45.726588 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-h7lxl" event={"ID":"00dc0471-09f0-4cdf-a237-aba1d232cf04","Type":"ContainerDied","Data":"6667a757ef94bda49015d31ad7f83f3febee632cb2bdbfa4b633256b03dab1fe"}
Jan 20 19:08:45 crc kubenswrapper[4773]: I0120 19:08:45.727019 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6667a757ef94bda49015d31ad7f83f3febee632cb2bdbfa4b633256b03dab1fe"
Jan 20 19:08:45 crc kubenswrapper[4773]: I0120 19:08:45.726657 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-h7lxl"
Jan 20 19:08:45 crc kubenswrapper[4773]: I0120 19:08:45.813857 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jdhj7"]
Jan 20 19:08:45 crc kubenswrapper[4773]: E0120 19:08:45.814288 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="00dc0471-09f0-4cdf-a237-aba1d232cf04" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Jan 20 19:08:45 crc kubenswrapper[4773]: I0120 19:08:45.814313 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="00dc0471-09f0-4cdf-a237-aba1d232cf04" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Jan 20 19:08:45 crc kubenswrapper[4773]: I0120 19:08:45.814559 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="00dc0471-09f0-4cdf-a237-aba1d232cf04" containerName="install-os-edpm-deployment-openstack-edpm-ipam"
Jan 20 19:08:45 crc kubenswrapper[4773]: I0120 19:08:45.815265 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jdhj7"
Jan 20 19:08:45 crc kubenswrapper[4773]: I0120 19:08:45.817788 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files"
Jan 20 19:08:45 crc kubenswrapper[4773]: I0120 19:08:45.817821 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Jan 20 19:08:45 crc kubenswrapper[4773]: I0120 19:08:45.818065 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Jan 20 19:08:45 crc kubenswrapper[4773]: I0120 19:08:45.818491 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rzxsv"
Jan 20 19:08:45 crc kubenswrapper[4773]: I0120 19:08:45.818668 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Jan 20 19:08:45 crc kubenswrapper[4773]: I0120 19:08:45.825833 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jdhj7"]
Jan 20 19:08:45 crc kubenswrapper[4773]: I0120 19:08:45.982183 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqff7\" (UniqueName: \"kubernetes.io/projected/e1492d77-23f5-4ed0-9511-e5b4ee1107c7-kube-api-access-lqff7\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jdhj7\" (UID: \"e1492d77-23f5-4ed0-9511-e5b4ee1107c7\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jdhj7"
Jan 20 19:08:45 crc kubenswrapper[4773]: I0120 19:08:45.982243 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e1492d77-23f5-4ed0-9511-e5b4ee1107c7-ceph\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jdhj7\" (UID: \"e1492d77-23f5-4ed0-9511-e5b4ee1107c7\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jdhj7"
Jan 20 19:08:45 crc kubenswrapper[4773]: I0120 19:08:45.982296 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e1492d77-23f5-4ed0-9511-e5b4ee1107c7-ssh-key-openstack-edpm-ipam\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jdhj7\" (UID: \"e1492d77-23f5-4ed0-9511-e5b4ee1107c7\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jdhj7"
Jan 20 19:08:45 crc kubenswrapper[4773]: I0120 19:08:45.982430 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e1492d77-23f5-4ed0-9511-e5b4ee1107c7-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jdhj7\" (UID: \"e1492d77-23f5-4ed0-9511-e5b4ee1107c7\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jdhj7"
Jan 20 19:08:46 crc kubenswrapper[4773]: I0120 19:08:46.084106 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e1492d77-23f5-4ed0-9511-e5b4ee1107c7-ssh-key-openstack-edpm-ipam\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jdhj7\" (UID: \"e1492d77-23f5-4ed0-9511-e5b4ee1107c7\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jdhj7"
Jan 20 19:08:46 crc kubenswrapper[4773]: I0120 19:08:46.084233 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e1492d77-23f5-4ed0-9511-e5b4ee1107c7-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jdhj7\" (UID: \"e1492d77-23f5-4ed0-9511-e5b4ee1107c7\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jdhj7"
Jan 20 19:08:46 crc kubenswrapper[4773]: I0120 19:08:46.084270 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lqff7\" (UniqueName: \"kubernetes.io/projected/e1492d77-23f5-4ed0-9511-e5b4ee1107c7-kube-api-access-lqff7\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jdhj7\" (UID: \"e1492d77-23f5-4ed0-9511-e5b4ee1107c7\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jdhj7"
Jan 20 19:08:46 crc kubenswrapper[4773]: I0120 19:08:46.084293 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e1492d77-23f5-4ed0-9511-e5b4ee1107c7-ceph\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jdhj7\" (UID: \"e1492d77-23f5-4ed0-9511-e5b4ee1107c7\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jdhj7"
Jan 20 19:08:46 crc kubenswrapper[4773]: I0120 19:08:46.088752 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e1492d77-23f5-4ed0-9511-e5b4ee1107c7-ssh-key-openstack-edpm-ipam\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jdhj7\" (UID: \"e1492d77-23f5-4ed0-9511-e5b4ee1107c7\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jdhj7"
Jan 20 19:08:46 crc kubenswrapper[4773]: I0120 19:08:46.091356 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e1492d77-23f5-4ed0-9511-e5b4ee1107c7-ceph\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jdhj7\" (UID: \"e1492d77-23f5-4ed0-9511-e5b4ee1107c7\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jdhj7"
Jan 20 19:08:46 crc kubenswrapper[4773]: I0120 19:08:46.092296 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e1492d77-23f5-4ed0-9511-e5b4ee1107c7-inventory\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jdhj7\" (UID: \"e1492d77-23f5-4ed0-9511-e5b4ee1107c7\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jdhj7"
Jan 20 19:08:46 crc kubenswrapper[4773]: I0120 19:08:46.106237 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqff7\" (UniqueName: \"kubernetes.io/projected/e1492d77-23f5-4ed0-9511-e5b4ee1107c7-kube-api-access-lqff7\") pod \"ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jdhj7\" (UID: \"e1492d77-23f5-4ed0-9511-e5b4ee1107c7\") " pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jdhj7"
Jan 20 19:08:46 crc kubenswrapper[4773]: I0120 19:08:46.136715 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jdhj7"
Jan 20 19:08:46 crc kubenswrapper[4773]: I0120 19:08:46.720580 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jdhj7"]
Jan 20 19:08:46 crc kubenswrapper[4773]: I0120 19:08:46.741646 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jdhj7" event={"ID":"e1492d77-23f5-4ed0-9511-e5b4ee1107c7","Type":"ContainerStarted","Data":"592f2227e942f38ab81d1066b0a53c57a17c1f412459f7373db45e35dfd1771d"}
Jan 20 19:08:47 crc kubenswrapper[4773]: I0120 19:08:47.750624 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jdhj7" event={"ID":"e1492d77-23f5-4ed0-9511-e5b4ee1107c7","Type":"ContainerStarted","Data":"22471ae3f8f072cbd2a2544fc9553d7c28a9ea536f169a3ec909534c93bb48ba"}
Jan 20 19:08:47 crc kubenswrapper[4773]: I0120 19:08:47.770542 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jdhj7" podStartSLOduration=2.201763887 podStartE2EDuration="2.770520747s" podCreationTimestamp="2026-01-20 19:08:45 +0000 UTC" firstStartedPulling="2026-01-20 19:08:46.724715946 +0000 UTC m=+2319.646528970" lastFinishedPulling="2026-01-20 19:08:47.293472806 +0000 UTC m=+2320.215285830" observedRunningTime="2026-01-20 19:08:47.767422592 +0000 UTC m=+2320.689235686" watchObservedRunningTime="2026-01-20 19:08:47.770520747 +0000 UTC m=+2320.692333791"
Jan 20 19:08:51 crc kubenswrapper[4773]: I0120 19:08:51.785704 4773 generic.go:334] "Generic (PLEG): container finished" podID="e1492d77-23f5-4ed0-9511-e5b4ee1107c7" containerID="22471ae3f8f072cbd2a2544fc9553d7c28a9ea536f169a3ec909534c93bb48ba" exitCode=0
Jan 20 19:08:51 crc kubenswrapper[4773]: I0120 19:08:51.785781 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jdhj7" event={"ID":"e1492d77-23f5-4ed0-9511-e5b4ee1107c7","Type":"ContainerDied","Data":"22471ae3f8f072cbd2a2544fc9553d7c28a9ea536f169a3ec909534c93bb48ba"}
Jan 20 19:08:53 crc kubenswrapper[4773]: I0120 19:08:53.199738 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jdhj7"
Jan 20 19:08:53 crc kubenswrapper[4773]: I0120 19:08:53.335629 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e1492d77-23f5-4ed0-9511-e5b4ee1107c7-ssh-key-openstack-edpm-ipam\") pod \"e1492d77-23f5-4ed0-9511-e5b4ee1107c7\" (UID: \"e1492d77-23f5-4ed0-9511-e5b4ee1107c7\") "
Jan 20 19:08:53 crc kubenswrapper[4773]: I0120 19:08:53.335803 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e1492d77-23f5-4ed0-9511-e5b4ee1107c7-ceph\") pod \"e1492d77-23f5-4ed0-9511-e5b4ee1107c7\" (UID: \"e1492d77-23f5-4ed0-9511-e5b4ee1107c7\") "
Jan 20 19:08:53 crc kubenswrapper[4773]: I0120 19:08:53.335880 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e1492d77-23f5-4ed0-9511-e5b4ee1107c7-inventory\") pod \"e1492d77-23f5-4ed0-9511-e5b4ee1107c7\" (UID: \"e1492d77-23f5-4ed0-9511-e5b4ee1107c7\") "
Jan 20 19:08:53 crc kubenswrapper[4773]: I0120 19:08:53.335961 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lqff7\" (UniqueName: \"kubernetes.io/projected/e1492d77-23f5-4ed0-9511-e5b4ee1107c7-kube-api-access-lqff7\") pod \"e1492d77-23f5-4ed0-9511-e5b4ee1107c7\" (UID: \"e1492d77-23f5-4ed0-9511-e5b4ee1107c7\") "
Jan 20 19:08:53 crc kubenswrapper[4773]: I0120 19:08:53.341522 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1492d77-23f5-4ed0-9511-e5b4ee1107c7-ceph" (OuterVolumeSpecName: "ceph") pod "e1492d77-23f5-4ed0-9511-e5b4ee1107c7" (UID: "e1492d77-23f5-4ed0-9511-e5b4ee1107c7"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 20 19:08:53 crc kubenswrapper[4773]: I0120 19:08:53.341832 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1492d77-23f5-4ed0-9511-e5b4ee1107c7-kube-api-access-lqff7" (OuterVolumeSpecName: "kube-api-access-lqff7") pod "e1492d77-23f5-4ed0-9511-e5b4ee1107c7" (UID: "e1492d77-23f5-4ed0-9511-e5b4ee1107c7"). InnerVolumeSpecName "kube-api-access-lqff7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 20 19:08:53 crc kubenswrapper[4773]: I0120 19:08:53.365276 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1492d77-23f5-4ed0-9511-e5b4ee1107c7-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "e1492d77-23f5-4ed0-9511-e5b4ee1107c7" (UID: "e1492d77-23f5-4ed0-9511-e5b4ee1107c7"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 20 19:08:53 crc kubenswrapper[4773]: I0120 19:08:53.367141 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1492d77-23f5-4ed0-9511-e5b4ee1107c7-inventory" (OuterVolumeSpecName: "inventory") pod "e1492d77-23f5-4ed0-9511-e5b4ee1107c7" (UID: "e1492d77-23f5-4ed0-9511-e5b4ee1107c7"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 20 19:08:53 crc kubenswrapper[4773]: I0120 19:08:53.438516 4773 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e1492d77-23f5-4ed0-9511-e5b4ee1107c7-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Jan 20 19:08:53 crc kubenswrapper[4773]: I0120 19:08:53.438551 4773 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e1492d77-23f5-4ed0-9511-e5b4ee1107c7-ceph\") on node \"crc\" DevicePath \"\""
Jan 20 19:08:53 crc kubenswrapper[4773]: I0120 19:08:53.438565 4773 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e1492d77-23f5-4ed0-9511-e5b4ee1107c7-inventory\") on node \"crc\" DevicePath \"\""
Jan 20 19:08:53 crc kubenswrapper[4773]: I0120 19:08:53.438578 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lqff7\" (UniqueName: \"kubernetes.io/projected/e1492d77-23f5-4ed0-9511-e5b4ee1107c7-kube-api-access-lqff7\") on node \"crc\" DevicePath \"\""
Jan 20 19:08:53 crc kubenswrapper[4773]: I0120 19:08:53.806923 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jdhj7" event={"ID":"e1492d77-23f5-4ed0-9511-e5b4ee1107c7","Type":"ContainerDied","Data":"592f2227e942f38ab81d1066b0a53c57a17c1f412459f7373db45e35dfd1771d"}
Jan 20 19:08:53 crc kubenswrapper[4773]: I0120 19:08:53.807204 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="592f2227e942f38ab81d1066b0a53c57a17c1f412459f7373db45e35dfd1771d"
Jan 20 19:08:53 crc kubenswrapper[4773]: I0120 19:08:53.807014 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jdhj7" Jan 20 19:08:53 crc kubenswrapper[4773]: I0120 19:08:53.867173 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dr6r5"] Jan 20 19:08:53 crc kubenswrapper[4773]: E0120 19:08:53.867568 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1492d77-23f5-4ed0-9511-e5b4ee1107c7" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Jan 20 19:08:53 crc kubenswrapper[4773]: I0120 19:08:53.867585 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1492d77-23f5-4ed0-9511-e5b4ee1107c7" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Jan 20 19:08:53 crc kubenswrapper[4773]: I0120 19:08:53.867755 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1492d77-23f5-4ed0-9511-e5b4ee1107c7" containerName="ceph-hci-pre-edpm-deployment-openstack-edpm-ipam" Jan 20 19:08:53 crc kubenswrapper[4773]: I0120 19:08:53.869459 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dr6r5" Jan 20 19:08:53 crc kubenswrapper[4773]: I0120 19:08:53.872062 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 20 19:08:53 crc kubenswrapper[4773]: I0120 19:08:53.872102 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 20 19:08:53 crc kubenswrapper[4773]: I0120 19:08:53.872175 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Jan 20 19:08:53 crc kubenswrapper[4773]: I0120 19:08:53.872718 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 20 19:08:53 crc kubenswrapper[4773]: I0120 19:08:53.872759 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rzxsv" Jan 20 19:08:53 crc kubenswrapper[4773]: I0120 19:08:53.882565 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dr6r5"] Jan 20 19:08:54 crc kubenswrapper[4773]: I0120 19:08:54.048854 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b3ce8585-331b-44ef-b8f8-aa5cb3b96589-ceph\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-dr6r5\" (UID: \"b3ce8585-331b-44ef-b8f8-aa5cb3b96589\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dr6r5" Jan 20 19:08:54 crc kubenswrapper[4773]: I0120 19:08:54.049025 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b3ce8585-331b-44ef-b8f8-aa5cb3b96589-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-dr6r5\" (UID: 
\"b3ce8585-331b-44ef-b8f8-aa5cb3b96589\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dr6r5" Jan 20 19:08:54 crc kubenswrapper[4773]: I0120 19:08:54.049211 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b3ce8585-331b-44ef-b8f8-aa5cb3b96589-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-dr6r5\" (UID: \"b3ce8585-331b-44ef-b8f8-aa5cb3b96589\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dr6r5" Jan 20 19:08:54 crc kubenswrapper[4773]: I0120 19:08:54.049293 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-55gfr\" (UniqueName: \"kubernetes.io/projected/b3ce8585-331b-44ef-b8f8-aa5cb3b96589-kube-api-access-55gfr\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-dr6r5\" (UID: \"b3ce8585-331b-44ef-b8f8-aa5cb3b96589\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dr6r5" Jan 20 19:08:54 crc kubenswrapper[4773]: I0120 19:08:54.150965 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b3ce8585-331b-44ef-b8f8-aa5cb3b96589-ceph\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-dr6r5\" (UID: \"b3ce8585-331b-44ef-b8f8-aa5cb3b96589\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dr6r5" Jan 20 19:08:54 crc kubenswrapper[4773]: I0120 19:08:54.151357 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b3ce8585-331b-44ef-b8f8-aa5cb3b96589-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-dr6r5\" (UID: \"b3ce8585-331b-44ef-b8f8-aa5cb3b96589\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dr6r5" Jan 20 19:08:54 crc kubenswrapper[4773]: I0120 19:08:54.151415 4773 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b3ce8585-331b-44ef-b8f8-aa5cb3b96589-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-dr6r5\" (UID: \"b3ce8585-331b-44ef-b8f8-aa5cb3b96589\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dr6r5" Jan 20 19:08:54 crc kubenswrapper[4773]: I0120 19:08:54.151469 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-55gfr\" (UniqueName: \"kubernetes.io/projected/b3ce8585-331b-44ef-b8f8-aa5cb3b96589-kube-api-access-55gfr\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-dr6r5\" (UID: \"b3ce8585-331b-44ef-b8f8-aa5cb3b96589\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dr6r5" Jan 20 19:08:54 crc kubenswrapper[4773]: I0120 19:08:54.157508 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b3ce8585-331b-44ef-b8f8-aa5cb3b96589-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-dr6r5\" (UID: \"b3ce8585-331b-44ef-b8f8-aa5cb3b96589\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dr6r5" Jan 20 19:08:54 crc kubenswrapper[4773]: I0120 19:08:54.160902 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b3ce8585-331b-44ef-b8f8-aa5cb3b96589-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-dr6r5\" (UID: \"b3ce8585-331b-44ef-b8f8-aa5cb3b96589\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dr6r5" Jan 20 19:08:54 crc kubenswrapper[4773]: I0120 19:08:54.165482 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b3ce8585-331b-44ef-b8f8-aa5cb3b96589-ceph\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-dr6r5\" (UID: 
\"b3ce8585-331b-44ef-b8f8-aa5cb3b96589\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dr6r5" Jan 20 19:08:54 crc kubenswrapper[4773]: I0120 19:08:54.168437 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-55gfr\" (UniqueName: \"kubernetes.io/projected/b3ce8585-331b-44ef-b8f8-aa5cb3b96589-kube-api-access-55gfr\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-dr6r5\" (UID: \"b3ce8585-331b-44ef-b8f8-aa5cb3b96589\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dr6r5" Jan 20 19:08:54 crc kubenswrapper[4773]: I0120 19:08:54.195602 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dr6r5" Jan 20 19:08:54 crc kubenswrapper[4773]: I0120 19:08:54.672713 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dr6r5"] Jan 20 19:08:54 crc kubenswrapper[4773]: W0120 19:08:54.688092 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb3ce8585_331b_44ef_b8f8_aa5cb3b96589.slice/crio-65dadb50def57c05ef7360ac8382345ee15f75ad2d6811b352babcf6612d0818 WatchSource:0}: Error finding container 65dadb50def57c05ef7360ac8382345ee15f75ad2d6811b352babcf6612d0818: Status 404 returned error can't find the container with id 65dadb50def57c05ef7360ac8382345ee15f75ad2d6811b352babcf6612d0818 Jan 20 19:08:54 crc kubenswrapper[4773]: I0120 19:08:54.816290 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dr6r5" event={"ID":"b3ce8585-331b-44ef-b8f8-aa5cb3b96589","Type":"ContainerStarted","Data":"65dadb50def57c05ef7360ac8382345ee15f75ad2d6811b352babcf6612d0818"} Jan 20 19:08:55 crc kubenswrapper[4773]: I0120 19:08:55.825770 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dr6r5" event={"ID":"b3ce8585-331b-44ef-b8f8-aa5cb3b96589","Type":"ContainerStarted","Data":"0acc32f3dd00ece65c362ed4f79e75fc710f060db3865b6e446c5263e2e0a67d"} Jan 20 19:08:55 crc kubenswrapper[4773]: I0120 19:08:55.845801 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dr6r5" podStartSLOduration=2.339451274 podStartE2EDuration="2.845784413s" podCreationTimestamp="2026-01-20 19:08:53 +0000 UTC" firstStartedPulling="2026-01-20 19:08:54.693260669 +0000 UTC m=+2327.615073693" lastFinishedPulling="2026-01-20 19:08:55.199593808 +0000 UTC m=+2328.121406832" observedRunningTime="2026-01-20 19:08:55.839618822 +0000 UTC m=+2328.761431866" watchObservedRunningTime="2026-01-20 19:08:55.845784413 +0000 UTC m=+2328.767597437" Jan 20 19:08:58 crc kubenswrapper[4773]: I0120 19:08:58.446846 4773 scope.go:117] "RemoveContainer" containerID="a7dbd0a912d949d6c4eebc0124401dbd7713c051c7fc8f0eefcca46927202be1" Jan 20 19:08:58 crc kubenswrapper[4773]: E0120 19:08:58.447420 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:09:12 crc kubenswrapper[4773]: I0120 19:09:12.447907 4773 scope.go:117] "RemoveContainer" containerID="a7dbd0a912d949d6c4eebc0124401dbd7713c051c7fc8f0eefcca46927202be1" Jan 20 19:09:12 crc kubenswrapper[4773]: E0120 19:09:12.448731 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:09:24 crc kubenswrapper[4773]: I0120 19:09:24.447225 4773 scope.go:117] "RemoveContainer" containerID="a7dbd0a912d949d6c4eebc0124401dbd7713c051c7fc8f0eefcca46927202be1" Jan 20 19:09:24 crc kubenswrapper[4773]: E0120 19:09:24.448073 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:09:36 crc kubenswrapper[4773]: I0120 19:09:36.110368 4773 generic.go:334] "Generic (PLEG): container finished" podID="b3ce8585-331b-44ef-b8f8-aa5cb3b96589" containerID="0acc32f3dd00ece65c362ed4f79e75fc710f060db3865b6e446c5263e2e0a67d" exitCode=0 Jan 20 19:09:36 crc kubenswrapper[4773]: I0120 19:09:36.110444 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dr6r5" event={"ID":"b3ce8585-331b-44ef-b8f8-aa5cb3b96589","Type":"ContainerDied","Data":"0acc32f3dd00ece65c362ed4f79e75fc710f060db3865b6e446c5263e2e0a67d"} Jan 20 19:09:37 crc kubenswrapper[4773]: I0120 19:09:37.541382 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dr6r5" Jan 20 19:09:37 crc kubenswrapper[4773]: I0120 19:09:37.651472 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b3ce8585-331b-44ef-b8f8-aa5cb3b96589-inventory\") pod \"b3ce8585-331b-44ef-b8f8-aa5cb3b96589\" (UID: \"b3ce8585-331b-44ef-b8f8-aa5cb3b96589\") " Jan 20 19:09:37 crc kubenswrapper[4773]: I0120 19:09:37.651713 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-55gfr\" (UniqueName: \"kubernetes.io/projected/b3ce8585-331b-44ef-b8f8-aa5cb3b96589-kube-api-access-55gfr\") pod \"b3ce8585-331b-44ef-b8f8-aa5cb3b96589\" (UID: \"b3ce8585-331b-44ef-b8f8-aa5cb3b96589\") " Jan 20 19:09:37 crc kubenswrapper[4773]: I0120 19:09:37.651759 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b3ce8585-331b-44ef-b8f8-aa5cb3b96589-ssh-key-openstack-edpm-ipam\") pod \"b3ce8585-331b-44ef-b8f8-aa5cb3b96589\" (UID: \"b3ce8585-331b-44ef-b8f8-aa5cb3b96589\") " Jan 20 19:09:37 crc kubenswrapper[4773]: I0120 19:09:37.651798 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b3ce8585-331b-44ef-b8f8-aa5cb3b96589-ceph\") pod \"b3ce8585-331b-44ef-b8f8-aa5cb3b96589\" (UID: \"b3ce8585-331b-44ef-b8f8-aa5cb3b96589\") " Jan 20 19:09:37 crc kubenswrapper[4773]: I0120 19:09:37.663433 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b3ce8585-331b-44ef-b8f8-aa5cb3b96589-kube-api-access-55gfr" (OuterVolumeSpecName: "kube-api-access-55gfr") pod "b3ce8585-331b-44ef-b8f8-aa5cb3b96589" (UID: "b3ce8585-331b-44ef-b8f8-aa5cb3b96589"). InnerVolumeSpecName "kube-api-access-55gfr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 19:09:37 crc kubenswrapper[4773]: I0120 19:09:37.664053 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3ce8585-331b-44ef-b8f8-aa5cb3b96589-ceph" (OuterVolumeSpecName: "ceph") pod "b3ce8585-331b-44ef-b8f8-aa5cb3b96589" (UID: "b3ce8585-331b-44ef-b8f8-aa5cb3b96589"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:09:37 crc kubenswrapper[4773]: I0120 19:09:37.676316 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3ce8585-331b-44ef-b8f8-aa5cb3b96589-inventory" (OuterVolumeSpecName: "inventory") pod "b3ce8585-331b-44ef-b8f8-aa5cb3b96589" (UID: "b3ce8585-331b-44ef-b8f8-aa5cb3b96589"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:09:37 crc kubenswrapper[4773]: I0120 19:09:37.676820 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b3ce8585-331b-44ef-b8f8-aa5cb3b96589-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "b3ce8585-331b-44ef-b8f8-aa5cb3b96589" (UID: "b3ce8585-331b-44ef-b8f8-aa5cb3b96589"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:09:37 crc kubenswrapper[4773]: I0120 19:09:37.753436 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-55gfr\" (UniqueName: \"kubernetes.io/projected/b3ce8585-331b-44ef-b8f8-aa5cb3b96589-kube-api-access-55gfr\") on node \"crc\" DevicePath \"\"" Jan 20 19:09:37 crc kubenswrapper[4773]: I0120 19:09:37.753469 4773 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/b3ce8585-331b-44ef-b8f8-aa5cb3b96589-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 20 19:09:37 crc kubenswrapper[4773]: I0120 19:09:37.753480 4773 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/b3ce8585-331b-44ef-b8f8-aa5cb3b96589-ceph\") on node \"crc\" DevicePath \"\"" Jan 20 19:09:37 crc kubenswrapper[4773]: I0120 19:09:37.753490 4773 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/b3ce8585-331b-44ef-b8f8-aa5cb3b96589-inventory\") on node \"crc\" DevicePath \"\"" Jan 20 19:09:38 crc kubenswrapper[4773]: I0120 19:09:38.127269 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dr6r5" event={"ID":"b3ce8585-331b-44ef-b8f8-aa5cb3b96589","Type":"ContainerDied","Data":"65dadb50def57c05ef7360ac8382345ee15f75ad2d6811b352babcf6612d0818"} Jan 20 19:09:38 crc kubenswrapper[4773]: I0120 19:09:38.127598 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="65dadb50def57c05ef7360ac8382345ee15f75ad2d6811b352babcf6612d0818" Jan 20 19:09:38 crc kubenswrapper[4773]: I0120 19:09:38.127336 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-dr6r5" Jan 20 19:09:38 crc kubenswrapper[4773]: I0120 19:09:38.235334 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-ll277"] Jan 20 19:09:38 crc kubenswrapper[4773]: E0120 19:09:38.235687 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b3ce8585-331b-44ef-b8f8-aa5cb3b96589" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Jan 20 19:09:38 crc kubenswrapper[4773]: I0120 19:09:38.235706 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="b3ce8585-331b-44ef-b8f8-aa5cb3b96589" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Jan 20 19:09:38 crc kubenswrapper[4773]: I0120 19:09:38.235870 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="b3ce8585-331b-44ef-b8f8-aa5cb3b96589" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Jan 20 19:09:38 crc kubenswrapper[4773]: I0120 19:09:38.236435 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-ll277" Jan 20 19:09:38 crc kubenswrapper[4773]: I0120 19:09:38.238540 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 20 19:09:38 crc kubenswrapper[4773]: I0120 19:09:38.238705 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 20 19:09:38 crc kubenswrapper[4773]: I0120 19:09:38.253898 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Jan 20 19:09:38 crc kubenswrapper[4773]: I0120 19:09:38.254252 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rzxsv" Jan 20 19:09:38 crc kubenswrapper[4773]: I0120 19:09:38.255475 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 20 19:09:38 crc kubenswrapper[4773]: I0120 19:09:38.285310 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-ll277"] Jan 20 19:09:38 crc kubenswrapper[4773]: I0120 19:09:38.366520 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/08ee5bdf-bc91-4f34-8459-bc65419f93d7-ceph\") pod \"ssh-known-hosts-edpm-deployment-ll277\" (UID: \"08ee5bdf-bc91-4f34-8459-bc65419f93d7\") " pod="openstack/ssh-known-hosts-edpm-deployment-ll277" Jan 20 19:09:38 crc kubenswrapper[4773]: I0120 19:09:38.366613 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/08ee5bdf-bc91-4f34-8459-bc65419f93d7-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-ll277\" (UID: \"08ee5bdf-bc91-4f34-8459-bc65419f93d7\") " pod="openstack/ssh-known-hosts-edpm-deployment-ll277" Jan 20 19:09:38 crc kubenswrapper[4773]: I0120 
19:09:38.366708 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4q52\" (UniqueName: \"kubernetes.io/projected/08ee5bdf-bc91-4f34-8459-bc65419f93d7-kube-api-access-k4q52\") pod \"ssh-known-hosts-edpm-deployment-ll277\" (UID: \"08ee5bdf-bc91-4f34-8459-bc65419f93d7\") " pod="openstack/ssh-known-hosts-edpm-deployment-ll277" Jan 20 19:09:38 crc kubenswrapper[4773]: I0120 19:09:38.366795 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/08ee5bdf-bc91-4f34-8459-bc65419f93d7-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-ll277\" (UID: \"08ee5bdf-bc91-4f34-8459-bc65419f93d7\") " pod="openstack/ssh-known-hosts-edpm-deployment-ll277" Jan 20 19:09:38 crc kubenswrapper[4773]: I0120 19:09:38.468778 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/08ee5bdf-bc91-4f34-8459-bc65419f93d7-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-ll277\" (UID: \"08ee5bdf-bc91-4f34-8459-bc65419f93d7\") " pod="openstack/ssh-known-hosts-edpm-deployment-ll277" Jan 20 19:09:38 crc kubenswrapper[4773]: I0120 19:09:38.468878 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/08ee5bdf-bc91-4f34-8459-bc65419f93d7-ceph\") pod \"ssh-known-hosts-edpm-deployment-ll277\" (UID: \"08ee5bdf-bc91-4f34-8459-bc65419f93d7\") " pod="openstack/ssh-known-hosts-edpm-deployment-ll277" Jan 20 19:09:38 crc kubenswrapper[4773]: I0120 19:09:38.468992 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/08ee5bdf-bc91-4f34-8459-bc65419f93d7-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-ll277\" (UID: \"08ee5bdf-bc91-4f34-8459-bc65419f93d7\") " 
pod="openstack/ssh-known-hosts-edpm-deployment-ll277" Jan 20 19:09:38 crc kubenswrapper[4773]: I0120 19:09:38.469037 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4q52\" (UniqueName: \"kubernetes.io/projected/08ee5bdf-bc91-4f34-8459-bc65419f93d7-kube-api-access-k4q52\") pod \"ssh-known-hosts-edpm-deployment-ll277\" (UID: \"08ee5bdf-bc91-4f34-8459-bc65419f93d7\") " pod="openstack/ssh-known-hosts-edpm-deployment-ll277" Jan 20 19:09:38 crc kubenswrapper[4773]: I0120 19:09:38.473820 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/08ee5bdf-bc91-4f34-8459-bc65419f93d7-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-ll277\" (UID: \"08ee5bdf-bc91-4f34-8459-bc65419f93d7\") " pod="openstack/ssh-known-hosts-edpm-deployment-ll277" Jan 20 19:09:38 crc kubenswrapper[4773]: I0120 19:09:38.481579 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/08ee5bdf-bc91-4f34-8459-bc65419f93d7-ceph\") pod \"ssh-known-hosts-edpm-deployment-ll277\" (UID: \"08ee5bdf-bc91-4f34-8459-bc65419f93d7\") " pod="openstack/ssh-known-hosts-edpm-deployment-ll277" Jan 20 19:09:38 crc kubenswrapper[4773]: I0120 19:09:38.484563 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/08ee5bdf-bc91-4f34-8459-bc65419f93d7-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-ll277\" (UID: \"08ee5bdf-bc91-4f34-8459-bc65419f93d7\") " pod="openstack/ssh-known-hosts-edpm-deployment-ll277" Jan 20 19:09:38 crc kubenswrapper[4773]: I0120 19:09:38.485247 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4q52\" (UniqueName: \"kubernetes.io/projected/08ee5bdf-bc91-4f34-8459-bc65419f93d7-kube-api-access-k4q52\") pod \"ssh-known-hosts-edpm-deployment-ll277\" (UID: 
\"08ee5bdf-bc91-4f34-8459-bc65419f93d7\") " pod="openstack/ssh-known-hosts-edpm-deployment-ll277" Jan 20 19:09:38 crc kubenswrapper[4773]: I0120 19:09:38.572848 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-ll277" Jan 20 19:09:39 crc kubenswrapper[4773]: I0120 19:09:39.123712 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-ll277"] Jan 20 19:09:39 crc kubenswrapper[4773]: I0120 19:09:39.127611 4773 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 20 19:09:39 crc kubenswrapper[4773]: I0120 19:09:39.136637 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-ll277" event={"ID":"08ee5bdf-bc91-4f34-8459-bc65419f93d7","Type":"ContainerStarted","Data":"d13ed8d40e9582ffcef7fc4c21289af702cbe628dae9f26e58eec56e94666466"} Jan 20 19:09:39 crc kubenswrapper[4773]: I0120 19:09:39.447051 4773 scope.go:117] "RemoveContainer" containerID="a7dbd0a912d949d6c4eebc0124401dbd7713c051c7fc8f0eefcca46927202be1" Jan 20 19:09:39 crc kubenswrapper[4773]: E0120 19:09:39.447319 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:09:41 crc kubenswrapper[4773]: I0120 19:09:41.155173 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-ll277" event={"ID":"08ee5bdf-bc91-4f34-8459-bc65419f93d7","Type":"ContainerStarted","Data":"cfb2353f9c2f582ca875e48ff20e38dfbd1c2f94eb56472b1e9fca947ff25c9e"} Jan 20 19:09:41 crc kubenswrapper[4773]: I0120 
19:09:41.180574 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-ll277" podStartSLOduration=2.649382788 podStartE2EDuration="3.180554343s" podCreationTimestamp="2026-01-20 19:09:38 +0000 UTC" firstStartedPulling="2026-01-20 19:09:39.127404205 +0000 UTC m=+2372.049217219" lastFinishedPulling="2026-01-20 19:09:39.65857576 +0000 UTC m=+2372.580388774" observedRunningTime="2026-01-20 19:09:41.176122657 +0000 UTC m=+2374.097935691" watchObservedRunningTime="2026-01-20 19:09:41.180554343 +0000 UTC m=+2374.102367367" Jan 20 19:09:49 crc kubenswrapper[4773]: I0120 19:09:49.231788 4773 generic.go:334] "Generic (PLEG): container finished" podID="08ee5bdf-bc91-4f34-8459-bc65419f93d7" containerID="cfb2353f9c2f582ca875e48ff20e38dfbd1c2f94eb56472b1e9fca947ff25c9e" exitCode=0 Jan 20 19:09:49 crc kubenswrapper[4773]: I0120 19:09:49.231894 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-ll277" event={"ID":"08ee5bdf-bc91-4f34-8459-bc65419f93d7","Type":"ContainerDied","Data":"cfb2353f9c2f582ca875e48ff20e38dfbd1c2f94eb56472b1e9fca947ff25c9e"} Jan 20 19:09:50 crc kubenswrapper[4773]: I0120 19:09:50.637059 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-ll277" Jan 20 19:09:50 crc kubenswrapper[4773]: I0120 19:09:50.691560 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/08ee5bdf-bc91-4f34-8459-bc65419f93d7-ceph\") pod \"08ee5bdf-bc91-4f34-8459-bc65419f93d7\" (UID: \"08ee5bdf-bc91-4f34-8459-bc65419f93d7\") " Jan 20 19:09:50 crc kubenswrapper[4773]: I0120 19:09:50.691713 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k4q52\" (UniqueName: \"kubernetes.io/projected/08ee5bdf-bc91-4f34-8459-bc65419f93d7-kube-api-access-k4q52\") pod \"08ee5bdf-bc91-4f34-8459-bc65419f93d7\" (UID: \"08ee5bdf-bc91-4f34-8459-bc65419f93d7\") " Jan 20 19:09:50 crc kubenswrapper[4773]: I0120 19:09:50.691863 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/08ee5bdf-bc91-4f34-8459-bc65419f93d7-ssh-key-openstack-edpm-ipam\") pod \"08ee5bdf-bc91-4f34-8459-bc65419f93d7\" (UID: \"08ee5bdf-bc91-4f34-8459-bc65419f93d7\") " Jan 20 19:09:50 crc kubenswrapper[4773]: I0120 19:09:50.691915 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/08ee5bdf-bc91-4f34-8459-bc65419f93d7-inventory-0\") pod \"08ee5bdf-bc91-4f34-8459-bc65419f93d7\" (UID: \"08ee5bdf-bc91-4f34-8459-bc65419f93d7\") " Jan 20 19:09:50 crc kubenswrapper[4773]: I0120 19:09:50.697351 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08ee5bdf-bc91-4f34-8459-bc65419f93d7-kube-api-access-k4q52" (OuterVolumeSpecName: "kube-api-access-k4q52") pod "08ee5bdf-bc91-4f34-8459-bc65419f93d7" (UID: "08ee5bdf-bc91-4f34-8459-bc65419f93d7"). InnerVolumeSpecName "kube-api-access-k4q52". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 19:09:50 crc kubenswrapper[4773]: I0120 19:09:50.698216 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08ee5bdf-bc91-4f34-8459-bc65419f93d7-ceph" (OuterVolumeSpecName: "ceph") pod "08ee5bdf-bc91-4f34-8459-bc65419f93d7" (UID: "08ee5bdf-bc91-4f34-8459-bc65419f93d7"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:09:50 crc kubenswrapper[4773]: I0120 19:09:50.715979 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08ee5bdf-bc91-4f34-8459-bc65419f93d7-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "08ee5bdf-bc91-4f34-8459-bc65419f93d7" (UID: "08ee5bdf-bc91-4f34-8459-bc65419f93d7"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:09:50 crc kubenswrapper[4773]: I0120 19:09:50.720997 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/08ee5bdf-bc91-4f34-8459-bc65419f93d7-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "08ee5bdf-bc91-4f34-8459-bc65419f93d7" (UID: "08ee5bdf-bc91-4f34-8459-bc65419f93d7"). InnerVolumeSpecName "inventory-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:09:50 crc kubenswrapper[4773]: I0120 19:09:50.794198 4773 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/08ee5bdf-bc91-4f34-8459-bc65419f93d7-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 20 19:09:50 crc kubenswrapper[4773]: I0120 19:09:50.794231 4773 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/08ee5bdf-bc91-4f34-8459-bc65419f93d7-inventory-0\") on node \"crc\" DevicePath \"\"" Jan 20 19:09:50 crc kubenswrapper[4773]: I0120 19:09:50.794241 4773 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/08ee5bdf-bc91-4f34-8459-bc65419f93d7-ceph\") on node \"crc\" DevicePath \"\"" Jan 20 19:09:50 crc kubenswrapper[4773]: I0120 19:09:50.794249 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k4q52\" (UniqueName: \"kubernetes.io/projected/08ee5bdf-bc91-4f34-8459-bc65419f93d7-kube-api-access-k4q52\") on node \"crc\" DevicePath \"\"" Jan 20 19:09:51 crc kubenswrapper[4773]: I0120 19:09:51.251297 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-ll277" event={"ID":"08ee5bdf-bc91-4f34-8459-bc65419f93d7","Type":"ContainerDied","Data":"d13ed8d40e9582ffcef7fc4c21289af702cbe628dae9f26e58eec56e94666466"} Jan 20 19:09:51 crc kubenswrapper[4773]: I0120 19:09:51.251368 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d13ed8d40e9582ffcef7fc4c21289af702cbe628dae9f26e58eec56e94666466" Jan 20 19:09:51 crc kubenswrapper[4773]: I0120 19:09:51.251436 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-ll277" Jan 20 19:09:51 crc kubenswrapper[4773]: I0120 19:09:51.323684 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-ppqxn"] Jan 20 19:09:51 crc kubenswrapper[4773]: E0120 19:09:51.324061 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="08ee5bdf-bc91-4f34-8459-bc65419f93d7" containerName="ssh-known-hosts-edpm-deployment" Jan 20 19:09:51 crc kubenswrapper[4773]: I0120 19:09:51.324077 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="08ee5bdf-bc91-4f34-8459-bc65419f93d7" containerName="ssh-known-hosts-edpm-deployment" Jan 20 19:09:51 crc kubenswrapper[4773]: I0120 19:09:51.324227 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="08ee5bdf-bc91-4f34-8459-bc65419f93d7" containerName="ssh-known-hosts-edpm-deployment" Jan 20 19:09:51 crc kubenswrapper[4773]: I0120 19:09:51.324827 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-ppqxn" Jan 20 19:09:51 crc kubenswrapper[4773]: I0120 19:09:51.329496 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Jan 20 19:09:51 crc kubenswrapper[4773]: I0120 19:09:51.329755 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 20 19:09:51 crc kubenswrapper[4773]: I0120 19:09:51.329899 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 20 19:09:51 crc kubenswrapper[4773]: I0120 19:09:51.330160 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 20 19:09:51 crc kubenswrapper[4773]: I0120 19:09:51.330278 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rzxsv" Jan 20 19:09:51 crc kubenswrapper[4773]: I0120 19:09:51.347313 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-ppqxn"] Jan 20 19:09:51 crc kubenswrapper[4773]: I0120 19:09:51.405364 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/617e6a58-e676-42e3-a897-939d9072d030-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-ppqxn\" (UID: \"617e6a58-e676-42e3-a897-939d9072d030\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-ppqxn" Jan 20 19:09:51 crc kubenswrapper[4773]: I0120 19:09:51.405411 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/617e6a58-e676-42e3-a897-939d9072d030-ceph\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-ppqxn\" (UID: \"617e6a58-e676-42e3-a897-939d9072d030\") " 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-ppqxn" Jan 20 19:09:51 crc kubenswrapper[4773]: I0120 19:09:51.405441 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c65cz\" (UniqueName: \"kubernetes.io/projected/617e6a58-e676-42e3-a897-939d9072d030-kube-api-access-c65cz\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-ppqxn\" (UID: \"617e6a58-e676-42e3-a897-939d9072d030\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-ppqxn" Jan 20 19:09:51 crc kubenswrapper[4773]: I0120 19:09:51.405470 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/617e6a58-e676-42e3-a897-939d9072d030-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-ppqxn\" (UID: \"617e6a58-e676-42e3-a897-939d9072d030\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-ppqxn" Jan 20 19:09:51 crc kubenswrapper[4773]: I0120 19:09:51.447013 4773 scope.go:117] "RemoveContainer" containerID="a7dbd0a912d949d6c4eebc0124401dbd7713c051c7fc8f0eefcca46927202be1" Jan 20 19:09:51 crc kubenswrapper[4773]: E0120 19:09:51.447395 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:09:51 crc kubenswrapper[4773]: I0120 19:09:51.507048 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/617e6a58-e676-42e3-a897-939d9072d030-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-ppqxn\" (UID: 
\"617e6a58-e676-42e3-a897-939d9072d030\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-ppqxn" Jan 20 19:09:51 crc kubenswrapper[4773]: I0120 19:09:51.507110 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/617e6a58-e676-42e3-a897-939d9072d030-ceph\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-ppqxn\" (UID: \"617e6a58-e676-42e3-a897-939d9072d030\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-ppqxn" Jan 20 19:09:51 crc kubenswrapper[4773]: I0120 19:09:51.507141 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c65cz\" (UniqueName: \"kubernetes.io/projected/617e6a58-e676-42e3-a897-939d9072d030-kube-api-access-c65cz\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-ppqxn\" (UID: \"617e6a58-e676-42e3-a897-939d9072d030\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-ppqxn" Jan 20 19:09:51 crc kubenswrapper[4773]: I0120 19:09:51.507196 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/617e6a58-e676-42e3-a897-939d9072d030-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-ppqxn\" (UID: \"617e6a58-e676-42e3-a897-939d9072d030\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-ppqxn" Jan 20 19:09:51 crc kubenswrapper[4773]: I0120 19:09:51.511883 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/617e6a58-e676-42e3-a897-939d9072d030-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-ppqxn\" (UID: \"617e6a58-e676-42e3-a897-939d9072d030\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-ppqxn" Jan 20 19:09:51 crc kubenswrapper[4773]: I0120 19:09:51.512697 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/617e6a58-e676-42e3-a897-939d9072d030-ceph\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-ppqxn\" (UID: \"617e6a58-e676-42e3-a897-939d9072d030\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-ppqxn" Jan 20 19:09:51 crc kubenswrapper[4773]: I0120 19:09:51.513792 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/617e6a58-e676-42e3-a897-939d9072d030-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-ppqxn\" (UID: \"617e6a58-e676-42e3-a897-939d9072d030\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-ppqxn" Jan 20 19:09:51 crc kubenswrapper[4773]: I0120 19:09:51.532578 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c65cz\" (UniqueName: \"kubernetes.io/projected/617e6a58-e676-42e3-a897-939d9072d030-kube-api-access-c65cz\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-ppqxn\" (UID: \"617e6a58-e676-42e3-a897-939d9072d030\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-ppqxn" Jan 20 19:09:51 crc kubenswrapper[4773]: I0120 19:09:51.642771 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-ppqxn" Jan 20 19:09:52 crc kubenswrapper[4773]: I0120 19:09:52.138089 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-ppqxn"] Jan 20 19:09:52 crc kubenswrapper[4773]: I0120 19:09:52.258116 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-ppqxn" event={"ID":"617e6a58-e676-42e3-a897-939d9072d030","Type":"ContainerStarted","Data":"950b86a0c5c959fdced87b6cf414f0dc68a82da526f0be6060c93be569f7e0bc"} Jan 20 19:09:53 crc kubenswrapper[4773]: I0120 19:09:53.278492 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-ppqxn" event={"ID":"617e6a58-e676-42e3-a897-939d9072d030","Type":"ContainerStarted","Data":"9311d432c3ea715b7c3cf0d32b5476afd429ad99321b459a1e1b66c47e31fd6e"} Jan 20 19:09:53 crc kubenswrapper[4773]: I0120 19:09:53.298799 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-ppqxn" podStartSLOduration=1.860882224 podStartE2EDuration="2.298778642s" podCreationTimestamp="2026-01-20 19:09:51 +0000 UTC" firstStartedPulling="2026-01-20 19:09:52.143995855 +0000 UTC m=+2385.065808879" lastFinishedPulling="2026-01-20 19:09:52.581892273 +0000 UTC m=+2385.503705297" observedRunningTime="2026-01-20 19:09:53.296920738 +0000 UTC m=+2386.218733782" watchObservedRunningTime="2026-01-20 19:09:53.298778642 +0000 UTC m=+2386.220591686" Jan 20 19:10:00 crc kubenswrapper[4773]: I0120 19:10:00.330797 4773 generic.go:334] "Generic (PLEG): container finished" podID="617e6a58-e676-42e3-a897-939d9072d030" containerID="9311d432c3ea715b7c3cf0d32b5476afd429ad99321b459a1e1b66c47e31fd6e" exitCode=0 Jan 20 19:10:00 crc kubenswrapper[4773]: I0120 19:10:00.330883 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-ppqxn" event={"ID":"617e6a58-e676-42e3-a897-939d9072d030","Type":"ContainerDied","Data":"9311d432c3ea715b7c3cf0d32b5476afd429ad99321b459a1e1b66c47e31fd6e"} Jan 20 19:10:01 crc kubenswrapper[4773]: I0120 19:10:01.726503 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-ppqxn" Jan 20 19:10:01 crc kubenswrapper[4773]: I0120 19:10:01.900770 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/617e6a58-e676-42e3-a897-939d9072d030-ceph\") pod \"617e6a58-e676-42e3-a897-939d9072d030\" (UID: \"617e6a58-e676-42e3-a897-939d9072d030\") " Jan 20 19:10:01 crc kubenswrapper[4773]: I0120 19:10:01.900849 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/617e6a58-e676-42e3-a897-939d9072d030-inventory\") pod \"617e6a58-e676-42e3-a897-939d9072d030\" (UID: \"617e6a58-e676-42e3-a897-939d9072d030\") " Jan 20 19:10:01 crc kubenswrapper[4773]: I0120 19:10:01.901005 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/617e6a58-e676-42e3-a897-939d9072d030-ssh-key-openstack-edpm-ipam\") pod \"617e6a58-e676-42e3-a897-939d9072d030\" (UID: \"617e6a58-e676-42e3-a897-939d9072d030\") " Jan 20 19:10:01 crc kubenswrapper[4773]: I0120 19:10:01.901972 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c65cz\" (UniqueName: \"kubernetes.io/projected/617e6a58-e676-42e3-a897-939d9072d030-kube-api-access-c65cz\") pod \"617e6a58-e676-42e3-a897-939d9072d030\" (UID: \"617e6a58-e676-42e3-a897-939d9072d030\") " Jan 20 19:10:01 crc kubenswrapper[4773]: I0120 19:10:01.907093 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/617e6a58-e676-42e3-a897-939d9072d030-ceph" (OuterVolumeSpecName: "ceph") pod "617e6a58-e676-42e3-a897-939d9072d030" (UID: "617e6a58-e676-42e3-a897-939d9072d030"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:10:01 crc kubenswrapper[4773]: I0120 19:10:01.907156 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/617e6a58-e676-42e3-a897-939d9072d030-kube-api-access-c65cz" (OuterVolumeSpecName: "kube-api-access-c65cz") pod "617e6a58-e676-42e3-a897-939d9072d030" (UID: "617e6a58-e676-42e3-a897-939d9072d030"). InnerVolumeSpecName "kube-api-access-c65cz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 19:10:01 crc kubenswrapper[4773]: I0120 19:10:01.926436 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/617e6a58-e676-42e3-a897-939d9072d030-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "617e6a58-e676-42e3-a897-939d9072d030" (UID: "617e6a58-e676-42e3-a897-939d9072d030"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:10:01 crc kubenswrapper[4773]: I0120 19:10:01.937109 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/617e6a58-e676-42e3-a897-939d9072d030-inventory" (OuterVolumeSpecName: "inventory") pod "617e6a58-e676-42e3-a897-939d9072d030" (UID: "617e6a58-e676-42e3-a897-939d9072d030"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:10:02 crc kubenswrapper[4773]: I0120 19:10:02.004437 4773 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/617e6a58-e676-42e3-a897-939d9072d030-ceph\") on node \"crc\" DevicePath \"\"" Jan 20 19:10:02 crc kubenswrapper[4773]: I0120 19:10:02.004471 4773 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/617e6a58-e676-42e3-a897-939d9072d030-inventory\") on node \"crc\" DevicePath \"\"" Jan 20 19:10:02 crc kubenswrapper[4773]: I0120 19:10:02.004483 4773 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/617e6a58-e676-42e3-a897-939d9072d030-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 20 19:10:02 crc kubenswrapper[4773]: I0120 19:10:02.004494 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c65cz\" (UniqueName: \"kubernetes.io/projected/617e6a58-e676-42e3-a897-939d9072d030-kube-api-access-c65cz\") on node \"crc\" DevicePath \"\"" Jan 20 19:10:02 crc kubenswrapper[4773]: I0120 19:10:02.347073 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-ppqxn" event={"ID":"617e6a58-e676-42e3-a897-939d9072d030","Type":"ContainerDied","Data":"950b86a0c5c959fdced87b6cf414f0dc68a82da526f0be6060c93be569f7e0bc"} Jan 20 19:10:02 crc kubenswrapper[4773]: I0120 19:10:02.347119 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="950b86a0c5c959fdced87b6cf414f0dc68a82da526f0be6060c93be569f7e0bc" Jan 20 19:10:02 crc kubenswrapper[4773]: I0120 19:10:02.347150 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-ppqxn" Jan 20 19:10:02 crc kubenswrapper[4773]: I0120 19:10:02.423830 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-v4pst"] Jan 20 19:10:02 crc kubenswrapper[4773]: E0120 19:10:02.424260 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="617e6a58-e676-42e3-a897-939d9072d030" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Jan 20 19:10:02 crc kubenswrapper[4773]: I0120 19:10:02.424281 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="617e6a58-e676-42e3-a897-939d9072d030" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Jan 20 19:10:02 crc kubenswrapper[4773]: I0120 19:10:02.424451 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="617e6a58-e676-42e3-a897-939d9072d030" containerName="run-os-edpm-deployment-openstack-edpm-ipam" Jan 20 19:10:02 crc kubenswrapper[4773]: I0120 19:10:02.425024 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-v4pst" Jan 20 19:10:02 crc kubenswrapper[4773]: I0120 19:10:02.426979 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Jan 20 19:10:02 crc kubenswrapper[4773]: I0120 19:10:02.427005 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rzxsv" Jan 20 19:10:02 crc kubenswrapper[4773]: I0120 19:10:02.427529 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 20 19:10:02 crc kubenswrapper[4773]: I0120 19:10:02.427657 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 20 19:10:02 crc kubenswrapper[4773]: I0120 19:10:02.427732 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 20 19:10:02 crc kubenswrapper[4773]: I0120 19:10:02.444619 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-v4pst"] Jan 20 19:10:02 crc kubenswrapper[4773]: I0120 19:10:02.617713 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f6ebb133-0720-46a2-9da3-ec9dc396266b-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-v4pst\" (UID: \"f6ebb133-0720-46a2-9da3-ec9dc396266b\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-v4pst" Jan 20 19:10:02 crc kubenswrapper[4773]: I0120 19:10:02.618079 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6mvt\" (UniqueName: \"kubernetes.io/projected/f6ebb133-0720-46a2-9da3-ec9dc396266b-kube-api-access-w6mvt\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-v4pst\" (UID: \"f6ebb133-0720-46a2-9da3-ec9dc396266b\") " 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-v4pst" Jan 20 19:10:02 crc kubenswrapper[4773]: I0120 19:10:02.618159 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f6ebb133-0720-46a2-9da3-ec9dc396266b-ceph\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-v4pst\" (UID: \"f6ebb133-0720-46a2-9da3-ec9dc396266b\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-v4pst" Jan 20 19:10:02 crc kubenswrapper[4773]: I0120 19:10:02.618892 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f6ebb133-0720-46a2-9da3-ec9dc396266b-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-v4pst\" (UID: \"f6ebb133-0720-46a2-9da3-ec9dc396266b\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-v4pst" Jan 20 19:10:02 crc kubenswrapper[4773]: I0120 19:10:02.720415 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f6ebb133-0720-46a2-9da3-ec9dc396266b-ceph\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-v4pst\" (UID: \"f6ebb133-0720-46a2-9da3-ec9dc396266b\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-v4pst" Jan 20 19:10:02 crc kubenswrapper[4773]: I0120 19:10:02.720497 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f6ebb133-0720-46a2-9da3-ec9dc396266b-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-v4pst\" (UID: \"f6ebb133-0720-46a2-9da3-ec9dc396266b\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-v4pst" Jan 20 19:10:02 crc kubenswrapper[4773]: I0120 19:10:02.720583 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"inventory\" (UniqueName: \"kubernetes.io/secret/f6ebb133-0720-46a2-9da3-ec9dc396266b-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-v4pst\" (UID: \"f6ebb133-0720-46a2-9da3-ec9dc396266b\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-v4pst" Jan 20 19:10:02 crc kubenswrapper[4773]: I0120 19:10:02.720605 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6mvt\" (UniqueName: \"kubernetes.io/projected/f6ebb133-0720-46a2-9da3-ec9dc396266b-kube-api-access-w6mvt\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-v4pst\" (UID: \"f6ebb133-0720-46a2-9da3-ec9dc396266b\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-v4pst" Jan 20 19:10:02 crc kubenswrapper[4773]: I0120 19:10:02.726897 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f6ebb133-0720-46a2-9da3-ec9dc396266b-ceph\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-v4pst\" (UID: \"f6ebb133-0720-46a2-9da3-ec9dc396266b\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-v4pst" Jan 20 19:10:02 crc kubenswrapper[4773]: I0120 19:10:02.728463 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f6ebb133-0720-46a2-9da3-ec9dc396266b-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-v4pst\" (UID: \"f6ebb133-0720-46a2-9da3-ec9dc396266b\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-v4pst" Jan 20 19:10:02 crc kubenswrapper[4773]: I0120 19:10:02.737037 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f6ebb133-0720-46a2-9da3-ec9dc396266b-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-v4pst\" (UID: \"f6ebb133-0720-46a2-9da3-ec9dc396266b\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-v4pst" Jan 
20 19:10:02 crc kubenswrapper[4773]: I0120 19:10:02.739817 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6mvt\" (UniqueName: \"kubernetes.io/projected/f6ebb133-0720-46a2-9da3-ec9dc396266b-kube-api-access-w6mvt\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-v4pst\" (UID: \"f6ebb133-0720-46a2-9da3-ec9dc396266b\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-v4pst" Jan 20 19:10:02 crc kubenswrapper[4773]: I0120 19:10:02.743776 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-v4pst" Jan 20 19:10:03 crc kubenswrapper[4773]: I0120 19:10:03.059680 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-v4pst"] Jan 20 19:10:03 crc kubenswrapper[4773]: I0120 19:10:03.356290 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-v4pst" event={"ID":"f6ebb133-0720-46a2-9da3-ec9dc396266b","Type":"ContainerStarted","Data":"5bbd0e74acf506ed6622468cbf7e5b29c4130fa256991e1e438571cadcfc1e49"} Jan 20 19:10:03 crc kubenswrapper[4773]: I0120 19:10:03.448436 4773 scope.go:117] "RemoveContainer" containerID="a7dbd0a912d949d6c4eebc0124401dbd7713c051c7fc8f0eefcca46927202be1" Jan 20 19:10:03 crc kubenswrapper[4773]: E0120 19:10:03.449198 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:10:04 crc kubenswrapper[4773]: I0120 19:10:04.364089 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-v4pst" event={"ID":"f6ebb133-0720-46a2-9da3-ec9dc396266b","Type":"ContainerStarted","Data":"4ba0ca9d5878b3eca52770560fb6bc15d84a62adc1124ab29080ea9ae4ab4ba1"} Jan 20 19:10:04 crc kubenswrapper[4773]: I0120 19:10:04.384421 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-v4pst" podStartSLOduration=1.900983171 podStartE2EDuration="2.384400893s" podCreationTimestamp="2026-01-20 19:10:02 +0000 UTC" firstStartedPulling="2026-01-20 19:10:03.063208064 +0000 UTC m=+2395.985021088" lastFinishedPulling="2026-01-20 19:10:03.546625786 +0000 UTC m=+2396.468438810" observedRunningTime="2026-01-20 19:10:04.382239291 +0000 UTC m=+2397.304052315" watchObservedRunningTime="2026-01-20 19:10:04.384400893 +0000 UTC m=+2397.306213917" Jan 20 19:10:13 crc kubenswrapper[4773]: I0120 19:10:13.436345 4773 generic.go:334] "Generic (PLEG): container finished" podID="f6ebb133-0720-46a2-9da3-ec9dc396266b" containerID="4ba0ca9d5878b3eca52770560fb6bc15d84a62adc1124ab29080ea9ae4ab4ba1" exitCode=0 Jan 20 19:10:13 crc kubenswrapper[4773]: I0120 19:10:13.436420 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-v4pst" event={"ID":"f6ebb133-0720-46a2-9da3-ec9dc396266b","Type":"ContainerDied","Data":"4ba0ca9d5878b3eca52770560fb6bc15d84a62adc1124ab29080ea9ae4ab4ba1"} Jan 20 19:10:14 crc kubenswrapper[4773]: I0120 19:10:14.843615 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-v4pst" Jan 20 19:10:14 crc kubenswrapper[4773]: I0120 19:10:14.938591 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f6ebb133-0720-46a2-9da3-ec9dc396266b-ceph\") pod \"f6ebb133-0720-46a2-9da3-ec9dc396266b\" (UID: \"f6ebb133-0720-46a2-9da3-ec9dc396266b\") " Jan 20 19:10:14 crc kubenswrapper[4773]: I0120 19:10:14.938721 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f6ebb133-0720-46a2-9da3-ec9dc396266b-ssh-key-openstack-edpm-ipam\") pod \"f6ebb133-0720-46a2-9da3-ec9dc396266b\" (UID: \"f6ebb133-0720-46a2-9da3-ec9dc396266b\") " Jan 20 19:10:14 crc kubenswrapper[4773]: I0120 19:10:14.938842 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w6mvt\" (UniqueName: \"kubernetes.io/projected/f6ebb133-0720-46a2-9da3-ec9dc396266b-kube-api-access-w6mvt\") pod \"f6ebb133-0720-46a2-9da3-ec9dc396266b\" (UID: \"f6ebb133-0720-46a2-9da3-ec9dc396266b\") " Jan 20 19:10:14 crc kubenswrapper[4773]: I0120 19:10:14.938977 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f6ebb133-0720-46a2-9da3-ec9dc396266b-inventory\") pod \"f6ebb133-0720-46a2-9da3-ec9dc396266b\" (UID: \"f6ebb133-0720-46a2-9da3-ec9dc396266b\") " Jan 20 19:10:14 crc kubenswrapper[4773]: I0120 19:10:14.945184 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6ebb133-0720-46a2-9da3-ec9dc396266b-ceph" (OuterVolumeSpecName: "ceph") pod "f6ebb133-0720-46a2-9da3-ec9dc396266b" (UID: "f6ebb133-0720-46a2-9da3-ec9dc396266b"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:10:14 crc kubenswrapper[4773]: I0120 19:10:14.949121 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6ebb133-0720-46a2-9da3-ec9dc396266b-kube-api-access-w6mvt" (OuterVolumeSpecName: "kube-api-access-w6mvt") pod "f6ebb133-0720-46a2-9da3-ec9dc396266b" (UID: "f6ebb133-0720-46a2-9da3-ec9dc396266b"). InnerVolumeSpecName "kube-api-access-w6mvt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 19:10:14 crc kubenswrapper[4773]: I0120 19:10:14.963250 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6ebb133-0720-46a2-9da3-ec9dc396266b-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "f6ebb133-0720-46a2-9da3-ec9dc396266b" (UID: "f6ebb133-0720-46a2-9da3-ec9dc396266b"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:10:14 crc kubenswrapper[4773]: I0120 19:10:14.965378 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6ebb133-0720-46a2-9da3-ec9dc396266b-inventory" (OuterVolumeSpecName: "inventory") pod "f6ebb133-0720-46a2-9da3-ec9dc396266b" (UID: "f6ebb133-0720-46a2-9da3-ec9dc396266b"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.040203 4773 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f6ebb133-0720-46a2-9da3-ec9dc396266b-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.040244 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w6mvt\" (UniqueName: \"kubernetes.io/projected/f6ebb133-0720-46a2-9da3-ec9dc396266b-kube-api-access-w6mvt\") on node \"crc\" DevicePath \"\"" Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.040257 4773 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f6ebb133-0720-46a2-9da3-ec9dc396266b-inventory\") on node \"crc\" DevicePath \"\"" Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.040268 4773 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f6ebb133-0720-46a2-9da3-ec9dc396266b-ceph\") on node \"crc\" DevicePath \"\"" Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.447179 4773 scope.go:117] "RemoveContainer" containerID="a7dbd0a912d949d6c4eebc0124401dbd7713c051c7fc8f0eefcca46927202be1" Jan 20 19:10:15 crc kubenswrapper[4773]: E0120 19:10:15.447512 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.456620 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-v4pst" Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.457387 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-v4pst" event={"ID":"f6ebb133-0720-46a2-9da3-ec9dc396266b","Type":"ContainerDied","Data":"5bbd0e74acf506ed6622468cbf7e5b29c4130fa256991e1e438571cadcfc1e49"} Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.457431 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5bbd0e74acf506ed6622468cbf7e5b29c4130fa256991e1e438571cadcfc1e49" Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.529618 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lshfv"] Jan 20 19:10:15 crc kubenswrapper[4773]: E0120 19:10:15.530098 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6ebb133-0720-46a2-9da3-ec9dc396266b" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.530124 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6ebb133-0720-46a2-9da3-ec9dc396266b" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.530334 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6ebb133-0720-46a2-9da3-ec9dc396266b" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam" Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.531371 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lshfv" Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.536655 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.536791 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rzxsv" Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.536867 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0" Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.537154 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.537301 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.537340 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0" Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.537777 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0" Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.542216 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.545955 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lshfv"] Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.549209 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a459169f-671f-4dd7-96d3-019d59bd14c6-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lshfv\" (UID: \"a459169f-671f-4dd7-96d3-019d59bd14c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lshfv" Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.549784 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/a459169f-671f-4dd7-96d3-019d59bd14c6-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lshfv\" (UID: \"a459169f-671f-4dd7-96d3-019d59bd14c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lshfv" Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.549851 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a459169f-671f-4dd7-96d3-019d59bd14c6-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lshfv\" (UID: \"a459169f-671f-4dd7-96d3-019d59bd14c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lshfv" Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.549894 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a459169f-671f-4dd7-96d3-019d59bd14c6-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lshfv\" (UID: \"a459169f-671f-4dd7-96d3-019d59bd14c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lshfv" Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.549956 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/a459169f-671f-4dd7-96d3-019d59bd14c6-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lshfv\" (UID: \"a459169f-671f-4dd7-96d3-019d59bd14c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lshfv" Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.550037 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a459169f-671f-4dd7-96d3-019d59bd14c6-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lshfv\" (UID: \"a459169f-671f-4dd7-96d3-019d59bd14c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lshfv" Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.550067 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a459169f-671f-4dd7-96d3-019d59bd14c6-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lshfv\" (UID: \"a459169f-671f-4dd7-96d3-019d59bd14c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lshfv" Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.550103 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlzzc\" (UniqueName: \"kubernetes.io/projected/a459169f-671f-4dd7-96d3-019d59bd14c6-kube-api-access-rlzzc\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lshfv\" (UID: \"a459169f-671f-4dd7-96d3-019d59bd14c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lshfv" Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.550135 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a459169f-671f-4dd7-96d3-019d59bd14c6-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lshfv\" (UID: \"a459169f-671f-4dd7-96d3-019d59bd14c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lshfv" Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.550187 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a459169f-671f-4dd7-96d3-019d59bd14c6-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lshfv\" (UID: \"a459169f-671f-4dd7-96d3-019d59bd14c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lshfv" Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.550441 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a459169f-671f-4dd7-96d3-019d59bd14c6-ceph\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lshfv\" (UID: \"a459169f-671f-4dd7-96d3-019d59bd14c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lshfv" Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.550478 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a459169f-671f-4dd7-96d3-019d59bd14c6-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lshfv\" (UID: \"a459169f-671f-4dd7-96d3-019d59bd14c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lshfv" Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.550537 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/a459169f-671f-4dd7-96d3-019d59bd14c6-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lshfv\" (UID: \"a459169f-671f-4dd7-96d3-019d59bd14c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lshfv" Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.651016 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a459169f-671f-4dd7-96d3-019d59bd14c6-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lshfv\" (UID: \"a459169f-671f-4dd7-96d3-019d59bd14c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lshfv" Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.651312 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a459169f-671f-4dd7-96d3-019d59bd14c6-ceph\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lshfv\" (UID: \"a459169f-671f-4dd7-96d3-019d59bd14c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lshfv" Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.651337 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a459169f-671f-4dd7-96d3-019d59bd14c6-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lshfv\" (UID: \"a459169f-671f-4dd7-96d3-019d59bd14c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lshfv" Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.651370 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/a459169f-671f-4dd7-96d3-019d59bd14c6-openstack-edpm-ipam-libvirt-default-certs-0\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-lshfv\" (UID: \"a459169f-671f-4dd7-96d3-019d59bd14c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lshfv" Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.651396 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a459169f-671f-4dd7-96d3-019d59bd14c6-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lshfv\" (UID: \"a459169f-671f-4dd7-96d3-019d59bd14c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lshfv" Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.651427 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/a459169f-671f-4dd7-96d3-019d59bd14c6-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lshfv\" (UID: \"a459169f-671f-4dd7-96d3-019d59bd14c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lshfv" Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.651461 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a459169f-671f-4dd7-96d3-019d59bd14c6-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lshfv\" (UID: \"a459169f-671f-4dd7-96d3-019d59bd14c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lshfv" Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.651478 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a459169f-671f-4dd7-96d3-019d59bd14c6-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lshfv\" (UID: \"a459169f-671f-4dd7-96d3-019d59bd14c6\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lshfv" Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.651501 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/a459169f-671f-4dd7-96d3-019d59bd14c6-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lshfv\" (UID: \"a459169f-671f-4dd7-96d3-019d59bd14c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lshfv" Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.651535 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a459169f-671f-4dd7-96d3-019d59bd14c6-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lshfv\" (UID: \"a459169f-671f-4dd7-96d3-019d59bd14c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lshfv" Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.651554 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a459169f-671f-4dd7-96d3-019d59bd14c6-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lshfv\" (UID: \"a459169f-671f-4dd7-96d3-019d59bd14c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lshfv" Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.651576 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rlzzc\" (UniqueName: \"kubernetes.io/projected/a459169f-671f-4dd7-96d3-019d59bd14c6-kube-api-access-rlzzc\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lshfv\" (UID: \"a459169f-671f-4dd7-96d3-019d59bd14c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lshfv" Jan 
20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.651595 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a459169f-671f-4dd7-96d3-019d59bd14c6-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lshfv\" (UID: \"a459169f-671f-4dd7-96d3-019d59bd14c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lshfv" Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.656261 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/a459169f-671f-4dd7-96d3-019d59bd14c6-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lshfv\" (UID: \"a459169f-671f-4dd7-96d3-019d59bd14c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lshfv" Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.656441 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a459169f-671f-4dd7-96d3-019d59bd14c6-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lshfv\" (UID: \"a459169f-671f-4dd7-96d3-019d59bd14c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lshfv" Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.658430 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a459169f-671f-4dd7-96d3-019d59bd14c6-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lshfv\" (UID: \"a459169f-671f-4dd7-96d3-019d59bd14c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lshfv" Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.658719 4773 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/a459169f-671f-4dd7-96d3-019d59bd14c6-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lshfv\" (UID: \"a459169f-671f-4dd7-96d3-019d59bd14c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lshfv" Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.658763 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a459169f-671f-4dd7-96d3-019d59bd14c6-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lshfv\" (UID: \"a459169f-671f-4dd7-96d3-019d59bd14c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lshfv" Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.658906 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a459169f-671f-4dd7-96d3-019d59bd14c6-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lshfv\" (UID: \"a459169f-671f-4dd7-96d3-019d59bd14c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lshfv" Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.659673 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a459169f-671f-4dd7-96d3-019d59bd14c6-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lshfv\" (UID: \"a459169f-671f-4dd7-96d3-019d59bd14c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lshfv" Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.659870 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a459169f-671f-4dd7-96d3-019d59bd14c6-nova-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-lshfv\" (UID: \"a459169f-671f-4dd7-96d3-019d59bd14c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lshfv" Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.660723 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a459169f-671f-4dd7-96d3-019d59bd14c6-ceph\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lshfv\" (UID: \"a459169f-671f-4dd7-96d3-019d59bd14c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lshfv" Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.660993 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/a459169f-671f-4dd7-96d3-019d59bd14c6-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lshfv\" (UID: \"a459169f-671f-4dd7-96d3-019d59bd14c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lshfv" Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.667609 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a459169f-671f-4dd7-96d3-019d59bd14c6-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lshfv\" (UID: \"a459169f-671f-4dd7-96d3-019d59bd14c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lshfv" Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.668713 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a459169f-671f-4dd7-96d3-019d59bd14c6-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lshfv\" (UID: \"a459169f-671f-4dd7-96d3-019d59bd14c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lshfv" Jan 
20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.671708 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlzzc\" (UniqueName: \"kubernetes.io/projected/a459169f-671f-4dd7-96d3-019d59bd14c6-kube-api-access-rlzzc\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-lshfv\" (UID: \"a459169f-671f-4dd7-96d3-019d59bd14c6\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lshfv" Jan 20 19:10:15 crc kubenswrapper[4773]: I0120 19:10:15.882379 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lshfv" Jan 20 19:10:16 crc kubenswrapper[4773]: I0120 19:10:16.387709 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lshfv"] Jan 20 19:10:16 crc kubenswrapper[4773]: W0120 19:10:16.391287 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda459169f_671f_4dd7_96d3_019d59bd14c6.slice/crio-952d6a53b5864fcd8e00aada5b3728e885a330cc4445aec7a3d1e6170b3aaa83 WatchSource:0}: Error finding container 952d6a53b5864fcd8e00aada5b3728e885a330cc4445aec7a3d1e6170b3aaa83: Status 404 returned error can't find the container with id 952d6a53b5864fcd8e00aada5b3728e885a330cc4445aec7a3d1e6170b3aaa83 Jan 20 19:10:16 crc kubenswrapper[4773]: I0120 19:10:16.468154 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lshfv" event={"ID":"a459169f-671f-4dd7-96d3-019d59bd14c6","Type":"ContainerStarted","Data":"952d6a53b5864fcd8e00aada5b3728e885a330cc4445aec7a3d1e6170b3aaa83"} Jan 20 19:10:17 crc kubenswrapper[4773]: I0120 19:10:17.480335 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lshfv" 
event={"ID":"a459169f-671f-4dd7-96d3-019d59bd14c6","Type":"ContainerStarted","Data":"641986abc5e040590e46fd96a9229415224b670542d1c97de29d2e30b83e09b4"} Jan 20 19:10:17 crc kubenswrapper[4773]: I0120 19:10:17.507781 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lshfv" podStartSLOduration=1.964709439 podStartE2EDuration="2.507761177s" podCreationTimestamp="2026-01-20 19:10:15 +0000 UTC" firstStartedPulling="2026-01-20 19:10:16.394515556 +0000 UTC m=+2409.316328580" lastFinishedPulling="2026-01-20 19:10:16.937567294 +0000 UTC m=+2409.859380318" observedRunningTime="2026-01-20 19:10:17.500893261 +0000 UTC m=+2410.422706285" watchObservedRunningTime="2026-01-20 19:10:17.507761177 +0000 UTC m=+2410.429574201" Jan 20 19:10:28 crc kubenswrapper[4773]: I0120 19:10:28.447720 4773 scope.go:117] "RemoveContainer" containerID="a7dbd0a912d949d6c4eebc0124401dbd7713c051c7fc8f0eefcca46927202be1" Jan 20 19:10:28 crc kubenswrapper[4773]: E0120 19:10:28.448614 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:10:42 crc kubenswrapper[4773]: I0120 19:10:42.447263 4773 scope.go:117] "RemoveContainer" containerID="a7dbd0a912d949d6c4eebc0124401dbd7713c051c7fc8f0eefcca46927202be1" Jan 20 19:10:42 crc kubenswrapper[4773]: E0120 19:10:42.448188 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:10:45 crc kubenswrapper[4773]: I0120 19:10:45.709174 4773 generic.go:334] "Generic (PLEG): container finished" podID="a459169f-671f-4dd7-96d3-019d59bd14c6" containerID="641986abc5e040590e46fd96a9229415224b670542d1c97de29d2e30b83e09b4" exitCode=0 Jan 20 19:10:45 crc kubenswrapper[4773]: I0120 19:10:45.709271 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lshfv" event={"ID":"a459169f-671f-4dd7-96d3-019d59bd14c6","Type":"ContainerDied","Data":"641986abc5e040590e46fd96a9229415224b670542d1c97de29d2e30b83e09b4"} Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.156565 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lshfv" Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.341124 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rlzzc\" (UniqueName: \"kubernetes.io/projected/a459169f-671f-4dd7-96d3-019d59bd14c6-kube-api-access-rlzzc\") pod \"a459169f-671f-4dd7-96d3-019d59bd14c6\" (UID: \"a459169f-671f-4dd7-96d3-019d59bd14c6\") " Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.341165 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/a459169f-671f-4dd7-96d3-019d59bd14c6-openstack-edpm-ipam-ovn-default-certs-0\") pod \"a459169f-671f-4dd7-96d3-019d59bd14c6\" (UID: \"a459169f-671f-4dd7-96d3-019d59bd14c6\") " Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.341197 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/a459169f-671f-4dd7-96d3-019d59bd14c6-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"a459169f-671f-4dd7-96d3-019d59bd14c6\" (UID: \"a459169f-671f-4dd7-96d3-019d59bd14c6\") " Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.341227 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a459169f-671f-4dd7-96d3-019d59bd14c6-nova-combined-ca-bundle\") pod \"a459169f-671f-4dd7-96d3-019d59bd14c6\" (UID: \"a459169f-671f-4dd7-96d3-019d59bd14c6\") " Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.341247 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a459169f-671f-4dd7-96d3-019d59bd14c6-libvirt-combined-ca-bundle\") pod \"a459169f-671f-4dd7-96d3-019d59bd14c6\" (UID: \"a459169f-671f-4dd7-96d3-019d59bd14c6\") " Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.341272 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a459169f-671f-4dd7-96d3-019d59bd14c6-inventory\") pod \"a459169f-671f-4dd7-96d3-019d59bd14c6\" (UID: \"a459169f-671f-4dd7-96d3-019d59bd14c6\") " Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.341303 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a459169f-671f-4dd7-96d3-019d59bd14c6-bootstrap-combined-ca-bundle\") pod \"a459169f-671f-4dd7-96d3-019d59bd14c6\" (UID: \"a459169f-671f-4dd7-96d3-019d59bd14c6\") " Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.341333 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/a459169f-671f-4dd7-96d3-019d59bd14c6-ssh-key-openstack-edpm-ipam\") pod \"a459169f-671f-4dd7-96d3-019d59bd14c6\" (UID: \"a459169f-671f-4dd7-96d3-019d59bd14c6\") " Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.341355 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a459169f-671f-4dd7-96d3-019d59bd14c6-ovn-combined-ca-bundle\") pod \"a459169f-671f-4dd7-96d3-019d59bd14c6\" (UID: \"a459169f-671f-4dd7-96d3-019d59bd14c6\") " Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.341375 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a459169f-671f-4dd7-96d3-019d59bd14c6-repo-setup-combined-ca-bundle\") pod \"a459169f-671f-4dd7-96d3-019d59bd14c6\" (UID: \"a459169f-671f-4dd7-96d3-019d59bd14c6\") " Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.341394 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/a459169f-671f-4dd7-96d3-019d59bd14c6-ceph\") pod \"a459169f-671f-4dd7-96d3-019d59bd14c6\" (UID: \"a459169f-671f-4dd7-96d3-019d59bd14c6\") " Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.341416 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a459169f-671f-4dd7-96d3-019d59bd14c6-neutron-metadata-combined-ca-bundle\") pod \"a459169f-671f-4dd7-96d3-019d59bd14c6\" (UID: \"a459169f-671f-4dd7-96d3-019d59bd14c6\") " Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.341461 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/a459169f-671f-4dd7-96d3-019d59bd14c6-openstack-edpm-ipam-libvirt-default-certs-0\") pod 
\"a459169f-671f-4dd7-96d3-019d59bd14c6\" (UID: \"a459169f-671f-4dd7-96d3-019d59bd14c6\") " Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.347768 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a459169f-671f-4dd7-96d3-019d59bd14c6-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "a459169f-671f-4dd7-96d3-019d59bd14c6" (UID: "a459169f-671f-4dd7-96d3-019d59bd14c6"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.347997 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a459169f-671f-4dd7-96d3-019d59bd14c6-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "a459169f-671f-4dd7-96d3-019d59bd14c6" (UID: "a459169f-671f-4dd7-96d3-019d59bd14c6"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.348130 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a459169f-671f-4dd7-96d3-019d59bd14c6-kube-api-access-rlzzc" (OuterVolumeSpecName: "kube-api-access-rlzzc") pod "a459169f-671f-4dd7-96d3-019d59bd14c6" (UID: "a459169f-671f-4dd7-96d3-019d59bd14c6"). InnerVolumeSpecName "kube-api-access-rlzzc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.348426 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a459169f-671f-4dd7-96d3-019d59bd14c6-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "a459169f-671f-4dd7-96d3-019d59bd14c6" (UID: "a459169f-671f-4dd7-96d3-019d59bd14c6"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.350582 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a459169f-671f-4dd7-96d3-019d59bd14c6-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "a459169f-671f-4dd7-96d3-019d59bd14c6" (UID: "a459169f-671f-4dd7-96d3-019d59bd14c6"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.350740 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a459169f-671f-4dd7-96d3-019d59bd14c6-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "a459169f-671f-4dd7-96d3-019d59bd14c6" (UID: "a459169f-671f-4dd7-96d3-019d59bd14c6"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.351168 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a459169f-671f-4dd7-96d3-019d59bd14c6-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "a459169f-671f-4dd7-96d3-019d59bd14c6" (UID: "a459169f-671f-4dd7-96d3-019d59bd14c6"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.352513 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a459169f-671f-4dd7-96d3-019d59bd14c6-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "a459169f-671f-4dd7-96d3-019d59bd14c6" (UID: "a459169f-671f-4dd7-96d3-019d59bd14c6"). 
InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.359224 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a459169f-671f-4dd7-96d3-019d59bd14c6-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "a459169f-671f-4dd7-96d3-019d59bd14c6" (UID: "a459169f-671f-4dd7-96d3-019d59bd14c6"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.381711 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a459169f-671f-4dd7-96d3-019d59bd14c6-ceph" (OuterVolumeSpecName: "ceph") pod "a459169f-671f-4dd7-96d3-019d59bd14c6" (UID: "a459169f-671f-4dd7-96d3-019d59bd14c6"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.387817 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a459169f-671f-4dd7-96d3-019d59bd14c6-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "a459169f-671f-4dd7-96d3-019d59bd14c6" (UID: "a459169f-671f-4dd7-96d3-019d59bd14c6"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.389819 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a459169f-671f-4dd7-96d3-019d59bd14c6-inventory" (OuterVolumeSpecName: "inventory") pod "a459169f-671f-4dd7-96d3-019d59bd14c6" (UID: "a459169f-671f-4dd7-96d3-019d59bd14c6"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.395757 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a459169f-671f-4dd7-96d3-019d59bd14c6-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "a459169f-671f-4dd7-96d3-019d59bd14c6" (UID: "a459169f-671f-4dd7-96d3-019d59bd14c6"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.444036 4773 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/a459169f-671f-4dd7-96d3-019d59bd14c6-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.444072 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rlzzc\" (UniqueName: \"kubernetes.io/projected/a459169f-671f-4dd7-96d3-019d59bd14c6-kube-api-access-rlzzc\") on node \"crc\" DevicePath \"\"" Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.444086 4773 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/a459169f-671f-4dd7-96d3-019d59bd14c6-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.444122 4773 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/a459169f-671f-4dd7-96d3-019d59bd14c6-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.444137 4773 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/a459169f-671f-4dd7-96d3-019d59bd14c6-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.444151 4773 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a459169f-671f-4dd7-96d3-019d59bd14c6-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.444162 4773 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/a459169f-671f-4dd7-96d3-019d59bd14c6-inventory\") on node \"crc\" DevicePath \"\"" Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.444173 4773 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a459169f-671f-4dd7-96d3-019d59bd14c6-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.444184 4773 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/a459169f-671f-4dd7-96d3-019d59bd14c6-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.444195 4773 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a459169f-671f-4dd7-96d3-019d59bd14c6-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.444206 4773 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a459169f-671f-4dd7-96d3-019d59bd14c6-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.444218 4773 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/a459169f-671f-4dd7-96d3-019d59bd14c6-ceph\") on node \"crc\" DevicePath \"\"" Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.444229 4773 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a459169f-671f-4dd7-96d3-019d59bd14c6-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.727148 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lshfv" event={"ID":"a459169f-671f-4dd7-96d3-019d59bd14c6","Type":"ContainerDied","Data":"952d6a53b5864fcd8e00aada5b3728e885a330cc4445aec7a3d1e6170b3aaa83"} Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.727191 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="952d6a53b5864fcd8e00aada5b3728e885a330cc4445aec7a3d1e6170b3aaa83" Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.727205 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-lshfv" Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.816763 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-v2ffx"] Jan 20 19:10:47 crc kubenswrapper[4773]: E0120 19:10:47.817147 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a459169f-671f-4dd7-96d3-019d59bd14c6" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.817166 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="a459169f-671f-4dd7-96d3-019d59bd14c6" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.817349 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="a459169f-671f-4dd7-96d3-019d59bd14c6" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.818117 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-v2ffx" Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.820291 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.820643 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.821099 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.821174 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.822465 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rzxsv" Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.835924 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-v2ffx"] Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.850403 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j22gc\" (UniqueName: \"kubernetes.io/projected/2cefaa80-8ba4-4e73-81e3-927c47cc2a5d-kube-api-access-j22gc\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-v2ffx\" (UID: \"2cefaa80-8ba4-4e73-81e3-927c47cc2a5d\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-v2ffx" Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.850465 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2cefaa80-8ba4-4e73-81e3-927c47cc2a5d-inventory\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-v2ffx\" (UID: 
\"2cefaa80-8ba4-4e73-81e3-927c47cc2a5d\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-v2ffx" Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.850530 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2cefaa80-8ba4-4e73-81e3-927c47cc2a5d-ssh-key-openstack-edpm-ipam\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-v2ffx\" (UID: \"2cefaa80-8ba4-4e73-81e3-927c47cc2a5d\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-v2ffx" Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.850578 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2cefaa80-8ba4-4e73-81e3-927c47cc2a5d-ceph\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-v2ffx\" (UID: \"2cefaa80-8ba4-4e73-81e3-927c47cc2a5d\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-v2ffx" Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.951873 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2cefaa80-8ba4-4e73-81e3-927c47cc2a5d-ceph\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-v2ffx\" (UID: \"2cefaa80-8ba4-4e73-81e3-927c47cc2a5d\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-v2ffx" Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.952007 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j22gc\" (UniqueName: \"kubernetes.io/projected/2cefaa80-8ba4-4e73-81e3-927c47cc2a5d-kube-api-access-j22gc\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-v2ffx\" (UID: \"2cefaa80-8ba4-4e73-81e3-927c47cc2a5d\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-v2ffx" Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.952037 4773 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2cefaa80-8ba4-4e73-81e3-927c47cc2a5d-inventory\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-v2ffx\" (UID: \"2cefaa80-8ba4-4e73-81e3-927c47cc2a5d\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-v2ffx" Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.952073 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2cefaa80-8ba4-4e73-81e3-927c47cc2a5d-ssh-key-openstack-edpm-ipam\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-v2ffx\" (UID: \"2cefaa80-8ba4-4e73-81e3-927c47cc2a5d\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-v2ffx" Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.957768 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2cefaa80-8ba4-4e73-81e3-927c47cc2a5d-ssh-key-openstack-edpm-ipam\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-v2ffx\" (UID: \"2cefaa80-8ba4-4e73-81e3-927c47cc2a5d\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-v2ffx" Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.962001 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2cefaa80-8ba4-4e73-81e3-927c47cc2a5d-ceph\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-v2ffx\" (UID: \"2cefaa80-8ba4-4e73-81e3-927c47cc2a5d\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-v2ffx" Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.962606 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2cefaa80-8ba4-4e73-81e3-927c47cc2a5d-inventory\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-v2ffx\" (UID: 
\"2cefaa80-8ba4-4e73-81e3-927c47cc2a5d\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-v2ffx" Jan 20 19:10:47 crc kubenswrapper[4773]: I0120 19:10:47.968361 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j22gc\" (UniqueName: \"kubernetes.io/projected/2cefaa80-8ba4-4e73-81e3-927c47cc2a5d-kube-api-access-j22gc\") pod \"ceph-client-edpm-deployment-openstack-edpm-ipam-v2ffx\" (UID: \"2cefaa80-8ba4-4e73-81e3-927c47cc2a5d\") " pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-v2ffx" Jan 20 19:10:48 crc kubenswrapper[4773]: I0120 19:10:48.150321 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-v2ffx" Jan 20 19:10:48 crc kubenswrapper[4773]: I0120 19:10:48.704997 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-v2ffx"] Jan 20 19:10:48 crc kubenswrapper[4773]: I0120 19:10:48.742069 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-v2ffx" event={"ID":"2cefaa80-8ba4-4e73-81e3-927c47cc2a5d","Type":"ContainerStarted","Data":"c814e9cc31c971fe868d9d1571c8e81a1d8cd2b4b2d3a60ea51aa2560b8b1cf5"} Jan 20 19:10:49 crc kubenswrapper[4773]: I0120 19:10:49.752623 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-v2ffx" event={"ID":"2cefaa80-8ba4-4e73-81e3-927c47cc2a5d","Type":"ContainerStarted","Data":"5cc56a4f991529db2099d60965ba72e6b1041ff2289f0d4acdfaf630f922601c"} Jan 20 19:10:54 crc kubenswrapper[4773]: I0120 19:10:54.447302 4773 scope.go:117] "RemoveContainer" containerID="a7dbd0a912d949d6c4eebc0124401dbd7713c051c7fc8f0eefcca46927202be1" Jan 20 19:10:54 crc kubenswrapper[4773]: E0120 19:10:54.448400 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:10:54 crc kubenswrapper[4773]: I0120 19:10:54.789295 4773 generic.go:334] "Generic (PLEG): container finished" podID="2cefaa80-8ba4-4e73-81e3-927c47cc2a5d" containerID="5cc56a4f991529db2099d60965ba72e6b1041ff2289f0d4acdfaf630f922601c" exitCode=0 Jan 20 19:10:54 crc kubenswrapper[4773]: I0120 19:10:54.789334 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-v2ffx" event={"ID":"2cefaa80-8ba4-4e73-81e3-927c47cc2a5d","Type":"ContainerDied","Data":"5cc56a4f991529db2099d60965ba72e6b1041ff2289f0d4acdfaf630f922601c"} Jan 20 19:10:56 crc kubenswrapper[4773]: I0120 19:10:56.204828 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-v2ffx" Jan 20 19:10:56 crc kubenswrapper[4773]: I0120 19:10:56.394610 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j22gc\" (UniqueName: \"kubernetes.io/projected/2cefaa80-8ba4-4e73-81e3-927c47cc2a5d-kube-api-access-j22gc\") pod \"2cefaa80-8ba4-4e73-81e3-927c47cc2a5d\" (UID: \"2cefaa80-8ba4-4e73-81e3-927c47cc2a5d\") " Jan 20 19:10:56 crc kubenswrapper[4773]: I0120 19:10:56.394657 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2cefaa80-8ba4-4e73-81e3-927c47cc2a5d-ssh-key-openstack-edpm-ipam\") pod \"2cefaa80-8ba4-4e73-81e3-927c47cc2a5d\" (UID: \"2cefaa80-8ba4-4e73-81e3-927c47cc2a5d\") " Jan 20 19:10:56 crc kubenswrapper[4773]: I0120 19:10:56.394731 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2cefaa80-8ba4-4e73-81e3-927c47cc2a5d-ceph\") pod \"2cefaa80-8ba4-4e73-81e3-927c47cc2a5d\" (UID: \"2cefaa80-8ba4-4e73-81e3-927c47cc2a5d\") " Jan 20 19:10:56 crc kubenswrapper[4773]: I0120 19:10:56.394808 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2cefaa80-8ba4-4e73-81e3-927c47cc2a5d-inventory\") pod \"2cefaa80-8ba4-4e73-81e3-927c47cc2a5d\" (UID: \"2cefaa80-8ba4-4e73-81e3-927c47cc2a5d\") " Jan 20 19:10:56 crc kubenswrapper[4773]: I0120 19:10:56.400293 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2cefaa80-8ba4-4e73-81e3-927c47cc2a5d-kube-api-access-j22gc" (OuterVolumeSpecName: "kube-api-access-j22gc") pod "2cefaa80-8ba4-4e73-81e3-927c47cc2a5d" (UID: "2cefaa80-8ba4-4e73-81e3-927c47cc2a5d"). InnerVolumeSpecName "kube-api-access-j22gc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 19:10:56 crc kubenswrapper[4773]: I0120 19:10:56.410467 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cefaa80-8ba4-4e73-81e3-927c47cc2a5d-ceph" (OuterVolumeSpecName: "ceph") pod "2cefaa80-8ba4-4e73-81e3-927c47cc2a5d" (UID: "2cefaa80-8ba4-4e73-81e3-927c47cc2a5d"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:10:56 crc kubenswrapper[4773]: I0120 19:10:56.420497 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cefaa80-8ba4-4e73-81e3-927c47cc2a5d-inventory" (OuterVolumeSpecName: "inventory") pod "2cefaa80-8ba4-4e73-81e3-927c47cc2a5d" (UID: "2cefaa80-8ba4-4e73-81e3-927c47cc2a5d"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:10:56 crc kubenswrapper[4773]: I0120 19:10:56.421067 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cefaa80-8ba4-4e73-81e3-927c47cc2a5d-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "2cefaa80-8ba4-4e73-81e3-927c47cc2a5d" (UID: "2cefaa80-8ba4-4e73-81e3-927c47cc2a5d"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:10:56 crc kubenswrapper[4773]: I0120 19:10:56.497193 4773 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/2cefaa80-8ba4-4e73-81e3-927c47cc2a5d-inventory\") on node \"crc\" DevicePath \"\"" Jan 20 19:10:56 crc kubenswrapper[4773]: I0120 19:10:56.497454 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j22gc\" (UniqueName: \"kubernetes.io/projected/2cefaa80-8ba4-4e73-81e3-927c47cc2a5d-kube-api-access-j22gc\") on node \"crc\" DevicePath \"\"" Jan 20 19:10:56 crc kubenswrapper[4773]: I0120 19:10:56.497477 4773 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/2cefaa80-8ba4-4e73-81e3-927c47cc2a5d-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 20 19:10:56 crc kubenswrapper[4773]: I0120 19:10:56.497489 4773 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/2cefaa80-8ba4-4e73-81e3-927c47cc2a5d-ceph\") on node \"crc\" DevicePath \"\"" Jan 20 19:10:56 crc kubenswrapper[4773]: I0120 19:10:56.811810 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-v2ffx" event={"ID":"2cefaa80-8ba4-4e73-81e3-927c47cc2a5d","Type":"ContainerDied","Data":"c814e9cc31c971fe868d9d1571c8e81a1d8cd2b4b2d3a60ea51aa2560b8b1cf5"} Jan 20 19:10:56 crc kubenswrapper[4773]: I0120 19:10:56.811850 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c814e9cc31c971fe868d9d1571c8e81a1d8cd2b4b2d3a60ea51aa2560b8b1cf5" Jan 20 19:10:56 crc kubenswrapper[4773]: I0120 19:10:56.811863 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceph-client-edpm-deployment-openstack-edpm-ipam-v2ffx" Jan 20 19:10:56 crc kubenswrapper[4773]: I0120 19:10:56.891093 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-cjrtn"] Jan 20 19:10:56 crc kubenswrapper[4773]: E0120 19:10:56.891487 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cefaa80-8ba4-4e73-81e3-927c47cc2a5d" containerName="ceph-client-edpm-deployment-openstack-edpm-ipam" Jan 20 19:10:56 crc kubenswrapper[4773]: I0120 19:10:56.891507 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cefaa80-8ba4-4e73-81e3-927c47cc2a5d" containerName="ceph-client-edpm-deployment-openstack-edpm-ipam" Jan 20 19:10:56 crc kubenswrapper[4773]: I0120 19:10:56.891691 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cefaa80-8ba4-4e73-81e3-927c47cc2a5d" containerName="ceph-client-edpm-deployment-openstack-edpm-ipam" Jan 20 19:10:56 crc kubenswrapper[4773]: I0120 19:10:56.892288 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-cjrtn" Jan 20 19:10:56 crc kubenswrapper[4773]: I0120 19:10:56.894362 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 20 19:10:56 crc kubenswrapper[4773]: I0120 19:10:56.894640 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Jan 20 19:10:56 crc kubenswrapper[4773]: I0120 19:10:56.899327 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rzxsv" Jan 20 19:10:56 crc kubenswrapper[4773]: I0120 19:10:56.899387 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 20 19:10:56 crc kubenswrapper[4773]: I0120 19:10:56.899444 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 20 19:10:56 crc kubenswrapper[4773]: I0120 19:10:56.899594 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Jan 20 19:10:56 crc kubenswrapper[4773]: I0120 19:10:56.902764 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-cjrtn"] Jan 20 19:10:57 crc kubenswrapper[4773]: I0120 19:10:57.006684 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7xgh\" (UniqueName: \"kubernetes.io/projected/de805082-3188-4adb-9607-4ec5535de661-kube-api-access-x7xgh\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-cjrtn\" (UID: \"de805082-3188-4adb-9607-4ec5535de661\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-cjrtn" Jan 20 19:10:57 crc kubenswrapper[4773]: I0120 19:10:57.006790 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/secret/de805082-3188-4adb-9607-4ec5535de661-ceph\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-cjrtn\" (UID: \"de805082-3188-4adb-9607-4ec5535de661\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-cjrtn" Jan 20 19:10:57 crc kubenswrapper[4773]: I0120 19:10:57.006850 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/de805082-3188-4adb-9607-4ec5535de661-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-cjrtn\" (UID: \"de805082-3188-4adb-9607-4ec5535de661\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-cjrtn" Jan 20 19:10:57 crc kubenswrapper[4773]: I0120 19:10:57.006887 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/de805082-3188-4adb-9607-4ec5535de661-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-cjrtn\" (UID: \"de805082-3188-4adb-9607-4ec5535de661\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-cjrtn" Jan 20 19:10:57 crc kubenswrapper[4773]: I0120 19:10:57.006945 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de805082-3188-4adb-9607-4ec5535de661-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-cjrtn\" (UID: \"de805082-3188-4adb-9607-4ec5535de661\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-cjrtn" Jan 20 19:10:57 crc kubenswrapper[4773]: I0120 19:10:57.006985 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/de805082-3188-4adb-9607-4ec5535de661-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-cjrtn\" (UID: \"de805082-3188-4adb-9607-4ec5535de661\") " 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-cjrtn" Jan 20 19:10:57 crc kubenswrapper[4773]: I0120 19:10:57.108894 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/de805082-3188-4adb-9607-4ec5535de661-ceph\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-cjrtn\" (UID: \"de805082-3188-4adb-9607-4ec5535de661\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-cjrtn" Jan 20 19:10:57 crc kubenswrapper[4773]: I0120 19:10:57.108983 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/de805082-3188-4adb-9607-4ec5535de661-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-cjrtn\" (UID: \"de805082-3188-4adb-9607-4ec5535de661\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-cjrtn" Jan 20 19:10:57 crc kubenswrapper[4773]: I0120 19:10:57.109011 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/de805082-3188-4adb-9607-4ec5535de661-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-cjrtn\" (UID: \"de805082-3188-4adb-9607-4ec5535de661\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-cjrtn" Jan 20 19:10:57 crc kubenswrapper[4773]: I0120 19:10:57.109056 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de805082-3188-4adb-9607-4ec5535de661-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-cjrtn\" (UID: \"de805082-3188-4adb-9607-4ec5535de661\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-cjrtn" Jan 20 19:10:57 crc kubenswrapper[4773]: I0120 19:10:57.109087 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/de805082-3188-4adb-9607-4ec5535de661-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-cjrtn\" (UID: \"de805082-3188-4adb-9607-4ec5535de661\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-cjrtn" Jan 20 19:10:57 crc kubenswrapper[4773]: I0120 19:10:57.109154 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7xgh\" (UniqueName: \"kubernetes.io/projected/de805082-3188-4adb-9607-4ec5535de661-kube-api-access-x7xgh\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-cjrtn\" (UID: \"de805082-3188-4adb-9607-4ec5535de661\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-cjrtn" Jan 20 19:10:57 crc kubenswrapper[4773]: I0120 19:10:57.150586 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/de805082-3188-4adb-9607-4ec5535de661-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-cjrtn\" (UID: \"de805082-3188-4adb-9607-4ec5535de661\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-cjrtn" Jan 20 19:10:57 crc kubenswrapper[4773]: I0120 19:10:57.151441 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/de805082-3188-4adb-9607-4ec5535de661-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-cjrtn\" (UID: \"de805082-3188-4adb-9607-4ec5535de661\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-cjrtn" Jan 20 19:10:57 crc kubenswrapper[4773]: I0120 19:10:57.151611 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de805082-3188-4adb-9607-4ec5535de661-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-cjrtn\" (UID: \"de805082-3188-4adb-9607-4ec5535de661\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-cjrtn" Jan 20 19:10:57 crc 
kubenswrapper[4773]: I0120 19:10:57.154343 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/de805082-3188-4adb-9607-4ec5535de661-ceph\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-cjrtn\" (UID: \"de805082-3188-4adb-9607-4ec5535de661\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-cjrtn" Jan 20 19:10:57 crc kubenswrapper[4773]: I0120 19:10:57.154664 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/de805082-3188-4adb-9607-4ec5535de661-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-cjrtn\" (UID: \"de805082-3188-4adb-9607-4ec5535de661\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-cjrtn" Jan 20 19:10:57 crc kubenswrapper[4773]: I0120 19:10:57.166714 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7xgh\" (UniqueName: \"kubernetes.io/projected/de805082-3188-4adb-9607-4ec5535de661-kube-api-access-x7xgh\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-cjrtn\" (UID: \"de805082-3188-4adb-9607-4ec5535de661\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-cjrtn" Jan 20 19:10:57 crc kubenswrapper[4773]: I0120 19:10:57.210166 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-cjrtn" Jan 20 19:10:57 crc kubenswrapper[4773]: I0120 19:10:57.728104 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-cjrtn"] Jan 20 19:10:57 crc kubenswrapper[4773]: I0120 19:10:57.819664 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-cjrtn" event={"ID":"de805082-3188-4adb-9607-4ec5535de661","Type":"ContainerStarted","Data":"551f1ac8c3b91aad90ff1c0057d02892ecb1206cb70a658c8ed2f174e0f3dab2"} Jan 20 19:10:58 crc kubenswrapper[4773]: I0120 19:10:58.828066 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-cjrtn" event={"ID":"de805082-3188-4adb-9607-4ec5535de661","Type":"ContainerStarted","Data":"804863e04e23be2fcc7a7aa70b2994554d77c7c80c76f0277374cec5743318f6"} Jan 20 19:10:58 crc kubenswrapper[4773]: I0120 19:10:58.855394 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-cjrtn" podStartSLOduration=2.30635961 podStartE2EDuration="2.855375581s" podCreationTimestamp="2026-01-20 19:10:56 +0000 UTC" firstStartedPulling="2026-01-20 19:10:57.733498309 +0000 UTC m=+2450.655311333" lastFinishedPulling="2026-01-20 19:10:58.28251428 +0000 UTC m=+2451.204327304" observedRunningTime="2026-01-20 19:10:58.84868897 +0000 UTC m=+2451.770501994" watchObservedRunningTime="2026-01-20 19:10:58.855375581 +0000 UTC m=+2451.777188615" Jan 20 19:11:05 crc kubenswrapper[4773]: I0120 19:11:05.448113 4773 scope.go:117] "RemoveContainer" containerID="a7dbd0a912d949d6c4eebc0124401dbd7713c051c7fc8f0eefcca46927202be1" Jan 20 19:11:05 crc kubenswrapper[4773]: E0120 19:11:05.448888 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:11:20 crc kubenswrapper[4773]: I0120 19:11:20.446752 4773 scope.go:117] "RemoveContainer" containerID="a7dbd0a912d949d6c4eebc0124401dbd7713c051c7fc8f0eefcca46927202be1" Jan 20 19:11:20 crc kubenswrapper[4773]: E0120 19:11:20.448594 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:11:34 crc kubenswrapper[4773]: I0120 19:11:34.447827 4773 scope.go:117] "RemoveContainer" containerID="a7dbd0a912d949d6c4eebc0124401dbd7713c051c7fc8f0eefcca46927202be1" Jan 20 19:11:34 crc kubenswrapper[4773]: E0120 19:11:34.448714 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:11:46 crc kubenswrapper[4773]: I0120 19:11:46.447634 4773 scope.go:117] "RemoveContainer" containerID="a7dbd0a912d949d6c4eebc0124401dbd7713c051c7fc8f0eefcca46927202be1" Jan 20 19:11:46 crc kubenswrapper[4773]: E0120 19:11:46.449068 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:11:57 crc kubenswrapper[4773]: I0120 19:11:57.465042 4773 scope.go:117] "RemoveContainer" containerID="a7dbd0a912d949d6c4eebc0124401dbd7713c051c7fc8f0eefcca46927202be1" Jan 20 19:11:57 crc kubenswrapper[4773]: E0120 19:11:57.465908 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:12:04 crc kubenswrapper[4773]: I0120 19:12:04.345231 4773 generic.go:334] "Generic (PLEG): container finished" podID="de805082-3188-4adb-9607-4ec5535de661" containerID="804863e04e23be2fcc7a7aa70b2994554d77c7c80c76f0277374cec5743318f6" exitCode=0 Jan 20 19:12:04 crc kubenswrapper[4773]: I0120 19:12:04.345288 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-cjrtn" event={"ID":"de805082-3188-4adb-9607-4ec5535de661","Type":"ContainerDied","Data":"804863e04e23be2fcc7a7aa70b2994554d77c7c80c76f0277374cec5743318f6"} Jan 20 19:12:05 crc kubenswrapper[4773]: I0120 19:12:05.781676 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-cjrtn" Jan 20 19:12:05 crc kubenswrapper[4773]: I0120 19:12:05.934031 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/de805082-3188-4adb-9607-4ec5535de661-ssh-key-openstack-edpm-ipam\") pod \"de805082-3188-4adb-9607-4ec5535de661\" (UID: \"de805082-3188-4adb-9607-4ec5535de661\") " Jan 20 19:12:05 crc kubenswrapper[4773]: I0120 19:12:05.934188 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7xgh\" (UniqueName: \"kubernetes.io/projected/de805082-3188-4adb-9607-4ec5535de661-kube-api-access-x7xgh\") pod \"de805082-3188-4adb-9607-4ec5535de661\" (UID: \"de805082-3188-4adb-9607-4ec5535de661\") " Jan 20 19:12:05 crc kubenswrapper[4773]: I0120 19:12:05.934323 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/de805082-3188-4adb-9607-4ec5535de661-ceph\") pod \"de805082-3188-4adb-9607-4ec5535de661\" (UID: \"de805082-3188-4adb-9607-4ec5535de661\") " Jan 20 19:12:05 crc kubenswrapper[4773]: I0120 19:12:05.934355 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/de805082-3188-4adb-9607-4ec5535de661-inventory\") pod \"de805082-3188-4adb-9607-4ec5535de661\" (UID: \"de805082-3188-4adb-9607-4ec5535de661\") " Jan 20 19:12:05 crc kubenswrapper[4773]: I0120 19:12:05.934431 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/de805082-3188-4adb-9607-4ec5535de661-ovncontroller-config-0\") pod \"de805082-3188-4adb-9607-4ec5535de661\" (UID: \"de805082-3188-4adb-9607-4ec5535de661\") " Jan 20 19:12:05 crc kubenswrapper[4773]: I0120 19:12:05.934469 4773 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de805082-3188-4adb-9607-4ec5535de661-ovn-combined-ca-bundle\") pod \"de805082-3188-4adb-9607-4ec5535de661\" (UID: \"de805082-3188-4adb-9607-4ec5535de661\") " Jan 20 19:12:05 crc kubenswrapper[4773]: I0120 19:12:05.940038 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de805082-3188-4adb-9607-4ec5535de661-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "de805082-3188-4adb-9607-4ec5535de661" (UID: "de805082-3188-4adb-9607-4ec5535de661"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:12:05 crc kubenswrapper[4773]: I0120 19:12:05.940715 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de805082-3188-4adb-9607-4ec5535de661-kube-api-access-x7xgh" (OuterVolumeSpecName: "kube-api-access-x7xgh") pod "de805082-3188-4adb-9607-4ec5535de661" (UID: "de805082-3188-4adb-9607-4ec5535de661"). InnerVolumeSpecName "kube-api-access-x7xgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 19:12:05 crc kubenswrapper[4773]: I0120 19:12:05.941057 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de805082-3188-4adb-9607-4ec5535de661-ceph" (OuterVolumeSpecName: "ceph") pod "de805082-3188-4adb-9607-4ec5535de661" (UID: "de805082-3188-4adb-9607-4ec5535de661"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:12:05 crc kubenswrapper[4773]: I0120 19:12:05.961879 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de805082-3188-4adb-9607-4ec5535de661-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "de805082-3188-4adb-9607-4ec5535de661" (UID: "de805082-3188-4adb-9607-4ec5535de661"). 
InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 19:12:05 crc kubenswrapper[4773]: I0120 19:12:05.964346 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de805082-3188-4adb-9607-4ec5535de661-inventory" (OuterVolumeSpecName: "inventory") pod "de805082-3188-4adb-9607-4ec5535de661" (UID: "de805082-3188-4adb-9607-4ec5535de661"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:12:05 crc kubenswrapper[4773]: I0120 19:12:05.964469 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de805082-3188-4adb-9607-4ec5535de661-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "de805082-3188-4adb-9607-4ec5535de661" (UID: "de805082-3188-4adb-9607-4ec5535de661"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:12:06 crc kubenswrapper[4773]: I0120 19:12:06.036038 4773 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/de805082-3188-4adb-9607-4ec5535de661-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Jan 20 19:12:06 crc kubenswrapper[4773]: I0120 19:12:06.036086 4773 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/de805082-3188-4adb-9607-4ec5535de661-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 19:12:06 crc kubenswrapper[4773]: I0120 19:12:06.036099 4773 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/de805082-3188-4adb-9607-4ec5535de661-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 20 19:12:06 crc kubenswrapper[4773]: I0120 19:12:06.036111 4773 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-x7xgh\" (UniqueName: \"kubernetes.io/projected/de805082-3188-4adb-9607-4ec5535de661-kube-api-access-x7xgh\") on node \"crc\" DevicePath \"\"" Jan 20 19:12:06 crc kubenswrapper[4773]: I0120 19:12:06.036123 4773 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/de805082-3188-4adb-9607-4ec5535de661-ceph\") on node \"crc\" DevicePath \"\"" Jan 20 19:12:06 crc kubenswrapper[4773]: I0120 19:12:06.036140 4773 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/de805082-3188-4adb-9607-4ec5535de661-inventory\") on node \"crc\" DevicePath \"\"" Jan 20 19:12:06 crc kubenswrapper[4773]: I0120 19:12:06.364959 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-cjrtn" event={"ID":"de805082-3188-4adb-9607-4ec5535de661","Type":"ContainerDied","Data":"551f1ac8c3b91aad90ff1c0057d02892ecb1206cb70a658c8ed2f174e0f3dab2"} Jan 20 19:12:06 crc kubenswrapper[4773]: I0120 19:12:06.365245 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="551f1ac8c3b91aad90ff1c0057d02892ecb1206cb70a658c8ed2f174e0f3dab2" Jan 20 19:12:06 crc kubenswrapper[4773]: I0120 19:12:06.365110 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-cjrtn" Jan 20 19:12:06 crc kubenswrapper[4773]: I0120 19:12:06.449470 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-llwxc"] Jan 20 19:12:06 crc kubenswrapper[4773]: E0120 19:12:06.449846 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de805082-3188-4adb-9607-4ec5535de661" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Jan 20 19:12:06 crc kubenswrapper[4773]: I0120 19:12:06.449865 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="de805082-3188-4adb-9607-4ec5535de661" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Jan 20 19:12:06 crc kubenswrapper[4773]: I0120 19:12:06.450063 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="de805082-3188-4adb-9607-4ec5535de661" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Jan 20 19:12:06 crc kubenswrapper[4773]: I0120 19:12:06.450787 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-llwxc" Jan 20 19:12:06 crc kubenswrapper[4773]: I0120 19:12:06.457514 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Jan 20 19:12:06 crc kubenswrapper[4773]: I0120 19:12:06.457757 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 20 19:12:06 crc kubenswrapper[4773]: I0120 19:12:06.457806 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 20 19:12:06 crc kubenswrapper[4773]: I0120 19:12:06.457948 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Jan 20 19:12:06 crc kubenswrapper[4773]: I0120 19:12:06.458187 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rzxsv" Jan 20 19:12:06 crc kubenswrapper[4773]: I0120 19:12:06.458221 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Jan 20 19:12:06 crc kubenswrapper[4773]: I0120 19:12:06.458488 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 20 19:12:06 crc kubenswrapper[4773]: I0120 19:12:06.461018 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-llwxc"] Jan 20 19:12:06 crc kubenswrapper[4773]: I0120 19:12:06.646437 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f735fea9-67a7-4dcc-96f9-8e852df016ce-ceph\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-llwxc\" (UID: \"f735fea9-67a7-4dcc-96f9-8e852df016ce\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-llwxc" Jan 20 19:12:06 crc 
kubenswrapper[4773]: I0120 19:12:06.646779 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f735fea9-67a7-4dcc-96f9-8e852df016ce-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-llwxc\" (UID: \"f735fea9-67a7-4dcc-96f9-8e852df016ce\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-llwxc" Jan 20 19:12:06 crc kubenswrapper[4773]: I0120 19:12:06.646860 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f735fea9-67a7-4dcc-96f9-8e852df016ce-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-llwxc\" (UID: \"f735fea9-67a7-4dcc-96f9-8e852df016ce\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-llwxc" Jan 20 19:12:06 crc kubenswrapper[4773]: I0120 19:12:06.646970 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f735fea9-67a7-4dcc-96f9-8e852df016ce-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-llwxc\" (UID: \"f735fea9-67a7-4dcc-96f9-8e852df016ce\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-llwxc" Jan 20 19:12:06 crc kubenswrapper[4773]: I0120 19:12:06.647290 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f735fea9-67a7-4dcc-96f9-8e852df016ce-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-llwxc\" (UID: \"f735fea9-67a7-4dcc-96f9-8e852df016ce\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-llwxc" Jan 20 19:12:06 crc 
kubenswrapper[4773]: I0120 19:12:06.647507 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgnbp\" (UniqueName: \"kubernetes.io/projected/f735fea9-67a7-4dcc-96f9-8e852df016ce-kube-api-access-sgnbp\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-llwxc\" (UID: \"f735fea9-67a7-4dcc-96f9-8e852df016ce\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-llwxc" Jan 20 19:12:06 crc kubenswrapper[4773]: I0120 19:12:06.647548 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f735fea9-67a7-4dcc-96f9-8e852df016ce-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-llwxc\" (UID: \"f735fea9-67a7-4dcc-96f9-8e852df016ce\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-llwxc" Jan 20 19:12:06 crc kubenswrapper[4773]: I0120 19:12:06.749093 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgnbp\" (UniqueName: \"kubernetes.io/projected/f735fea9-67a7-4dcc-96f9-8e852df016ce-kube-api-access-sgnbp\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-llwxc\" (UID: \"f735fea9-67a7-4dcc-96f9-8e852df016ce\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-llwxc" Jan 20 19:12:06 crc kubenswrapper[4773]: I0120 19:12:06.749131 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f735fea9-67a7-4dcc-96f9-8e852df016ce-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-llwxc\" (UID: \"f735fea9-67a7-4dcc-96f9-8e852df016ce\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-llwxc" Jan 20 19:12:06 crc kubenswrapper[4773]: I0120 19:12:06.749193 4773 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f735fea9-67a7-4dcc-96f9-8e852df016ce-ceph\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-llwxc\" (UID: \"f735fea9-67a7-4dcc-96f9-8e852df016ce\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-llwxc" Jan 20 19:12:06 crc kubenswrapper[4773]: I0120 19:12:06.749230 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f735fea9-67a7-4dcc-96f9-8e852df016ce-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-llwxc\" (UID: \"f735fea9-67a7-4dcc-96f9-8e852df016ce\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-llwxc" Jan 20 19:12:06 crc kubenswrapper[4773]: I0120 19:12:06.749251 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f735fea9-67a7-4dcc-96f9-8e852df016ce-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-llwxc\" (UID: \"f735fea9-67a7-4dcc-96f9-8e852df016ce\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-llwxc" Jan 20 19:12:06 crc kubenswrapper[4773]: I0120 19:12:06.749271 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f735fea9-67a7-4dcc-96f9-8e852df016ce-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-llwxc\" (UID: \"f735fea9-67a7-4dcc-96f9-8e852df016ce\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-llwxc" Jan 20 19:12:06 crc kubenswrapper[4773]: I0120 19:12:06.749334 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f735fea9-67a7-4dcc-96f9-8e852df016ce-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-llwxc\" (UID: \"f735fea9-67a7-4dcc-96f9-8e852df016ce\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-llwxc" Jan 20 19:12:06 crc kubenswrapper[4773]: I0120 19:12:06.755058 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f735fea9-67a7-4dcc-96f9-8e852df016ce-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-llwxc\" (UID: \"f735fea9-67a7-4dcc-96f9-8e852df016ce\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-llwxc" Jan 20 19:12:06 crc kubenswrapper[4773]: I0120 19:12:06.755258 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f735fea9-67a7-4dcc-96f9-8e852df016ce-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-llwxc\" (UID: \"f735fea9-67a7-4dcc-96f9-8e852df016ce\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-llwxc" Jan 20 19:12:06 crc kubenswrapper[4773]: I0120 19:12:06.755590 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f735fea9-67a7-4dcc-96f9-8e852df016ce-ceph\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-llwxc\" (UID: \"f735fea9-67a7-4dcc-96f9-8e852df016ce\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-llwxc" Jan 20 19:12:06 crc kubenswrapper[4773]: I0120 19:12:06.756309 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f735fea9-67a7-4dcc-96f9-8e852df016ce-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-llwxc\" (UID: 
\"f735fea9-67a7-4dcc-96f9-8e852df016ce\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-llwxc" Jan 20 19:12:06 crc kubenswrapper[4773]: I0120 19:12:06.761464 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f735fea9-67a7-4dcc-96f9-8e852df016ce-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-llwxc\" (UID: \"f735fea9-67a7-4dcc-96f9-8e852df016ce\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-llwxc" Jan 20 19:12:06 crc kubenswrapper[4773]: I0120 19:12:06.761615 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f735fea9-67a7-4dcc-96f9-8e852df016ce-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-llwxc\" (UID: \"f735fea9-67a7-4dcc-96f9-8e852df016ce\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-llwxc" Jan 20 19:12:06 crc kubenswrapper[4773]: I0120 19:12:06.767600 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgnbp\" (UniqueName: \"kubernetes.io/projected/f735fea9-67a7-4dcc-96f9-8e852df016ce-kube-api-access-sgnbp\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-llwxc\" (UID: \"f735fea9-67a7-4dcc-96f9-8e852df016ce\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-llwxc" Jan 20 19:12:07 crc kubenswrapper[4773]: I0120 19:12:07.066625 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-llwxc" Jan 20 19:12:07 crc kubenswrapper[4773]: I0120 19:12:07.586274 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-llwxc"] Jan 20 19:12:08 crc kubenswrapper[4773]: I0120 19:12:08.047089 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 20 19:12:08 crc kubenswrapper[4773]: I0120 19:12:08.380753 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-llwxc" event={"ID":"f735fea9-67a7-4dcc-96f9-8e852df016ce","Type":"ContainerStarted","Data":"66bea5dbfc8a5f5c64c95a4c86a1563b45902ebd652571952d7fae30fffabeb5"} Jan 20 19:12:08 crc kubenswrapper[4773]: I0120 19:12:08.380794 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-llwxc" event={"ID":"f735fea9-67a7-4dcc-96f9-8e852df016ce","Type":"ContainerStarted","Data":"a702eed649f35aa329a0e27c986e2002cd7306b161fa775a665fe269809ca0ac"} Jan 20 19:12:12 crc kubenswrapper[4773]: I0120 19:12:12.447202 4773 scope.go:117] "RemoveContainer" containerID="a7dbd0a912d949d6c4eebc0124401dbd7713c051c7fc8f0eefcca46927202be1" Jan 20 19:12:12 crc kubenswrapper[4773]: E0120 19:12:12.448067 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:12:24 crc kubenswrapper[4773]: I0120 19:12:24.447141 4773 scope.go:117] "RemoveContainer" 
containerID="a7dbd0a912d949d6c4eebc0124401dbd7713c051c7fc8f0eefcca46927202be1" Jan 20 19:12:24 crc kubenswrapper[4773]: E0120 19:12:24.448111 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:12:35 crc kubenswrapper[4773]: I0120 19:12:35.447524 4773 scope.go:117] "RemoveContainer" containerID="a7dbd0a912d949d6c4eebc0124401dbd7713c051c7fc8f0eefcca46927202be1" Jan 20 19:12:36 crc kubenswrapper[4773]: I0120 19:12:36.591882 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" event={"ID":"1ddd934f-f012-4083-b5e6-b99711071621","Type":"ContainerStarted","Data":"4c0248cb4203d1520e2769ae5f2568a8f29dce42822785658c56fdd8d0c00e9b"} Jan 20 19:12:36 crc kubenswrapper[4773]: I0120 19:12:36.613791 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-llwxc" podStartSLOduration=30.164066101 podStartE2EDuration="30.613765852s" podCreationTimestamp="2026-01-20 19:12:06 +0000 UTC" firstStartedPulling="2026-01-20 19:12:07.59457243 +0000 UTC m=+2520.516385454" lastFinishedPulling="2026-01-20 19:12:08.044272181 +0000 UTC m=+2520.966085205" observedRunningTime="2026-01-20 19:12:08.403094839 +0000 UTC m=+2521.324907883" watchObservedRunningTime="2026-01-20 19:12:36.613765852 +0000 UTC m=+2549.535578876" Jan 20 19:13:06 crc kubenswrapper[4773]: I0120 19:13:06.836600 4773 generic.go:334] "Generic (PLEG): container finished" podID="f735fea9-67a7-4dcc-96f9-8e852df016ce" containerID="66bea5dbfc8a5f5c64c95a4c86a1563b45902ebd652571952d7fae30fffabeb5" 
exitCode=0 Jan 20 19:13:06 crc kubenswrapper[4773]: I0120 19:13:06.836671 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-llwxc" event={"ID":"f735fea9-67a7-4dcc-96f9-8e852df016ce","Type":"ContainerDied","Data":"66bea5dbfc8a5f5c64c95a4c86a1563b45902ebd652571952d7fae30fffabeb5"} Jan 20 19:13:08 crc kubenswrapper[4773]: I0120 19:13:08.246087 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-llwxc" Jan 20 19:13:08 crc kubenswrapper[4773]: I0120 19:13:08.269768 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f735fea9-67a7-4dcc-96f9-8e852df016ce-neutron-metadata-combined-ca-bundle\") pod \"f735fea9-67a7-4dcc-96f9-8e852df016ce\" (UID: \"f735fea9-67a7-4dcc-96f9-8e852df016ce\") " Jan 20 19:13:08 crc kubenswrapper[4773]: I0120 19:13:08.270155 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sgnbp\" (UniqueName: \"kubernetes.io/projected/f735fea9-67a7-4dcc-96f9-8e852df016ce-kube-api-access-sgnbp\") pod \"f735fea9-67a7-4dcc-96f9-8e852df016ce\" (UID: \"f735fea9-67a7-4dcc-96f9-8e852df016ce\") " Jan 20 19:13:08 crc kubenswrapper[4773]: I0120 19:13:08.270206 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f735fea9-67a7-4dcc-96f9-8e852df016ce-inventory\") pod \"f735fea9-67a7-4dcc-96f9-8e852df016ce\" (UID: \"f735fea9-67a7-4dcc-96f9-8e852df016ce\") " Jan 20 19:13:08 crc kubenswrapper[4773]: I0120 19:13:08.270220 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f735fea9-67a7-4dcc-96f9-8e852df016ce-ssh-key-openstack-edpm-ipam\") pod 
\"f735fea9-67a7-4dcc-96f9-8e852df016ce\" (UID: \"f735fea9-67a7-4dcc-96f9-8e852df016ce\") " Jan 20 19:13:08 crc kubenswrapper[4773]: I0120 19:13:08.270265 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f735fea9-67a7-4dcc-96f9-8e852df016ce-ceph\") pod \"f735fea9-67a7-4dcc-96f9-8e852df016ce\" (UID: \"f735fea9-67a7-4dcc-96f9-8e852df016ce\") " Jan 20 19:13:08 crc kubenswrapper[4773]: I0120 19:13:08.270289 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f735fea9-67a7-4dcc-96f9-8e852df016ce-nova-metadata-neutron-config-0\") pod \"f735fea9-67a7-4dcc-96f9-8e852df016ce\" (UID: \"f735fea9-67a7-4dcc-96f9-8e852df016ce\") " Jan 20 19:13:08 crc kubenswrapper[4773]: I0120 19:13:08.270325 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f735fea9-67a7-4dcc-96f9-8e852df016ce-neutron-ovn-metadata-agent-neutron-config-0\") pod \"f735fea9-67a7-4dcc-96f9-8e852df016ce\" (UID: \"f735fea9-67a7-4dcc-96f9-8e852df016ce\") " Jan 20 19:13:08 crc kubenswrapper[4773]: I0120 19:13:08.303690 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f735fea9-67a7-4dcc-96f9-8e852df016ce-kube-api-access-sgnbp" (OuterVolumeSpecName: "kube-api-access-sgnbp") pod "f735fea9-67a7-4dcc-96f9-8e852df016ce" (UID: "f735fea9-67a7-4dcc-96f9-8e852df016ce"). InnerVolumeSpecName "kube-api-access-sgnbp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 19:13:08 crc kubenswrapper[4773]: I0120 19:13:08.304308 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f735fea9-67a7-4dcc-96f9-8e852df016ce-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "f735fea9-67a7-4dcc-96f9-8e852df016ce" (UID: "f735fea9-67a7-4dcc-96f9-8e852df016ce"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:13:08 crc kubenswrapper[4773]: I0120 19:13:08.305109 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f735fea9-67a7-4dcc-96f9-8e852df016ce-ceph" (OuterVolumeSpecName: "ceph") pod "f735fea9-67a7-4dcc-96f9-8e852df016ce" (UID: "f735fea9-67a7-4dcc-96f9-8e852df016ce"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:13:08 crc kubenswrapper[4773]: I0120 19:13:08.307431 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f735fea9-67a7-4dcc-96f9-8e852df016ce-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "f735fea9-67a7-4dcc-96f9-8e852df016ce" (UID: "f735fea9-67a7-4dcc-96f9-8e852df016ce"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:13:08 crc kubenswrapper[4773]: I0120 19:13:08.318446 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f735fea9-67a7-4dcc-96f9-8e852df016ce-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "f735fea9-67a7-4dcc-96f9-8e852df016ce" (UID: "f735fea9-67a7-4dcc-96f9-8e852df016ce"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:13:08 crc kubenswrapper[4773]: I0120 19:13:08.320591 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f735fea9-67a7-4dcc-96f9-8e852df016ce-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "f735fea9-67a7-4dcc-96f9-8e852df016ce" (UID: "f735fea9-67a7-4dcc-96f9-8e852df016ce"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:13:08 crc kubenswrapper[4773]: I0120 19:13:08.351598 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f735fea9-67a7-4dcc-96f9-8e852df016ce-inventory" (OuterVolumeSpecName: "inventory") pod "f735fea9-67a7-4dcc-96f9-8e852df016ce" (UID: "f735fea9-67a7-4dcc-96f9-8e852df016ce"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:13:08 crc kubenswrapper[4773]: I0120 19:13:08.371459 4773 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f735fea9-67a7-4dcc-96f9-8e852df016ce-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\"" Jan 20 19:13:08 crc kubenswrapper[4773]: I0120 19:13:08.371495 4773 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/f735fea9-67a7-4dcc-96f9-8e852df016ce-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\"" Jan 20 19:13:08 crc kubenswrapper[4773]: I0120 19:13:08.371507 4773 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f735fea9-67a7-4dcc-96f9-8e852df016ce-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 19:13:08 crc kubenswrapper[4773]: I0120 19:13:08.371516 4773 reconciler_common.go:293] "Volume detached for 
volume \"kube-api-access-sgnbp\" (UniqueName: \"kubernetes.io/projected/f735fea9-67a7-4dcc-96f9-8e852df016ce-kube-api-access-sgnbp\") on node \"crc\" DevicePath \"\"" Jan 20 19:13:08 crc kubenswrapper[4773]: I0120 19:13:08.371526 4773 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f735fea9-67a7-4dcc-96f9-8e852df016ce-inventory\") on node \"crc\" DevicePath \"\"" Jan 20 19:13:08 crc kubenswrapper[4773]: I0120 19:13:08.371534 4773 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f735fea9-67a7-4dcc-96f9-8e852df016ce-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 20 19:13:08 crc kubenswrapper[4773]: I0120 19:13:08.371544 4773 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/f735fea9-67a7-4dcc-96f9-8e852df016ce-ceph\") on node \"crc\" DevicePath \"\"" Jan 20 19:13:08 crc kubenswrapper[4773]: I0120 19:13:08.853323 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-llwxc" event={"ID":"f735fea9-67a7-4dcc-96f9-8e852df016ce","Type":"ContainerDied","Data":"a702eed649f35aa329a0e27c986e2002cd7306b161fa775a665fe269809ca0ac"} Jan 20 19:13:08 crc kubenswrapper[4773]: I0120 19:13:08.853670 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a702eed649f35aa329a0e27c986e2002cd7306b161fa775a665fe269809ca0ac" Jan 20 19:13:08 crc kubenswrapper[4773]: I0120 19:13:08.853394 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-llwxc" Jan 20 19:13:09 crc kubenswrapper[4773]: I0120 19:13:09.025917 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2vbh9"] Jan 20 19:13:09 crc kubenswrapper[4773]: E0120 19:13:09.026307 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f735fea9-67a7-4dcc-96f9-8e852df016ce" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Jan 20 19:13:09 crc kubenswrapper[4773]: I0120 19:13:09.026332 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="f735fea9-67a7-4dcc-96f9-8e852df016ce" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Jan 20 19:13:09 crc kubenswrapper[4773]: I0120 19:13:09.026541 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="f735fea9-67a7-4dcc-96f9-8e852df016ce" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam" Jan 20 19:13:09 crc kubenswrapper[4773]: I0120 19:13:09.027087 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2vbh9" Jan 20 19:13:09 crc kubenswrapper[4773]: I0120 19:13:09.030523 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 20 19:13:09 crc kubenswrapper[4773]: I0120 19:13:09.030847 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret" Jan 20 19:13:09 crc kubenswrapper[4773]: I0120 19:13:09.031116 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 20 19:13:09 crc kubenswrapper[4773]: I0120 19:13:09.031215 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Jan 20 19:13:09 crc kubenswrapper[4773]: I0120 19:13:09.031217 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rzxsv" Jan 20 19:13:09 crc kubenswrapper[4773]: I0120 19:13:09.031598 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 20 19:13:09 crc kubenswrapper[4773]: I0120 19:13:09.050524 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2vbh9"] Jan 20 19:13:09 crc kubenswrapper[4773]: I0120 19:13:09.081706 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e1b8272-3f37-405c-9f7c-acc1dd855d60-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-2vbh9\" (UID: \"5e1b8272-3f37-405c-9f7c-acc1dd855d60\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2vbh9" Jan 20 19:13:09 crc kubenswrapper[4773]: I0120 19:13:09.081769 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5e1b8272-3f37-405c-9f7c-acc1dd855d60-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-2vbh9\" (UID: \"5e1b8272-3f37-405c-9f7c-acc1dd855d60\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2vbh9" Jan 20 19:13:09 crc kubenswrapper[4773]: I0120 19:13:09.081905 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5e1b8272-3f37-405c-9f7c-acc1dd855d60-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-2vbh9\" (UID: \"5e1b8272-3f37-405c-9f7c-acc1dd855d60\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2vbh9" Jan 20 19:13:09 crc kubenswrapper[4773]: I0120 19:13:09.081985 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mr5v\" (UniqueName: \"kubernetes.io/projected/5e1b8272-3f37-405c-9f7c-acc1dd855d60-kube-api-access-9mr5v\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-2vbh9\" (UID: \"5e1b8272-3f37-405c-9f7c-acc1dd855d60\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2vbh9" Jan 20 19:13:09 crc kubenswrapper[4773]: I0120 19:13:09.082556 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5e1b8272-3f37-405c-9f7c-acc1dd855d60-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-2vbh9\" (UID: \"5e1b8272-3f37-405c-9f7c-acc1dd855d60\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2vbh9" Jan 20 19:13:09 crc kubenswrapper[4773]: I0120 19:13:09.082637 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/5e1b8272-3f37-405c-9f7c-acc1dd855d60-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-2vbh9\" (UID: 
\"5e1b8272-3f37-405c-9f7c-acc1dd855d60\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2vbh9" Jan 20 19:13:09 crc kubenswrapper[4773]: I0120 19:13:09.183616 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5e1b8272-3f37-405c-9f7c-acc1dd855d60-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-2vbh9\" (UID: \"5e1b8272-3f37-405c-9f7c-acc1dd855d60\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2vbh9" Jan 20 19:13:09 crc kubenswrapper[4773]: I0120 19:13:09.184203 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mr5v\" (UniqueName: \"kubernetes.io/projected/5e1b8272-3f37-405c-9f7c-acc1dd855d60-kube-api-access-9mr5v\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-2vbh9\" (UID: \"5e1b8272-3f37-405c-9f7c-acc1dd855d60\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2vbh9" Jan 20 19:13:09 crc kubenswrapper[4773]: I0120 19:13:09.184313 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5e1b8272-3f37-405c-9f7c-acc1dd855d60-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-2vbh9\" (UID: \"5e1b8272-3f37-405c-9f7c-acc1dd855d60\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2vbh9" Jan 20 19:13:09 crc kubenswrapper[4773]: I0120 19:13:09.184334 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/5e1b8272-3f37-405c-9f7c-acc1dd855d60-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-2vbh9\" (UID: \"5e1b8272-3f37-405c-9f7c-acc1dd855d60\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2vbh9" Jan 20 19:13:09 crc kubenswrapper[4773]: I0120 19:13:09.184472 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"inventory\" (UniqueName: \"kubernetes.io/secret/5e1b8272-3f37-405c-9f7c-acc1dd855d60-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-2vbh9\" (UID: \"5e1b8272-3f37-405c-9f7c-acc1dd855d60\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2vbh9" Jan 20 19:13:09 crc kubenswrapper[4773]: I0120 19:13:09.184510 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e1b8272-3f37-405c-9f7c-acc1dd855d60-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-2vbh9\" (UID: \"5e1b8272-3f37-405c-9f7c-acc1dd855d60\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2vbh9" Jan 20 19:13:09 crc kubenswrapper[4773]: I0120 19:13:09.187276 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e1b8272-3f37-405c-9f7c-acc1dd855d60-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-2vbh9\" (UID: \"5e1b8272-3f37-405c-9f7c-acc1dd855d60\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2vbh9" Jan 20 19:13:09 crc kubenswrapper[4773]: I0120 19:13:09.187305 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5e1b8272-3f37-405c-9f7c-acc1dd855d60-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-2vbh9\" (UID: \"5e1b8272-3f37-405c-9f7c-acc1dd855d60\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2vbh9" Jan 20 19:13:09 crc kubenswrapper[4773]: I0120 19:13:09.187541 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/5e1b8272-3f37-405c-9f7c-acc1dd855d60-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-2vbh9\" (UID: \"5e1b8272-3f37-405c-9f7c-acc1dd855d60\") " 
pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2vbh9" Jan 20 19:13:09 crc kubenswrapper[4773]: I0120 19:13:09.188089 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e1b8272-3f37-405c-9f7c-acc1dd855d60-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-2vbh9\" (UID: \"5e1b8272-3f37-405c-9f7c-acc1dd855d60\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2vbh9" Jan 20 19:13:09 crc kubenswrapper[4773]: I0120 19:13:09.194508 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5e1b8272-3f37-405c-9f7c-acc1dd855d60-ceph\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-2vbh9\" (UID: \"5e1b8272-3f37-405c-9f7c-acc1dd855d60\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2vbh9" Jan 20 19:13:09 crc kubenswrapper[4773]: I0120 19:13:09.201601 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mr5v\" (UniqueName: \"kubernetes.io/projected/5e1b8272-3f37-405c-9f7c-acc1dd855d60-kube-api-access-9mr5v\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-2vbh9\" (UID: \"5e1b8272-3f37-405c-9f7c-acc1dd855d60\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2vbh9" Jan 20 19:13:09 crc kubenswrapper[4773]: I0120 19:13:09.348108 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2vbh9" Jan 20 19:13:09 crc kubenswrapper[4773]: I0120 19:13:09.909219 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2vbh9"] Jan 20 19:13:10 crc kubenswrapper[4773]: I0120 19:13:10.868687 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2vbh9" event={"ID":"5e1b8272-3f37-405c-9f7c-acc1dd855d60","Type":"ContainerStarted","Data":"e53a39dda91cd4424e1be373936eb9332c7149e3a73c3d2efb4770fc0d55b04a"} Jan 20 19:13:10 crc kubenswrapper[4773]: I0120 19:13:10.868723 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2vbh9" event={"ID":"5e1b8272-3f37-405c-9f7c-acc1dd855d60","Type":"ContainerStarted","Data":"4fb455accb7c642045b9830c42a1eef01ec071caf8bd86c446dbea2234b516db"} Jan 20 19:13:10 crc kubenswrapper[4773]: I0120 19:13:10.888902 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2vbh9" podStartSLOduration=1.218880168 podStartE2EDuration="1.888880877s" podCreationTimestamp="2026-01-20 19:13:09 +0000 UTC" firstStartedPulling="2026-01-20 19:13:09.917340629 +0000 UTC m=+2582.839153653" lastFinishedPulling="2026-01-20 19:13:10.587341338 +0000 UTC m=+2583.509154362" observedRunningTime="2026-01-20 19:13:10.885488006 +0000 UTC m=+2583.807301030" watchObservedRunningTime="2026-01-20 19:13:10.888880877 +0000 UTC m=+2583.810693901" Jan 20 19:14:21 crc kubenswrapper[4773]: I0120 19:14:21.417807 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-2n25g"] Jan 20 19:14:21 crc kubenswrapper[4773]: I0120 19:14:21.420371 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2n25g" Jan 20 19:14:21 crc kubenswrapper[4773]: I0120 19:14:21.428835 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2n25g"] Jan 20 19:14:21 crc kubenswrapper[4773]: I0120 19:14:21.609497 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9dc5be9-7ed4-4735-8fa8-42cdd10b8a9c-catalog-content\") pod \"redhat-operators-2n25g\" (UID: \"c9dc5be9-7ed4-4735-8fa8-42cdd10b8a9c\") " pod="openshift-marketplace/redhat-operators-2n25g" Jan 20 19:14:21 crc kubenswrapper[4773]: I0120 19:14:21.609575 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfst6\" (UniqueName: \"kubernetes.io/projected/c9dc5be9-7ed4-4735-8fa8-42cdd10b8a9c-kube-api-access-qfst6\") pod \"redhat-operators-2n25g\" (UID: \"c9dc5be9-7ed4-4735-8fa8-42cdd10b8a9c\") " pod="openshift-marketplace/redhat-operators-2n25g" Jan 20 19:14:21 crc kubenswrapper[4773]: I0120 19:14:21.609602 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9dc5be9-7ed4-4735-8fa8-42cdd10b8a9c-utilities\") pod \"redhat-operators-2n25g\" (UID: \"c9dc5be9-7ed4-4735-8fa8-42cdd10b8a9c\") " pod="openshift-marketplace/redhat-operators-2n25g" Jan 20 19:14:21 crc kubenswrapper[4773]: I0120 19:14:21.711078 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9dc5be9-7ed4-4735-8fa8-42cdd10b8a9c-catalog-content\") pod \"redhat-operators-2n25g\" (UID: \"c9dc5be9-7ed4-4735-8fa8-42cdd10b8a9c\") " pod="openshift-marketplace/redhat-operators-2n25g" Jan 20 19:14:21 crc kubenswrapper[4773]: I0120 19:14:21.711446 4773 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-qfst6\" (UniqueName: \"kubernetes.io/projected/c9dc5be9-7ed4-4735-8fa8-42cdd10b8a9c-kube-api-access-qfst6\") pod \"redhat-operators-2n25g\" (UID: \"c9dc5be9-7ed4-4735-8fa8-42cdd10b8a9c\") " pod="openshift-marketplace/redhat-operators-2n25g" Jan 20 19:14:21 crc kubenswrapper[4773]: I0120 19:14:21.711468 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9dc5be9-7ed4-4735-8fa8-42cdd10b8a9c-utilities\") pod \"redhat-operators-2n25g\" (UID: \"c9dc5be9-7ed4-4735-8fa8-42cdd10b8a9c\") " pod="openshift-marketplace/redhat-operators-2n25g" Jan 20 19:14:21 crc kubenswrapper[4773]: I0120 19:14:21.711763 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9dc5be9-7ed4-4735-8fa8-42cdd10b8a9c-catalog-content\") pod \"redhat-operators-2n25g\" (UID: \"c9dc5be9-7ed4-4735-8fa8-42cdd10b8a9c\") " pod="openshift-marketplace/redhat-operators-2n25g" Jan 20 19:14:21 crc kubenswrapper[4773]: I0120 19:14:21.712010 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9dc5be9-7ed4-4735-8fa8-42cdd10b8a9c-utilities\") pod \"redhat-operators-2n25g\" (UID: \"c9dc5be9-7ed4-4735-8fa8-42cdd10b8a9c\") " pod="openshift-marketplace/redhat-operators-2n25g" Jan 20 19:14:21 crc kubenswrapper[4773]: I0120 19:14:21.748408 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfst6\" (UniqueName: \"kubernetes.io/projected/c9dc5be9-7ed4-4735-8fa8-42cdd10b8a9c-kube-api-access-qfst6\") pod \"redhat-operators-2n25g\" (UID: \"c9dc5be9-7ed4-4735-8fa8-42cdd10b8a9c\") " pod="openshift-marketplace/redhat-operators-2n25g" Jan 20 19:14:21 crc kubenswrapper[4773]: I0120 19:14:21.748910 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2n25g" Jan 20 19:14:22 crc kubenswrapper[4773]: I0120 19:14:22.195479 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-2n25g"] Jan 20 19:14:22 crc kubenswrapper[4773]: I0120 19:14:22.440784 4773 generic.go:334] "Generic (PLEG): container finished" podID="c9dc5be9-7ed4-4735-8fa8-42cdd10b8a9c" containerID="a531a68f8a47f521071858f8fdf48b48feb17e1093d2e3d590d7bfe22435deb4" exitCode=0 Jan 20 19:14:22 crc kubenswrapper[4773]: I0120 19:14:22.440835 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2n25g" event={"ID":"c9dc5be9-7ed4-4735-8fa8-42cdd10b8a9c","Type":"ContainerDied","Data":"a531a68f8a47f521071858f8fdf48b48feb17e1093d2e3d590d7bfe22435deb4"} Jan 20 19:14:22 crc kubenswrapper[4773]: I0120 19:14:22.440904 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2n25g" event={"ID":"c9dc5be9-7ed4-4735-8fa8-42cdd10b8a9c","Type":"ContainerStarted","Data":"fbf9aeb5648d8d46e58a5c7ca6f6106654eda90c176d1321e2b4db729aa24b85"} Jan 20 19:14:22 crc kubenswrapper[4773]: E0120 19:14:22.501577 4773 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9dc5be9_7ed4_4735_8fa8_42cdd10b8a9c.slice/crio-conmon-a531a68f8a47f521071858f8fdf48b48feb17e1093d2e3d590d7bfe22435deb4.scope\": RecentStats: unable to find data in memory cache]" Jan 20 19:14:24 crc kubenswrapper[4773]: I0120 19:14:24.468731 4773 generic.go:334] "Generic (PLEG): container finished" podID="c9dc5be9-7ed4-4735-8fa8-42cdd10b8a9c" containerID="1b51815bed658450c0ab9229613f8d80ebce730e02b6a3417d5216ee6bd4c737" exitCode=0 Jan 20 19:14:24 crc kubenswrapper[4773]: I0120 19:14:24.469205 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2n25g" 
event={"ID":"c9dc5be9-7ed4-4735-8fa8-42cdd10b8a9c","Type":"ContainerDied","Data":"1b51815bed658450c0ab9229613f8d80ebce730e02b6a3417d5216ee6bd4c737"} Jan 20 19:14:25 crc kubenswrapper[4773]: I0120 19:14:25.480594 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2n25g" event={"ID":"c9dc5be9-7ed4-4735-8fa8-42cdd10b8a9c","Type":"ContainerStarted","Data":"a22af1cb0af98d909c4b966d2760296add1a543f238adaa78810d9b7b55b1b7b"} Jan 20 19:14:25 crc kubenswrapper[4773]: I0120 19:14:25.498342 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-2n25g" podStartSLOduration=1.94270562 podStartE2EDuration="4.498320873s" podCreationTimestamp="2026-01-20 19:14:21 +0000 UTC" firstStartedPulling="2026-01-20 19:14:22.44319116 +0000 UTC m=+2655.365004184" lastFinishedPulling="2026-01-20 19:14:24.998806413 +0000 UTC m=+2657.920619437" observedRunningTime="2026-01-20 19:14:25.496354486 +0000 UTC m=+2658.418167510" watchObservedRunningTime="2026-01-20 19:14:25.498320873 +0000 UTC m=+2658.420133897" Jan 20 19:14:31 crc kubenswrapper[4773]: I0120 19:14:31.749477 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-2n25g" Jan 20 19:14:31 crc kubenswrapper[4773]: I0120 19:14:31.749811 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-2n25g" Jan 20 19:14:31 crc kubenswrapper[4773]: I0120 19:14:31.796429 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-2n25g" Jan 20 19:14:32 crc kubenswrapper[4773]: I0120 19:14:32.578686 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-2n25g" Jan 20 19:14:32 crc kubenswrapper[4773]: I0120 19:14:32.864777 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/redhat-operators-2n25g"] Jan 20 19:14:34 crc kubenswrapper[4773]: I0120 19:14:34.560227 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-2n25g" podUID="c9dc5be9-7ed4-4735-8fa8-42cdd10b8a9c" containerName="registry-server" containerID="cri-o://a22af1cb0af98d909c4b966d2760296add1a543f238adaa78810d9b7b55b1b7b" gracePeriod=2 Jan 20 19:14:37 crc kubenswrapper[4773]: I0120 19:14:37.588384 4773 generic.go:334] "Generic (PLEG): container finished" podID="c9dc5be9-7ed4-4735-8fa8-42cdd10b8a9c" containerID="a22af1cb0af98d909c4b966d2760296add1a543f238adaa78810d9b7b55b1b7b" exitCode=0 Jan 20 19:14:37 crc kubenswrapper[4773]: I0120 19:14:37.588451 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2n25g" event={"ID":"c9dc5be9-7ed4-4735-8fa8-42cdd10b8a9c","Type":"ContainerDied","Data":"a22af1cb0af98d909c4b966d2760296add1a543f238adaa78810d9b7b55b1b7b"} Jan 20 19:14:37 crc kubenswrapper[4773]: I0120 19:14:37.676684 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2n25g" Jan 20 19:14:37 crc kubenswrapper[4773]: I0120 19:14:37.813051 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9dc5be9-7ed4-4735-8fa8-42cdd10b8a9c-utilities\") pod \"c9dc5be9-7ed4-4735-8fa8-42cdd10b8a9c\" (UID: \"c9dc5be9-7ed4-4735-8fa8-42cdd10b8a9c\") " Jan 20 19:14:37 crc kubenswrapper[4773]: I0120 19:14:37.814224 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9dc5be9-7ed4-4735-8fa8-42cdd10b8a9c-catalog-content\") pod \"c9dc5be9-7ed4-4735-8fa8-42cdd10b8a9c\" (UID: \"c9dc5be9-7ed4-4735-8fa8-42cdd10b8a9c\") " Jan 20 19:14:37 crc kubenswrapper[4773]: I0120 19:14:37.814281 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qfst6\" (UniqueName: \"kubernetes.io/projected/c9dc5be9-7ed4-4735-8fa8-42cdd10b8a9c-kube-api-access-qfst6\") pod \"c9dc5be9-7ed4-4735-8fa8-42cdd10b8a9c\" (UID: \"c9dc5be9-7ed4-4735-8fa8-42cdd10b8a9c\") " Jan 20 19:14:37 crc kubenswrapper[4773]: I0120 19:14:37.814859 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9dc5be9-7ed4-4735-8fa8-42cdd10b8a9c-utilities" (OuterVolumeSpecName: "utilities") pod "c9dc5be9-7ed4-4735-8fa8-42cdd10b8a9c" (UID: "c9dc5be9-7ed4-4735-8fa8-42cdd10b8a9c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 19:14:37 crc kubenswrapper[4773]: I0120 19:14:37.832136 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c9dc5be9-7ed4-4735-8fa8-42cdd10b8a9c-kube-api-access-qfst6" (OuterVolumeSpecName: "kube-api-access-qfst6") pod "c9dc5be9-7ed4-4735-8fa8-42cdd10b8a9c" (UID: "c9dc5be9-7ed4-4735-8fa8-42cdd10b8a9c"). InnerVolumeSpecName "kube-api-access-qfst6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 19:14:37 crc kubenswrapper[4773]: I0120 19:14:37.916623 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qfst6\" (UniqueName: \"kubernetes.io/projected/c9dc5be9-7ed4-4735-8fa8-42cdd10b8a9c-kube-api-access-qfst6\") on node \"crc\" DevicePath \"\"" Jan 20 19:14:37 crc kubenswrapper[4773]: I0120 19:14:37.916956 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c9dc5be9-7ed4-4735-8fa8-42cdd10b8a9c-utilities\") on node \"crc\" DevicePath \"\"" Jan 20 19:14:37 crc kubenswrapper[4773]: I0120 19:14:37.937388 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c9dc5be9-7ed4-4735-8fa8-42cdd10b8a9c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c9dc5be9-7ed4-4735-8fa8-42cdd10b8a9c" (UID: "c9dc5be9-7ed4-4735-8fa8-42cdd10b8a9c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 19:14:38 crc kubenswrapper[4773]: I0120 19:14:38.018196 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c9dc5be9-7ed4-4735-8fa8-42cdd10b8a9c-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 20 19:14:38 crc kubenswrapper[4773]: I0120 19:14:38.598252 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-2n25g" event={"ID":"c9dc5be9-7ed4-4735-8fa8-42cdd10b8a9c","Type":"ContainerDied","Data":"fbf9aeb5648d8d46e58a5c7ca6f6106654eda90c176d1321e2b4db729aa24b85"} Jan 20 19:14:38 crc kubenswrapper[4773]: I0120 19:14:38.598296 4773 scope.go:117] "RemoveContainer" containerID="a22af1cb0af98d909c4b966d2760296add1a543f238adaa78810d9b7b55b1b7b" Jan 20 19:14:38 crc kubenswrapper[4773]: I0120 19:14:38.598400 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-2n25g" Jan 20 19:14:38 crc kubenswrapper[4773]: I0120 19:14:38.635001 4773 scope.go:117] "RemoveContainer" containerID="1b51815bed658450c0ab9229613f8d80ebce730e02b6a3417d5216ee6bd4c737" Jan 20 19:14:38 crc kubenswrapper[4773]: I0120 19:14:38.639400 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-2n25g"] Jan 20 19:14:38 crc kubenswrapper[4773]: I0120 19:14:38.648143 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-2n25g"] Jan 20 19:14:38 crc kubenswrapper[4773]: I0120 19:14:38.658342 4773 scope.go:117] "RemoveContainer" containerID="a531a68f8a47f521071858f8fdf48b48feb17e1093d2e3d590d7bfe22435deb4" Jan 20 19:14:39 crc kubenswrapper[4773]: I0120 19:14:39.457346 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c9dc5be9-7ed4-4735-8fa8-42cdd10b8a9c" path="/var/lib/kubelet/pods/c9dc5be9-7ed4-4735-8fa8-42cdd10b8a9c/volumes" Jan 20 19:14:51 crc kubenswrapper[4773]: I0120 19:14:51.306847 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-jplxv"] Jan 20 19:14:51 crc kubenswrapper[4773]: E0120 19:14:51.310824 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9dc5be9-7ed4-4735-8fa8-42cdd10b8a9c" containerName="registry-server" Jan 20 19:14:51 crc kubenswrapper[4773]: I0120 19:14:51.310862 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9dc5be9-7ed4-4735-8fa8-42cdd10b8a9c" containerName="registry-server" Jan 20 19:14:51 crc kubenswrapper[4773]: E0120 19:14:51.310888 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9dc5be9-7ed4-4735-8fa8-42cdd10b8a9c" containerName="extract-content" Jan 20 19:14:51 crc kubenswrapper[4773]: I0120 19:14:51.310896 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9dc5be9-7ed4-4735-8fa8-42cdd10b8a9c" containerName="extract-content" Jan 20 
19:14:51 crc kubenswrapper[4773]: E0120 19:14:51.310924 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c9dc5be9-7ed4-4735-8fa8-42cdd10b8a9c" containerName="extract-utilities" Jan 20 19:14:51 crc kubenswrapper[4773]: I0120 19:14:51.310950 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="c9dc5be9-7ed4-4735-8fa8-42cdd10b8a9c" containerName="extract-utilities" Jan 20 19:14:51 crc kubenswrapper[4773]: I0120 19:14:51.311142 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="c9dc5be9-7ed4-4735-8fa8-42cdd10b8a9c" containerName="registry-server" Jan 20 19:14:51 crc kubenswrapper[4773]: I0120 19:14:51.312683 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jplxv" Jan 20 19:14:51 crc kubenswrapper[4773]: I0120 19:14:51.316994 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jplxv"] Jan 20 19:14:51 crc kubenswrapper[4773]: I0120 19:14:51.453840 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzhjd\" (UniqueName: \"kubernetes.io/projected/47e6cea6-90a9-46c1-8c9f-c36182604be7-kube-api-access-tzhjd\") pod \"redhat-marketplace-jplxv\" (UID: \"47e6cea6-90a9-46c1-8c9f-c36182604be7\") " pod="openshift-marketplace/redhat-marketplace-jplxv" Jan 20 19:14:51 crc kubenswrapper[4773]: I0120 19:14:51.453978 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47e6cea6-90a9-46c1-8c9f-c36182604be7-catalog-content\") pod \"redhat-marketplace-jplxv\" (UID: \"47e6cea6-90a9-46c1-8c9f-c36182604be7\") " pod="openshift-marketplace/redhat-marketplace-jplxv" Jan 20 19:14:51 crc kubenswrapper[4773]: I0120 19:14:51.454037 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/47e6cea6-90a9-46c1-8c9f-c36182604be7-utilities\") pod \"redhat-marketplace-jplxv\" (UID: \"47e6cea6-90a9-46c1-8c9f-c36182604be7\") " pod="openshift-marketplace/redhat-marketplace-jplxv" Jan 20 19:14:51 crc kubenswrapper[4773]: I0120 19:14:51.556053 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzhjd\" (UniqueName: \"kubernetes.io/projected/47e6cea6-90a9-46c1-8c9f-c36182604be7-kube-api-access-tzhjd\") pod \"redhat-marketplace-jplxv\" (UID: \"47e6cea6-90a9-46c1-8c9f-c36182604be7\") " pod="openshift-marketplace/redhat-marketplace-jplxv" Jan 20 19:14:51 crc kubenswrapper[4773]: I0120 19:14:51.556585 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47e6cea6-90a9-46c1-8c9f-c36182604be7-catalog-content\") pod \"redhat-marketplace-jplxv\" (UID: \"47e6cea6-90a9-46c1-8c9f-c36182604be7\") " pod="openshift-marketplace/redhat-marketplace-jplxv" Jan 20 19:14:51 crc kubenswrapper[4773]: I0120 19:14:51.557025 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47e6cea6-90a9-46c1-8c9f-c36182604be7-catalog-content\") pod \"redhat-marketplace-jplxv\" (UID: \"47e6cea6-90a9-46c1-8c9f-c36182604be7\") " pod="openshift-marketplace/redhat-marketplace-jplxv" Jan 20 19:14:51 crc kubenswrapper[4773]: I0120 19:14:51.557227 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47e6cea6-90a9-46c1-8c9f-c36182604be7-utilities\") pod \"redhat-marketplace-jplxv\" (UID: \"47e6cea6-90a9-46c1-8c9f-c36182604be7\") " pod="openshift-marketplace/redhat-marketplace-jplxv" Jan 20 19:14:51 crc kubenswrapper[4773]: I0120 19:14:51.557686 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/47e6cea6-90a9-46c1-8c9f-c36182604be7-utilities\") pod \"redhat-marketplace-jplxv\" (UID: \"47e6cea6-90a9-46c1-8c9f-c36182604be7\") " pod="openshift-marketplace/redhat-marketplace-jplxv" Jan 20 19:14:51 crc kubenswrapper[4773]: I0120 19:14:51.582812 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzhjd\" (UniqueName: \"kubernetes.io/projected/47e6cea6-90a9-46c1-8c9f-c36182604be7-kube-api-access-tzhjd\") pod \"redhat-marketplace-jplxv\" (UID: \"47e6cea6-90a9-46c1-8c9f-c36182604be7\") " pod="openshift-marketplace/redhat-marketplace-jplxv" Jan 20 19:14:51 crc kubenswrapper[4773]: I0120 19:14:51.636540 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jplxv" Jan 20 19:14:52 crc kubenswrapper[4773]: I0120 19:14:52.157181 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-jplxv"] Jan 20 19:14:52 crc kubenswrapper[4773]: I0120 19:14:52.719681 4773 generic.go:334] "Generic (PLEG): container finished" podID="47e6cea6-90a9-46c1-8c9f-c36182604be7" containerID="895f44c68fbe6703af24225bd96c75b9051c2fbd1f5926cb4944569934501bc2" exitCode=0 Jan 20 19:14:52 crc kubenswrapper[4773]: I0120 19:14:52.719728 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jplxv" event={"ID":"47e6cea6-90a9-46c1-8c9f-c36182604be7","Type":"ContainerDied","Data":"895f44c68fbe6703af24225bd96c75b9051c2fbd1f5926cb4944569934501bc2"} Jan 20 19:14:52 crc kubenswrapper[4773]: I0120 19:14:52.719761 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jplxv" event={"ID":"47e6cea6-90a9-46c1-8c9f-c36182604be7","Type":"ContainerStarted","Data":"3a9c1c43a90ff7044ed955a46a9a095e67bfd81e8256db923dda0ab6848b0fbf"} Jan 20 19:14:52 crc kubenswrapper[4773]: I0120 19:14:52.721763 4773 provider.go:102] Refreshing cache for provider: 
*credentialprovider.defaultDockerConfigProvider Jan 20 19:14:53 crc kubenswrapper[4773]: I0120 19:14:53.728840 4773 generic.go:334] "Generic (PLEG): container finished" podID="47e6cea6-90a9-46c1-8c9f-c36182604be7" containerID="d6124561dd0418da52d101d6dec3f2cf08b97111c25f2a6ba6ae2ea1e5c4ebdf" exitCode=0 Jan 20 19:14:53 crc kubenswrapper[4773]: I0120 19:14:53.729156 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jplxv" event={"ID":"47e6cea6-90a9-46c1-8c9f-c36182604be7","Type":"ContainerDied","Data":"d6124561dd0418da52d101d6dec3f2cf08b97111c25f2a6ba6ae2ea1e5c4ebdf"} Jan 20 19:14:54 crc kubenswrapper[4773]: I0120 19:14:54.738633 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jplxv" event={"ID":"47e6cea6-90a9-46c1-8c9f-c36182604be7","Type":"ContainerStarted","Data":"b6e15baf89e897a98fa9391491e4c268a7261db879d4916166c01801dad52160"} Jan 20 19:14:54 crc kubenswrapper[4773]: I0120 19:14:54.768225 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-jplxv" podStartSLOduration=2.297275949 podStartE2EDuration="3.768205395s" podCreationTimestamp="2026-01-20 19:14:51 +0000 UTC" firstStartedPulling="2026-01-20 19:14:52.721534647 +0000 UTC m=+2685.643347671" lastFinishedPulling="2026-01-20 19:14:54.192464083 +0000 UTC m=+2687.114277117" observedRunningTime="2026-01-20 19:14:54.765301305 +0000 UTC m=+2687.687114329" watchObservedRunningTime="2026-01-20 19:14:54.768205395 +0000 UTC m=+2687.690018419" Jan 20 19:14:58 crc kubenswrapper[4773]: I0120 19:14:58.170426 4773 patch_prober.go:28] interesting pod/machine-config-daemon-sq4x7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 19:14:58 crc kubenswrapper[4773]: I0120 19:14:58.170982 
4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 19:15:00 crc kubenswrapper[4773]: I0120 19:15:00.159669 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29482275-2vq4k"] Jan 20 19:15:00 crc kubenswrapper[4773]: I0120 19:15:00.165016 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29482275-2vq4k" Jan 20 19:15:00 crc kubenswrapper[4773]: I0120 19:15:00.168167 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 20 19:15:00 crc kubenswrapper[4773]: I0120 19:15:00.168231 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 20 19:15:00 crc kubenswrapper[4773]: I0120 19:15:00.174708 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29482275-2vq4k"] Jan 20 19:15:00 crc kubenswrapper[4773]: I0120 19:15:00.232736 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f358c83b-d3f1-4f95-a02b-beecd73c1adb-config-volume\") pod \"collect-profiles-29482275-2vq4k\" (UID: \"f358c83b-d3f1-4f95-a02b-beecd73c1adb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482275-2vq4k" Jan 20 19:15:00 crc kubenswrapper[4773]: I0120 19:15:00.232798 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/f358c83b-d3f1-4f95-a02b-beecd73c1adb-secret-volume\") pod \"collect-profiles-29482275-2vq4k\" (UID: \"f358c83b-d3f1-4f95-a02b-beecd73c1adb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482275-2vq4k" Jan 20 19:15:00 crc kubenswrapper[4773]: I0120 19:15:00.232921 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mj25x\" (UniqueName: \"kubernetes.io/projected/f358c83b-d3f1-4f95-a02b-beecd73c1adb-kube-api-access-mj25x\") pod \"collect-profiles-29482275-2vq4k\" (UID: \"f358c83b-d3f1-4f95-a02b-beecd73c1adb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482275-2vq4k" Jan 20 19:15:00 crc kubenswrapper[4773]: I0120 19:15:00.339835 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mj25x\" (UniqueName: \"kubernetes.io/projected/f358c83b-d3f1-4f95-a02b-beecd73c1adb-kube-api-access-mj25x\") pod \"collect-profiles-29482275-2vq4k\" (UID: \"f358c83b-d3f1-4f95-a02b-beecd73c1adb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482275-2vq4k" Jan 20 19:15:00 crc kubenswrapper[4773]: I0120 19:15:00.340226 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f358c83b-d3f1-4f95-a02b-beecd73c1adb-config-volume\") pod \"collect-profiles-29482275-2vq4k\" (UID: \"f358c83b-d3f1-4f95-a02b-beecd73c1adb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482275-2vq4k" Jan 20 19:15:00 crc kubenswrapper[4773]: I0120 19:15:00.340328 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f358c83b-d3f1-4f95-a02b-beecd73c1adb-secret-volume\") pod \"collect-profiles-29482275-2vq4k\" (UID: \"f358c83b-d3f1-4f95-a02b-beecd73c1adb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482275-2vq4k" Jan 20 19:15:00 crc 
kubenswrapper[4773]: I0120 19:15:00.341741 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f358c83b-d3f1-4f95-a02b-beecd73c1adb-config-volume\") pod \"collect-profiles-29482275-2vq4k\" (UID: \"f358c83b-d3f1-4f95-a02b-beecd73c1adb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482275-2vq4k" Jan 20 19:15:00 crc kubenswrapper[4773]: I0120 19:15:00.347716 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f358c83b-d3f1-4f95-a02b-beecd73c1adb-secret-volume\") pod \"collect-profiles-29482275-2vq4k\" (UID: \"f358c83b-d3f1-4f95-a02b-beecd73c1adb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482275-2vq4k" Jan 20 19:15:00 crc kubenswrapper[4773]: I0120 19:15:00.357679 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mj25x\" (UniqueName: \"kubernetes.io/projected/f358c83b-d3f1-4f95-a02b-beecd73c1adb-kube-api-access-mj25x\") pod \"collect-profiles-29482275-2vq4k\" (UID: \"f358c83b-d3f1-4f95-a02b-beecd73c1adb\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482275-2vq4k" Jan 20 19:15:00 crc kubenswrapper[4773]: I0120 19:15:00.489823 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29482275-2vq4k" Jan 20 19:15:00 crc kubenswrapper[4773]: I0120 19:15:00.918876 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29482275-2vq4k"] Jan 20 19:15:01 crc kubenswrapper[4773]: I0120 19:15:01.636744 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-jplxv" Jan 20 19:15:01 crc kubenswrapper[4773]: I0120 19:15:01.637040 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-jplxv" Jan 20 19:15:01 crc kubenswrapper[4773]: I0120 19:15:01.681420 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-jplxv" Jan 20 19:15:01 crc kubenswrapper[4773]: I0120 19:15:01.801273 4773 generic.go:334] "Generic (PLEG): container finished" podID="f358c83b-d3f1-4f95-a02b-beecd73c1adb" containerID="d2f5f6165c5bd6d2e78266b658c2c0568526cc830bd6d23c38285c251c623a30" exitCode=0 Jan 20 19:15:01 crc kubenswrapper[4773]: I0120 19:15:01.801404 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29482275-2vq4k" event={"ID":"f358c83b-d3f1-4f95-a02b-beecd73c1adb","Type":"ContainerDied","Data":"d2f5f6165c5bd6d2e78266b658c2c0568526cc830bd6d23c38285c251c623a30"} Jan 20 19:15:01 crc kubenswrapper[4773]: I0120 19:15:01.801456 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29482275-2vq4k" event={"ID":"f358c83b-d3f1-4f95-a02b-beecd73c1adb","Type":"ContainerStarted","Data":"e620b8d1f65e369a0b234717864d59f499d718603fcbce9a525bc94a325b2370"} Jan 20 19:15:01 crc kubenswrapper[4773]: I0120 19:15:01.857400 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-jplxv" Jan 
20 19:15:01 crc kubenswrapper[4773]: I0120 19:15:01.913874 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jplxv"] Jan 20 19:15:03 crc kubenswrapper[4773]: I0120 19:15:03.112969 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29482275-2vq4k" Jan 20 19:15:03 crc kubenswrapper[4773]: I0120 19:15:03.195875 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f358c83b-d3f1-4f95-a02b-beecd73c1adb-secret-volume\") pod \"f358c83b-d3f1-4f95-a02b-beecd73c1adb\" (UID: \"f358c83b-d3f1-4f95-a02b-beecd73c1adb\") " Jan 20 19:15:03 crc kubenswrapper[4773]: I0120 19:15:03.196048 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mj25x\" (UniqueName: \"kubernetes.io/projected/f358c83b-d3f1-4f95-a02b-beecd73c1adb-kube-api-access-mj25x\") pod \"f358c83b-d3f1-4f95-a02b-beecd73c1adb\" (UID: \"f358c83b-d3f1-4f95-a02b-beecd73c1adb\") " Jan 20 19:15:03 crc kubenswrapper[4773]: I0120 19:15:03.196202 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f358c83b-d3f1-4f95-a02b-beecd73c1adb-config-volume\") pod \"f358c83b-d3f1-4f95-a02b-beecd73c1adb\" (UID: \"f358c83b-d3f1-4f95-a02b-beecd73c1adb\") " Jan 20 19:15:03 crc kubenswrapper[4773]: I0120 19:15:03.196796 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f358c83b-d3f1-4f95-a02b-beecd73c1adb-config-volume" (OuterVolumeSpecName: "config-volume") pod "f358c83b-d3f1-4f95-a02b-beecd73c1adb" (UID: "f358c83b-d3f1-4f95-a02b-beecd73c1adb"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 19:15:03 crc kubenswrapper[4773]: I0120 19:15:03.201503 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f358c83b-d3f1-4f95-a02b-beecd73c1adb-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "f358c83b-d3f1-4f95-a02b-beecd73c1adb" (UID: "f358c83b-d3f1-4f95-a02b-beecd73c1adb"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:15:03 crc kubenswrapper[4773]: I0120 19:15:03.202000 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f358c83b-d3f1-4f95-a02b-beecd73c1adb-kube-api-access-mj25x" (OuterVolumeSpecName: "kube-api-access-mj25x") pod "f358c83b-d3f1-4f95-a02b-beecd73c1adb" (UID: "f358c83b-d3f1-4f95-a02b-beecd73c1adb"). InnerVolumeSpecName "kube-api-access-mj25x". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 19:15:03 crc kubenswrapper[4773]: I0120 19:15:03.298265 4773 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f358c83b-d3f1-4f95-a02b-beecd73c1adb-config-volume\") on node \"crc\" DevicePath \"\"" Jan 20 19:15:03 crc kubenswrapper[4773]: I0120 19:15:03.298304 4773 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/f358c83b-d3f1-4f95-a02b-beecd73c1adb-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 20 19:15:03 crc kubenswrapper[4773]: I0120 19:15:03.298316 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mj25x\" (UniqueName: \"kubernetes.io/projected/f358c83b-d3f1-4f95-a02b-beecd73c1adb-kube-api-access-mj25x\") on node \"crc\" DevicePath \"\"" Jan 20 19:15:03 crc kubenswrapper[4773]: I0120 19:15:03.817598 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29482275-2vq4k" 
event={"ID":"f358c83b-d3f1-4f95-a02b-beecd73c1adb","Type":"ContainerDied","Data":"e620b8d1f65e369a0b234717864d59f499d718603fcbce9a525bc94a325b2370"} Jan 20 19:15:03 crc kubenswrapper[4773]: I0120 19:15:03.817964 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e620b8d1f65e369a0b234717864d59f499d718603fcbce9a525bc94a325b2370" Jan 20 19:15:03 crc kubenswrapper[4773]: I0120 19:15:03.817613 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29482275-2vq4k" Jan 20 19:15:03 crc kubenswrapper[4773]: I0120 19:15:03.817739 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-jplxv" podUID="47e6cea6-90a9-46c1-8c9f-c36182604be7" containerName="registry-server" containerID="cri-o://b6e15baf89e897a98fa9391491e4c268a7261db879d4916166c01801dad52160" gracePeriod=2 Jan 20 19:15:04 crc kubenswrapper[4773]: I0120 19:15:04.188852 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29482230-pqppn"] Jan 20 19:15:04 crc kubenswrapper[4773]: I0120 19:15:04.196539 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29482230-pqppn"] Jan 20 19:15:04 crc kubenswrapper[4773]: I0120 19:15:04.742365 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jplxv" Jan 20 19:15:04 crc kubenswrapper[4773]: I0120 19:15:04.825139 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47e6cea6-90a9-46c1-8c9f-c36182604be7-catalog-content\") pod \"47e6cea6-90a9-46c1-8c9f-c36182604be7\" (UID: \"47e6cea6-90a9-46c1-8c9f-c36182604be7\") " Jan 20 19:15:04 crc kubenswrapper[4773]: I0120 19:15:04.825188 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tzhjd\" (UniqueName: \"kubernetes.io/projected/47e6cea6-90a9-46c1-8c9f-c36182604be7-kube-api-access-tzhjd\") pod \"47e6cea6-90a9-46c1-8c9f-c36182604be7\" (UID: \"47e6cea6-90a9-46c1-8c9f-c36182604be7\") " Jan 20 19:15:04 crc kubenswrapper[4773]: I0120 19:15:04.825243 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47e6cea6-90a9-46c1-8c9f-c36182604be7-utilities\") pod \"47e6cea6-90a9-46c1-8c9f-c36182604be7\" (UID: \"47e6cea6-90a9-46c1-8c9f-c36182604be7\") " Jan 20 19:15:04 crc kubenswrapper[4773]: I0120 19:15:04.826376 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47e6cea6-90a9-46c1-8c9f-c36182604be7-utilities" (OuterVolumeSpecName: "utilities") pod "47e6cea6-90a9-46c1-8c9f-c36182604be7" (UID: "47e6cea6-90a9-46c1-8c9f-c36182604be7"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 19:15:04 crc kubenswrapper[4773]: I0120 19:15:04.827693 4773 generic.go:334] "Generic (PLEG): container finished" podID="47e6cea6-90a9-46c1-8c9f-c36182604be7" containerID="b6e15baf89e897a98fa9391491e4c268a7261db879d4916166c01801dad52160" exitCode=0 Jan 20 19:15:04 crc kubenswrapper[4773]: I0120 19:15:04.827732 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jplxv" event={"ID":"47e6cea6-90a9-46c1-8c9f-c36182604be7","Type":"ContainerDied","Data":"b6e15baf89e897a98fa9391491e4c268a7261db879d4916166c01801dad52160"} Jan 20 19:15:04 crc kubenswrapper[4773]: I0120 19:15:04.827757 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-jplxv" event={"ID":"47e6cea6-90a9-46c1-8c9f-c36182604be7","Type":"ContainerDied","Data":"3a9c1c43a90ff7044ed955a46a9a095e67bfd81e8256db923dda0ab6848b0fbf"} Jan 20 19:15:04 crc kubenswrapper[4773]: I0120 19:15:04.827756 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-jplxv" Jan 20 19:15:04 crc kubenswrapper[4773]: I0120 19:15:04.827773 4773 scope.go:117] "RemoveContainer" containerID="b6e15baf89e897a98fa9391491e4c268a7261db879d4916166c01801dad52160" Jan 20 19:15:04 crc kubenswrapper[4773]: I0120 19:15:04.832324 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/47e6cea6-90a9-46c1-8c9f-c36182604be7-kube-api-access-tzhjd" (OuterVolumeSpecName: "kube-api-access-tzhjd") pod "47e6cea6-90a9-46c1-8c9f-c36182604be7" (UID: "47e6cea6-90a9-46c1-8c9f-c36182604be7"). InnerVolumeSpecName "kube-api-access-tzhjd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 19:15:04 crc kubenswrapper[4773]: I0120 19:15:04.848453 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/47e6cea6-90a9-46c1-8c9f-c36182604be7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "47e6cea6-90a9-46c1-8c9f-c36182604be7" (UID: "47e6cea6-90a9-46c1-8c9f-c36182604be7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 19:15:04 crc kubenswrapper[4773]: I0120 19:15:04.882384 4773 scope.go:117] "RemoveContainer" containerID="d6124561dd0418da52d101d6dec3f2cf08b97111c25f2a6ba6ae2ea1e5c4ebdf" Jan 20 19:15:04 crc kubenswrapper[4773]: I0120 19:15:04.902300 4773 scope.go:117] "RemoveContainer" containerID="895f44c68fbe6703af24225bd96c75b9051c2fbd1f5926cb4944569934501bc2" Jan 20 19:15:04 crc kubenswrapper[4773]: I0120 19:15:04.927901 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/47e6cea6-90a9-46c1-8c9f-c36182604be7-utilities\") on node \"crc\" DevicePath \"\"" Jan 20 19:15:04 crc kubenswrapper[4773]: I0120 19:15:04.927976 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/47e6cea6-90a9-46c1-8c9f-c36182604be7-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 20 19:15:04 crc kubenswrapper[4773]: I0120 19:15:04.927992 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tzhjd\" (UniqueName: \"kubernetes.io/projected/47e6cea6-90a9-46c1-8c9f-c36182604be7-kube-api-access-tzhjd\") on node \"crc\" DevicePath \"\"" Jan 20 19:15:04 crc kubenswrapper[4773]: I0120 19:15:04.949016 4773 scope.go:117] "RemoveContainer" containerID="b6e15baf89e897a98fa9391491e4c268a7261db879d4916166c01801dad52160" Jan 20 19:15:04 crc kubenswrapper[4773]: E0120 19:15:04.949541 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"b6e15baf89e897a98fa9391491e4c268a7261db879d4916166c01801dad52160\": container with ID starting with b6e15baf89e897a98fa9391491e4c268a7261db879d4916166c01801dad52160 not found: ID does not exist" containerID="b6e15baf89e897a98fa9391491e4c268a7261db879d4916166c01801dad52160" Jan 20 19:15:04 crc kubenswrapper[4773]: I0120 19:15:04.949635 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6e15baf89e897a98fa9391491e4c268a7261db879d4916166c01801dad52160"} err="failed to get container status \"b6e15baf89e897a98fa9391491e4c268a7261db879d4916166c01801dad52160\": rpc error: code = NotFound desc = could not find container \"b6e15baf89e897a98fa9391491e4c268a7261db879d4916166c01801dad52160\": container with ID starting with b6e15baf89e897a98fa9391491e4c268a7261db879d4916166c01801dad52160 not found: ID does not exist" Jan 20 19:15:04 crc kubenswrapper[4773]: I0120 19:15:04.949677 4773 scope.go:117] "RemoveContainer" containerID="d6124561dd0418da52d101d6dec3f2cf08b97111c25f2a6ba6ae2ea1e5c4ebdf" Jan 20 19:15:04 crc kubenswrapper[4773]: E0120 19:15:04.950272 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d6124561dd0418da52d101d6dec3f2cf08b97111c25f2a6ba6ae2ea1e5c4ebdf\": container with ID starting with d6124561dd0418da52d101d6dec3f2cf08b97111c25f2a6ba6ae2ea1e5c4ebdf not found: ID does not exist" containerID="d6124561dd0418da52d101d6dec3f2cf08b97111c25f2a6ba6ae2ea1e5c4ebdf" Jan 20 19:15:04 crc kubenswrapper[4773]: I0120 19:15:04.950316 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d6124561dd0418da52d101d6dec3f2cf08b97111c25f2a6ba6ae2ea1e5c4ebdf"} err="failed to get container status \"d6124561dd0418da52d101d6dec3f2cf08b97111c25f2a6ba6ae2ea1e5c4ebdf\": rpc error: code = NotFound desc = could not find container 
\"d6124561dd0418da52d101d6dec3f2cf08b97111c25f2a6ba6ae2ea1e5c4ebdf\": container with ID starting with d6124561dd0418da52d101d6dec3f2cf08b97111c25f2a6ba6ae2ea1e5c4ebdf not found: ID does not exist" Jan 20 19:15:04 crc kubenswrapper[4773]: I0120 19:15:04.950345 4773 scope.go:117] "RemoveContainer" containerID="895f44c68fbe6703af24225bd96c75b9051c2fbd1f5926cb4944569934501bc2" Jan 20 19:15:04 crc kubenswrapper[4773]: E0120 19:15:04.950904 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"895f44c68fbe6703af24225bd96c75b9051c2fbd1f5926cb4944569934501bc2\": container with ID starting with 895f44c68fbe6703af24225bd96c75b9051c2fbd1f5926cb4944569934501bc2 not found: ID does not exist" containerID="895f44c68fbe6703af24225bd96c75b9051c2fbd1f5926cb4944569934501bc2" Jan 20 19:15:04 crc kubenswrapper[4773]: I0120 19:15:04.950974 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"895f44c68fbe6703af24225bd96c75b9051c2fbd1f5926cb4944569934501bc2"} err="failed to get container status \"895f44c68fbe6703af24225bd96c75b9051c2fbd1f5926cb4944569934501bc2\": rpc error: code = NotFound desc = could not find container \"895f44c68fbe6703af24225bd96c75b9051c2fbd1f5926cb4944569934501bc2\": container with ID starting with 895f44c68fbe6703af24225bd96c75b9051c2fbd1f5926cb4944569934501bc2 not found: ID does not exist" Jan 20 19:15:05 crc kubenswrapper[4773]: I0120 19:15:05.165586 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-jplxv"] Jan 20 19:15:05 crc kubenswrapper[4773]: I0120 19:15:05.173848 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-jplxv"] Jan 20 19:15:05 crc kubenswrapper[4773]: I0120 19:15:05.460015 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="007a1e5a-0e90-44d1-b19d-e92154fb6a3d" 
path="/var/lib/kubelet/pods/007a1e5a-0e90-44d1-b19d-e92154fb6a3d/volumes" Jan 20 19:15:05 crc kubenswrapper[4773]: I0120 19:15:05.460773 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="47e6cea6-90a9-46c1-8c9f-c36182604be7" path="/var/lib/kubelet/pods/47e6cea6-90a9-46c1-8c9f-c36182604be7/volumes" Jan 20 19:15:20 crc kubenswrapper[4773]: I0120 19:15:20.518427 4773 scope.go:117] "RemoveContainer" containerID="b136dfb1f23b27ec2f873c0b6f216725c1fb39f8486e5a0ea6aa300a7bc89cf5" Jan 20 19:15:28 crc kubenswrapper[4773]: I0120 19:15:28.169707 4773 patch_prober.go:28] interesting pod/machine-config-daemon-sq4x7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 19:15:28 crc kubenswrapper[4773]: I0120 19:15:28.170282 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 19:15:58 crc kubenswrapper[4773]: I0120 19:15:58.170110 4773 patch_prober.go:28] interesting pod/machine-config-daemon-sq4x7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 19:15:58 crc kubenswrapper[4773]: I0120 19:15:58.171647 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
Jan 20 19:15:58 crc kubenswrapper[4773]: I0120 19:15:58.171758 4773 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" Jan 20 19:15:58 crc kubenswrapper[4773]: I0120 19:15:58.172579 4773 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"4c0248cb4203d1520e2769ae5f2568a8f29dce42822785658c56fdd8d0c00e9b"} pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 20 19:15:58 crc kubenswrapper[4773]: I0120 19:15:58.172712 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" containerID="cri-o://4c0248cb4203d1520e2769ae5f2568a8f29dce42822785658c56fdd8d0c00e9b" gracePeriod=600 Jan 20 19:15:59 crc kubenswrapper[4773]: I0120 19:15:59.276988 4773 generic.go:334] "Generic (PLEG): container finished" podID="1ddd934f-f012-4083-b5e6-b99711071621" containerID="4c0248cb4203d1520e2769ae5f2568a8f29dce42822785658c56fdd8d0c00e9b" exitCode=0 Jan 20 19:15:59 crc kubenswrapper[4773]: I0120 19:15:59.277065 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" event={"ID":"1ddd934f-f012-4083-b5e6-b99711071621","Type":"ContainerDied","Data":"4c0248cb4203d1520e2769ae5f2568a8f29dce42822785658c56fdd8d0c00e9b"} Jan 20 19:15:59 crc kubenswrapper[4773]: I0120 19:15:59.277357 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" event={"ID":"1ddd934f-f012-4083-b5e6-b99711071621","Type":"ContainerStarted","Data":"11935bc20d3c37ba36a8cfe2c5234ee70ed45d973936da23fb9892ece635d019"} Jan 20 19:15:59 crc 
kubenswrapper[4773]: I0120 19:15:59.277385 4773 scope.go:117] "RemoveContainer" containerID="a7dbd0a912d949d6c4eebc0124401dbd7713c051c7fc8f0eefcca46927202be1" Jan 20 19:17:26 crc kubenswrapper[4773]: I0120 19:17:26.557838 4773 generic.go:334] "Generic (PLEG): container finished" podID="5e1b8272-3f37-405c-9f7c-acc1dd855d60" containerID="e53a39dda91cd4424e1be373936eb9332c7149e3a73c3d2efb4770fc0d55b04a" exitCode=0 Jan 20 19:17:26 crc kubenswrapper[4773]: I0120 19:17:26.557980 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2vbh9" event={"ID":"5e1b8272-3f37-405c-9f7c-acc1dd855d60","Type":"ContainerDied","Data":"e53a39dda91cd4424e1be373936eb9332c7149e3a73c3d2efb4770fc0d55b04a"} Jan 20 19:17:27 crc kubenswrapper[4773]: I0120 19:17:27.951351 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2vbh9" Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.091029 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/5e1b8272-3f37-405c-9f7c-acc1dd855d60-libvirt-secret-0\") pod \"5e1b8272-3f37-405c-9f7c-acc1dd855d60\" (UID: \"5e1b8272-3f37-405c-9f7c-acc1dd855d60\") " Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.091182 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e1b8272-3f37-405c-9f7c-acc1dd855d60-inventory\") pod \"5e1b8272-3f37-405c-9f7c-acc1dd855d60\" (UID: \"5e1b8272-3f37-405c-9f7c-acc1dd855d60\") " Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.091212 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5e1b8272-3f37-405c-9f7c-acc1dd855d60-ssh-key-openstack-edpm-ipam\") pod \"5e1b8272-3f37-405c-9f7c-acc1dd855d60\" (UID: 
\"5e1b8272-3f37-405c-9f7c-acc1dd855d60\") " Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.091246 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9mr5v\" (UniqueName: \"kubernetes.io/projected/5e1b8272-3f37-405c-9f7c-acc1dd855d60-kube-api-access-9mr5v\") pod \"5e1b8272-3f37-405c-9f7c-acc1dd855d60\" (UID: \"5e1b8272-3f37-405c-9f7c-acc1dd855d60\") " Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.091263 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5e1b8272-3f37-405c-9f7c-acc1dd855d60-ceph\") pod \"5e1b8272-3f37-405c-9f7c-acc1dd855d60\" (UID: \"5e1b8272-3f37-405c-9f7c-acc1dd855d60\") " Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.091308 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e1b8272-3f37-405c-9f7c-acc1dd855d60-libvirt-combined-ca-bundle\") pod \"5e1b8272-3f37-405c-9f7c-acc1dd855d60\" (UID: \"5e1b8272-3f37-405c-9f7c-acc1dd855d60\") " Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.096660 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e1b8272-3f37-405c-9f7c-acc1dd855d60-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "5e1b8272-3f37-405c-9f7c-acc1dd855d60" (UID: "5e1b8272-3f37-405c-9f7c-acc1dd855d60"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.097400 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e1b8272-3f37-405c-9f7c-acc1dd855d60-ceph" (OuterVolumeSpecName: "ceph") pod "5e1b8272-3f37-405c-9f7c-acc1dd855d60" (UID: "5e1b8272-3f37-405c-9f7c-acc1dd855d60"). InnerVolumeSpecName "ceph". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.104059 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5e1b8272-3f37-405c-9f7c-acc1dd855d60-kube-api-access-9mr5v" (OuterVolumeSpecName: "kube-api-access-9mr5v") pod "5e1b8272-3f37-405c-9f7c-acc1dd855d60" (UID: "5e1b8272-3f37-405c-9f7c-acc1dd855d60"). InnerVolumeSpecName "kube-api-access-9mr5v". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.117271 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e1b8272-3f37-405c-9f7c-acc1dd855d60-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "5e1b8272-3f37-405c-9f7c-acc1dd855d60" (UID: "5e1b8272-3f37-405c-9f7c-acc1dd855d60"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.126213 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e1b8272-3f37-405c-9f7c-acc1dd855d60-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "5e1b8272-3f37-405c-9f7c-acc1dd855d60" (UID: "5e1b8272-3f37-405c-9f7c-acc1dd855d60"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.128100 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5e1b8272-3f37-405c-9f7c-acc1dd855d60-inventory" (OuterVolumeSpecName: "inventory") pod "5e1b8272-3f37-405c-9f7c-acc1dd855d60" (UID: "5e1b8272-3f37-405c-9f7c-acc1dd855d60"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.192970 4773 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5e1b8272-3f37-405c-9f7c-acc1dd855d60-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.193007 4773 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/5e1b8272-3f37-405c-9f7c-acc1dd855d60-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.193017 4773 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5e1b8272-3f37-405c-9f7c-acc1dd855d60-inventory\") on node \"crc\" DevicePath \"\"" Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.193025 4773 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5e1b8272-3f37-405c-9f7c-acc1dd855d60-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.193034 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9mr5v\" (UniqueName: \"kubernetes.io/projected/5e1b8272-3f37-405c-9f7c-acc1dd855d60-kube-api-access-9mr5v\") on node \"crc\" DevicePath \"\"" Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.193045 4773 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/5e1b8272-3f37-405c-9f7c-acc1dd855d60-ceph\") on node \"crc\" DevicePath \"\"" Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.577441 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2vbh9" 
event={"ID":"5e1b8272-3f37-405c-9f7c-acc1dd855d60","Type":"ContainerDied","Data":"4fb455accb7c642045b9830c42a1eef01ec071caf8bd86c446dbea2234b516db"} Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.577767 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4fb455accb7c642045b9830c42a1eef01ec071caf8bd86c446dbea2234b516db" Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.577540 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-2vbh9" Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.661562 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d"] Jan 20 19:17:28 crc kubenswrapper[4773]: E0120 19:17:28.662005 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47e6cea6-90a9-46c1-8c9f-c36182604be7" containerName="registry-server" Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.662030 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="47e6cea6-90a9-46c1-8c9f-c36182604be7" containerName="registry-server" Jan 20 19:17:28 crc kubenswrapper[4773]: E0120 19:17:28.662055 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47e6cea6-90a9-46c1-8c9f-c36182604be7" containerName="extract-content" Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.662063 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="47e6cea6-90a9-46c1-8c9f-c36182604be7" containerName="extract-content" Jan 20 19:17:28 crc kubenswrapper[4773]: E0120 19:17:28.662077 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f358c83b-d3f1-4f95-a02b-beecd73c1adb" containerName="collect-profiles" Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.662087 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="f358c83b-d3f1-4f95-a02b-beecd73c1adb" containerName="collect-profiles" Jan 20 19:17:28 crc 
kubenswrapper[4773]: E0120 19:17:28.662103 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5e1b8272-3f37-405c-9f7c-acc1dd855d60" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.662113 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="5e1b8272-3f37-405c-9f7c-acc1dd855d60" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Jan 20 19:17:28 crc kubenswrapper[4773]: E0120 19:17:28.662134 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="47e6cea6-90a9-46c1-8c9f-c36182604be7" containerName="extract-utilities" Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.662141 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="47e6cea6-90a9-46c1-8c9f-c36182604be7" containerName="extract-utilities" Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.662346 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="47e6cea6-90a9-46c1-8c9f-c36182604be7" containerName="registry-server" Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.662390 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="f358c83b-d3f1-4f95-a02b-beecd73c1adb" containerName="collect-profiles" Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.662403 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="5e1b8272-3f37-405c-9f7c-acc1dd855d60" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.663136 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d" Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.670053 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.670375 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.670608 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.670760 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.670974 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.671162 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.671403 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.671563 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ceph-nova" Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.671708 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-rzxsv" Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.674391 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d"] Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.803233 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-nova-cell1-compute-config-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d\" (UID: \"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d" Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.803278 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dsmd9\" (UniqueName: \"kubernetes.io/projected/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-kube-api-access-dsmd9\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d\" (UID: \"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d" Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.803325 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-ssh-key-openstack-edpm-ipam\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d\" (UID: \"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d" Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.803348 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-ceph\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d\" (UID: \"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d" Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.803367 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph-nova-0\" (UniqueName: 
\"kubernetes.io/configmap/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-ceph-nova-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d\" (UID: \"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d" Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.803554 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-nova-migration-ssh-key-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d\" (UID: \"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d" Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.803609 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-nova-cell1-compute-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d\" (UID: \"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d" Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.803634 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-nova-extra-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d\" (UID: \"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d" Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.803702 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: 
\"kubernetes.io/secret/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-nova-migration-ssh-key-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d\" (UID: \"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d" Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.804711 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-inventory\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d\" (UID: \"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d" Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.804800 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-nova-custom-ceph-combined-ca-bundle\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d\" (UID: \"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d" Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.906767 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-inventory\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d\" (UID: \"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d" Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.906817 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-nova-custom-ceph-combined-ca-bundle\") pod 
\"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d\" (UID: \"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d" Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.906853 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-nova-cell1-compute-config-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d\" (UID: \"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d" Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.906872 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dsmd9\" (UniqueName: \"kubernetes.io/projected/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-kube-api-access-dsmd9\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d\" (UID: \"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d" Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.906915 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-ssh-key-openstack-edpm-ipam\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d\" (UID: \"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d" Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.906949 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-ceph\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d\" (UID: \"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6\") " 
pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d" Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.906966 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-ceph-nova-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d\" (UID: \"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d" Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.906998 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-nova-migration-ssh-key-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d\" (UID: \"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d" Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.907019 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-nova-cell1-compute-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d\" (UID: \"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d" Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.907036 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-nova-extra-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d\" (UID: \"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d" Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.907061 4773 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-nova-migration-ssh-key-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d\" (UID: \"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d" Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.908331 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-nova-extra-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d\" (UID: \"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d" Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.908609 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-ceph-nova-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d\" (UID: \"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d" Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.911737 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-ceph\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d\" (UID: \"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d" Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.912808 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-inventory\") pod 
\"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d\" (UID: \"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d" Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.912900 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-nova-cell1-compute-config-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d\" (UID: \"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d" Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.914631 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-nova-migration-ssh-key-1\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d\" (UID: \"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d" Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.915629 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-nova-custom-ceph-combined-ca-bundle\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d\" (UID: \"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d" Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.916123 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-nova-migration-ssh-key-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d\" (UID: \"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6\") " 
pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d" Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.916215 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-ssh-key-openstack-edpm-ipam\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d\" (UID: \"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d" Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.916272 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-nova-cell1-compute-config-0\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d\" (UID: \"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d" Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.925398 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dsmd9\" (UniqueName: \"kubernetes.io/projected/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-kube-api-access-dsmd9\") pod \"nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d\" (UID: \"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6\") " pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d" Jan 20 19:17:28 crc kubenswrapper[4773]: I0120 19:17:28.983890 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d" Jan 20 19:17:29 crc kubenswrapper[4773]: I0120 19:17:29.501562 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d"] Jan 20 19:17:29 crc kubenswrapper[4773]: I0120 19:17:29.586308 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d" event={"ID":"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6","Type":"ContainerStarted","Data":"53770674c2a213080a854c893c5fad1b4a28a090c6f7a3043311db0e2d863231"} Jan 20 19:17:30 crc kubenswrapper[4773]: I0120 19:17:30.596304 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d" event={"ID":"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6","Type":"ContainerStarted","Data":"e84ce546cf91a0456043033e683bd504d9b3778f3cc960ce6ad55eee146736f0"} Jan 20 19:17:30 crc kubenswrapper[4773]: I0120 19:17:30.625048 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d" podStartSLOduration=2.111621515 podStartE2EDuration="2.625026049s" podCreationTimestamp="2026-01-20 19:17:28 +0000 UTC" firstStartedPulling="2026-01-20 19:17:29.514383977 +0000 UTC m=+2842.436197021" lastFinishedPulling="2026-01-20 19:17:30.027788531 +0000 UTC m=+2842.949601555" observedRunningTime="2026-01-20 19:17:30.618463541 +0000 UTC m=+2843.540276575" watchObservedRunningTime="2026-01-20 19:17:30.625026049 +0000 UTC m=+2843.546839073" Jan 20 19:17:58 crc kubenswrapper[4773]: I0120 19:17:58.170026 4773 patch_prober.go:28] interesting pod/machine-config-daemon-sq4x7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 
20 19:17:58 crc kubenswrapper[4773]: I0120 19:17:58.171269 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 19:18:21 crc kubenswrapper[4773]: I0120 19:18:21.860433 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-q2df5"] Jan 20 19:18:21 crc kubenswrapper[4773]: I0120 19:18:21.863438 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-q2df5" Jan 20 19:18:21 crc kubenswrapper[4773]: I0120 19:18:21.869776 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-q2df5"] Jan 20 19:18:21 crc kubenswrapper[4773]: I0120 19:18:21.970505 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2412fab9-ca39-492f-abb4-14bc806fe535-utilities\") pod \"community-operators-q2df5\" (UID: \"2412fab9-ca39-492f-abb4-14bc806fe535\") " pod="openshift-marketplace/community-operators-q2df5" Jan 20 19:18:21 crc kubenswrapper[4773]: I0120 19:18:21.970902 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tng48\" (UniqueName: \"kubernetes.io/projected/2412fab9-ca39-492f-abb4-14bc806fe535-kube-api-access-tng48\") pod \"community-operators-q2df5\" (UID: \"2412fab9-ca39-492f-abb4-14bc806fe535\") " pod="openshift-marketplace/community-operators-q2df5" Jan 20 19:18:21 crc kubenswrapper[4773]: I0120 19:18:21.971010 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/2412fab9-ca39-492f-abb4-14bc806fe535-catalog-content\") pod \"community-operators-q2df5\" (UID: \"2412fab9-ca39-492f-abb4-14bc806fe535\") " pod="openshift-marketplace/community-operators-q2df5" Jan 20 19:18:22 crc kubenswrapper[4773]: I0120 19:18:22.073181 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tng48\" (UniqueName: \"kubernetes.io/projected/2412fab9-ca39-492f-abb4-14bc806fe535-kube-api-access-tng48\") pod \"community-operators-q2df5\" (UID: \"2412fab9-ca39-492f-abb4-14bc806fe535\") " pod="openshift-marketplace/community-operators-q2df5" Jan 20 19:18:22 crc kubenswrapper[4773]: I0120 19:18:22.073238 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2412fab9-ca39-492f-abb4-14bc806fe535-catalog-content\") pod \"community-operators-q2df5\" (UID: \"2412fab9-ca39-492f-abb4-14bc806fe535\") " pod="openshift-marketplace/community-operators-q2df5" Jan 20 19:18:22 crc kubenswrapper[4773]: I0120 19:18:22.073272 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2412fab9-ca39-492f-abb4-14bc806fe535-utilities\") pod \"community-operators-q2df5\" (UID: \"2412fab9-ca39-492f-abb4-14bc806fe535\") " pod="openshift-marketplace/community-operators-q2df5" Jan 20 19:18:22 crc kubenswrapper[4773]: I0120 19:18:22.073831 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2412fab9-ca39-492f-abb4-14bc806fe535-catalog-content\") pod \"community-operators-q2df5\" (UID: \"2412fab9-ca39-492f-abb4-14bc806fe535\") " pod="openshift-marketplace/community-operators-q2df5" Jan 20 19:18:22 crc kubenswrapper[4773]: I0120 19:18:22.073873 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/2412fab9-ca39-492f-abb4-14bc806fe535-utilities\") pod \"community-operators-q2df5\" (UID: \"2412fab9-ca39-492f-abb4-14bc806fe535\") " pod="openshift-marketplace/community-operators-q2df5" Jan 20 19:18:22 crc kubenswrapper[4773]: I0120 19:18:22.113192 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tng48\" (UniqueName: \"kubernetes.io/projected/2412fab9-ca39-492f-abb4-14bc806fe535-kube-api-access-tng48\") pod \"community-operators-q2df5\" (UID: \"2412fab9-ca39-492f-abb4-14bc806fe535\") " pod="openshift-marketplace/community-operators-q2df5" Jan 20 19:18:22 crc kubenswrapper[4773]: I0120 19:18:22.189713 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-q2df5" Jan 20 19:18:22 crc kubenswrapper[4773]: I0120 19:18:22.746163 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-q2df5"] Jan 20 19:18:23 crc kubenswrapper[4773]: I0120 19:18:23.024255 4773 generic.go:334] "Generic (PLEG): container finished" podID="2412fab9-ca39-492f-abb4-14bc806fe535" containerID="a52ecb12e101b8827aaf070eafdc14405d7ca770d95bb289f822509aa105fd7f" exitCode=0 Jan 20 19:18:23 crc kubenswrapper[4773]: I0120 19:18:23.024341 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q2df5" event={"ID":"2412fab9-ca39-492f-abb4-14bc806fe535","Type":"ContainerDied","Data":"a52ecb12e101b8827aaf070eafdc14405d7ca770d95bb289f822509aa105fd7f"} Jan 20 19:18:23 crc kubenswrapper[4773]: I0120 19:18:23.024600 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q2df5" event={"ID":"2412fab9-ca39-492f-abb4-14bc806fe535","Type":"ContainerStarted","Data":"d62d5c25c7ac922af447559957ad4ea85cc0573cc46241785ca6c1bbb0f1c947"} Jan 20 19:18:24 crc kubenswrapper[4773]: I0120 19:18:24.033460 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/community-operators-q2df5" event={"ID":"2412fab9-ca39-492f-abb4-14bc806fe535","Type":"ContainerStarted","Data":"e6f5d5baefe0f805a74eada5c18bb5d4ba416312416b7f12f30f5dd08ca9321e"} Jan 20 19:18:25 crc kubenswrapper[4773]: I0120 19:18:25.043194 4773 generic.go:334] "Generic (PLEG): container finished" podID="2412fab9-ca39-492f-abb4-14bc806fe535" containerID="e6f5d5baefe0f805a74eada5c18bb5d4ba416312416b7f12f30f5dd08ca9321e" exitCode=0 Jan 20 19:18:25 crc kubenswrapper[4773]: I0120 19:18:25.044492 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q2df5" event={"ID":"2412fab9-ca39-492f-abb4-14bc806fe535","Type":"ContainerDied","Data":"e6f5d5baefe0f805a74eada5c18bb5d4ba416312416b7f12f30f5dd08ca9321e"} Jan 20 19:18:26 crc kubenswrapper[4773]: I0120 19:18:26.052916 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q2df5" event={"ID":"2412fab9-ca39-492f-abb4-14bc806fe535","Type":"ContainerStarted","Data":"1d0b19062083509610599cd8e89723198cb5996349ac48ea75d947a78342c305"} Jan 20 19:18:26 crc kubenswrapper[4773]: I0120 19:18:26.074647 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-q2df5" podStartSLOduration=2.572395366 podStartE2EDuration="5.074630596s" podCreationTimestamp="2026-01-20 19:18:21 +0000 UTC" firstStartedPulling="2026-01-20 19:18:23.027516475 +0000 UTC m=+2895.949329499" lastFinishedPulling="2026-01-20 19:18:25.529751705 +0000 UTC m=+2898.451564729" observedRunningTime="2026-01-20 19:18:26.071609913 +0000 UTC m=+2898.993422927" watchObservedRunningTime="2026-01-20 19:18:26.074630596 +0000 UTC m=+2898.996443620" Jan 20 19:18:28 crc kubenswrapper[4773]: I0120 19:18:28.170862 4773 patch_prober.go:28] interesting pod/machine-config-daemon-sq4x7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 19:18:28 crc kubenswrapper[4773]: I0120 19:18:28.171541 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 19:18:32 crc kubenswrapper[4773]: I0120 19:18:32.190767 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-q2df5" Jan 20 19:18:32 crc kubenswrapper[4773]: I0120 19:18:32.191230 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-q2df5" Jan 20 19:18:32 crc kubenswrapper[4773]: I0120 19:18:32.238011 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-q2df5" Jan 20 19:18:33 crc kubenswrapper[4773]: I0120 19:18:33.154290 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-q2df5" Jan 20 19:18:33 crc kubenswrapper[4773]: I0120 19:18:33.201074 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-q2df5"] Jan 20 19:18:35 crc kubenswrapper[4773]: I0120 19:18:35.122900 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-q2df5" podUID="2412fab9-ca39-492f-abb4-14bc806fe535" containerName="registry-server" containerID="cri-o://1d0b19062083509610599cd8e89723198cb5996349ac48ea75d947a78342c305" gracePeriod=2 Jan 20 19:18:35 crc kubenswrapper[4773]: I0120 19:18:35.550979 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-q2df5" Jan 20 19:18:35 crc kubenswrapper[4773]: I0120 19:18:35.749121 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2412fab9-ca39-492f-abb4-14bc806fe535-utilities\") pod \"2412fab9-ca39-492f-abb4-14bc806fe535\" (UID: \"2412fab9-ca39-492f-abb4-14bc806fe535\") " Jan 20 19:18:35 crc kubenswrapper[4773]: I0120 19:18:35.749228 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2412fab9-ca39-492f-abb4-14bc806fe535-catalog-content\") pod \"2412fab9-ca39-492f-abb4-14bc806fe535\" (UID: \"2412fab9-ca39-492f-abb4-14bc806fe535\") " Jan 20 19:18:35 crc kubenswrapper[4773]: I0120 19:18:35.749432 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tng48\" (UniqueName: \"kubernetes.io/projected/2412fab9-ca39-492f-abb4-14bc806fe535-kube-api-access-tng48\") pod \"2412fab9-ca39-492f-abb4-14bc806fe535\" (UID: \"2412fab9-ca39-492f-abb4-14bc806fe535\") " Jan 20 19:18:35 crc kubenswrapper[4773]: I0120 19:18:35.750096 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2412fab9-ca39-492f-abb4-14bc806fe535-utilities" (OuterVolumeSpecName: "utilities") pod "2412fab9-ca39-492f-abb4-14bc806fe535" (UID: "2412fab9-ca39-492f-abb4-14bc806fe535"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 19:18:35 crc kubenswrapper[4773]: I0120 19:18:35.755244 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2412fab9-ca39-492f-abb4-14bc806fe535-kube-api-access-tng48" (OuterVolumeSpecName: "kube-api-access-tng48") pod "2412fab9-ca39-492f-abb4-14bc806fe535" (UID: "2412fab9-ca39-492f-abb4-14bc806fe535"). InnerVolumeSpecName "kube-api-access-tng48". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 19:18:35 crc kubenswrapper[4773]: I0120 19:18:35.799593 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2412fab9-ca39-492f-abb4-14bc806fe535-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2412fab9-ca39-492f-abb4-14bc806fe535" (UID: "2412fab9-ca39-492f-abb4-14bc806fe535"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 19:18:35 crc kubenswrapper[4773]: I0120 19:18:35.851462 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tng48\" (UniqueName: \"kubernetes.io/projected/2412fab9-ca39-492f-abb4-14bc806fe535-kube-api-access-tng48\") on node \"crc\" DevicePath \"\"" Jan 20 19:18:35 crc kubenswrapper[4773]: I0120 19:18:35.851501 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2412fab9-ca39-492f-abb4-14bc806fe535-utilities\") on node \"crc\" DevicePath \"\"" Jan 20 19:18:35 crc kubenswrapper[4773]: I0120 19:18:35.851511 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2412fab9-ca39-492f-abb4-14bc806fe535-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 20 19:18:36 crc kubenswrapper[4773]: I0120 19:18:36.133710 4773 generic.go:334] "Generic (PLEG): container finished" podID="2412fab9-ca39-492f-abb4-14bc806fe535" containerID="1d0b19062083509610599cd8e89723198cb5996349ac48ea75d947a78342c305" exitCode=0 Jan 20 19:18:36 crc kubenswrapper[4773]: I0120 19:18:36.133808 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-q2df5" Jan 20 19:18:36 crc kubenswrapper[4773]: I0120 19:18:36.133813 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q2df5" event={"ID":"2412fab9-ca39-492f-abb4-14bc806fe535","Type":"ContainerDied","Data":"1d0b19062083509610599cd8e89723198cb5996349ac48ea75d947a78342c305"} Jan 20 19:18:36 crc kubenswrapper[4773]: I0120 19:18:36.134249 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-q2df5" event={"ID":"2412fab9-ca39-492f-abb4-14bc806fe535","Type":"ContainerDied","Data":"d62d5c25c7ac922af447559957ad4ea85cc0573cc46241785ca6c1bbb0f1c947"} Jan 20 19:18:36 crc kubenswrapper[4773]: I0120 19:18:36.134269 4773 scope.go:117] "RemoveContainer" containerID="1d0b19062083509610599cd8e89723198cb5996349ac48ea75d947a78342c305" Jan 20 19:18:36 crc kubenswrapper[4773]: I0120 19:18:36.161176 4773 scope.go:117] "RemoveContainer" containerID="e6f5d5baefe0f805a74eada5c18bb5d4ba416312416b7f12f30f5dd08ca9321e" Jan 20 19:18:36 crc kubenswrapper[4773]: I0120 19:18:36.173111 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-q2df5"] Jan 20 19:18:36 crc kubenswrapper[4773]: I0120 19:18:36.182572 4773 scope.go:117] "RemoveContainer" containerID="a52ecb12e101b8827aaf070eafdc14405d7ca770d95bb289f822509aa105fd7f" Jan 20 19:18:36 crc kubenswrapper[4773]: I0120 19:18:36.186511 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-q2df5"] Jan 20 19:18:36 crc kubenswrapper[4773]: I0120 19:18:36.231587 4773 scope.go:117] "RemoveContainer" containerID="1d0b19062083509610599cd8e89723198cb5996349ac48ea75d947a78342c305" Jan 20 19:18:36 crc kubenswrapper[4773]: E0120 19:18:36.232093 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"1d0b19062083509610599cd8e89723198cb5996349ac48ea75d947a78342c305\": container with ID starting with 1d0b19062083509610599cd8e89723198cb5996349ac48ea75d947a78342c305 not found: ID does not exist" containerID="1d0b19062083509610599cd8e89723198cb5996349ac48ea75d947a78342c305" Jan 20 19:18:36 crc kubenswrapper[4773]: I0120 19:18:36.232132 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1d0b19062083509610599cd8e89723198cb5996349ac48ea75d947a78342c305"} err="failed to get container status \"1d0b19062083509610599cd8e89723198cb5996349ac48ea75d947a78342c305\": rpc error: code = NotFound desc = could not find container \"1d0b19062083509610599cd8e89723198cb5996349ac48ea75d947a78342c305\": container with ID starting with 1d0b19062083509610599cd8e89723198cb5996349ac48ea75d947a78342c305 not found: ID does not exist" Jan 20 19:18:36 crc kubenswrapper[4773]: I0120 19:18:36.232161 4773 scope.go:117] "RemoveContainer" containerID="e6f5d5baefe0f805a74eada5c18bb5d4ba416312416b7f12f30f5dd08ca9321e" Jan 20 19:18:36 crc kubenswrapper[4773]: E0120 19:18:36.232645 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6f5d5baefe0f805a74eada5c18bb5d4ba416312416b7f12f30f5dd08ca9321e\": container with ID starting with e6f5d5baefe0f805a74eada5c18bb5d4ba416312416b7f12f30f5dd08ca9321e not found: ID does not exist" containerID="e6f5d5baefe0f805a74eada5c18bb5d4ba416312416b7f12f30f5dd08ca9321e" Jan 20 19:18:36 crc kubenswrapper[4773]: I0120 19:18:36.232728 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6f5d5baefe0f805a74eada5c18bb5d4ba416312416b7f12f30f5dd08ca9321e"} err="failed to get container status \"e6f5d5baefe0f805a74eada5c18bb5d4ba416312416b7f12f30f5dd08ca9321e\": rpc error: code = NotFound desc = could not find container \"e6f5d5baefe0f805a74eada5c18bb5d4ba416312416b7f12f30f5dd08ca9321e\": container with ID 
starting with e6f5d5baefe0f805a74eada5c18bb5d4ba416312416b7f12f30f5dd08ca9321e not found: ID does not exist" Jan 20 19:18:36 crc kubenswrapper[4773]: I0120 19:18:36.232786 4773 scope.go:117] "RemoveContainer" containerID="a52ecb12e101b8827aaf070eafdc14405d7ca770d95bb289f822509aa105fd7f" Jan 20 19:18:36 crc kubenswrapper[4773]: E0120 19:18:36.233345 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a52ecb12e101b8827aaf070eafdc14405d7ca770d95bb289f822509aa105fd7f\": container with ID starting with a52ecb12e101b8827aaf070eafdc14405d7ca770d95bb289f822509aa105fd7f not found: ID does not exist" containerID="a52ecb12e101b8827aaf070eafdc14405d7ca770d95bb289f822509aa105fd7f" Jan 20 19:18:36 crc kubenswrapper[4773]: I0120 19:18:36.233380 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a52ecb12e101b8827aaf070eafdc14405d7ca770d95bb289f822509aa105fd7f"} err="failed to get container status \"a52ecb12e101b8827aaf070eafdc14405d7ca770d95bb289f822509aa105fd7f\": rpc error: code = NotFound desc = could not find container \"a52ecb12e101b8827aaf070eafdc14405d7ca770d95bb289f822509aa105fd7f\": container with ID starting with a52ecb12e101b8827aaf070eafdc14405d7ca770d95bb289f822509aa105fd7f not found: ID does not exist" Jan 20 19:18:37 crc kubenswrapper[4773]: I0120 19:18:37.457812 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2412fab9-ca39-492f-abb4-14bc806fe535" path="/var/lib/kubelet/pods/2412fab9-ca39-492f-abb4-14bc806fe535/volumes" Jan 20 19:18:58 crc kubenswrapper[4773]: I0120 19:18:58.170237 4773 patch_prober.go:28] interesting pod/machine-config-daemon-sq4x7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 19:18:58 crc kubenswrapper[4773]: I0120 
19:18:58.170919 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 19:18:58 crc kubenswrapper[4773]: I0120 19:18:58.171022 4773 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" Jan 20 19:18:58 crc kubenswrapper[4773]: I0120 19:18:58.171893 4773 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"11935bc20d3c37ba36a8cfe2c5234ee70ed45d973936da23fb9892ece635d019"} pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 20 19:18:58 crc kubenswrapper[4773]: I0120 19:18:58.171987 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" containerID="cri-o://11935bc20d3c37ba36a8cfe2c5234ee70ed45d973936da23fb9892ece635d019" gracePeriod=600 Jan 20 19:18:58 crc kubenswrapper[4773]: E0120 19:18:58.293738 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:18:58 crc kubenswrapper[4773]: I0120 19:18:58.329351 4773 generic.go:334] "Generic (PLEG): container finished" 
podID="1ddd934f-f012-4083-b5e6-b99711071621" containerID="11935bc20d3c37ba36a8cfe2c5234ee70ed45d973936da23fb9892ece635d019" exitCode=0 Jan 20 19:18:58 crc kubenswrapper[4773]: I0120 19:18:58.329406 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" event={"ID":"1ddd934f-f012-4083-b5e6-b99711071621","Type":"ContainerDied","Data":"11935bc20d3c37ba36a8cfe2c5234ee70ed45d973936da23fb9892ece635d019"} Jan 20 19:18:58 crc kubenswrapper[4773]: I0120 19:18:58.329442 4773 scope.go:117] "RemoveContainer" containerID="4c0248cb4203d1520e2769ae5f2568a8f29dce42822785658c56fdd8d0c00e9b" Jan 20 19:18:58 crc kubenswrapper[4773]: I0120 19:18:58.330232 4773 scope.go:117] "RemoveContainer" containerID="11935bc20d3c37ba36a8cfe2c5234ee70ed45d973936da23fb9892ece635d019" Jan 20 19:18:58 crc kubenswrapper[4773]: E0120 19:18:58.330580 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:18:58 crc kubenswrapper[4773]: E0120 19:18:58.391137 4773 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ddd934f_f012_4083_b5e6_b99711071621.slice/crio-11935bc20d3c37ba36a8cfe2c5234ee70ed45d973936da23fb9892ece635d019.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ddd934f_f012_4083_b5e6_b99711071621.slice/crio-conmon-11935bc20d3c37ba36a8cfe2c5234ee70ed45d973936da23fb9892ece635d019.scope\": RecentStats: unable to find data in memory cache]" Jan 20 19:19:09 crc 
kubenswrapper[4773]: I0120 19:19:09.447545 4773 scope.go:117] "RemoveContainer" containerID="11935bc20d3c37ba36a8cfe2c5234ee70ed45d973936da23fb9892ece635d019" Jan 20 19:19:09 crc kubenswrapper[4773]: E0120 19:19:09.448377 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:19:23 crc kubenswrapper[4773]: I0120 19:19:23.448555 4773 scope.go:117] "RemoveContainer" containerID="11935bc20d3c37ba36a8cfe2c5234ee70ed45d973936da23fb9892ece635d019" Jan 20 19:19:23 crc kubenswrapper[4773]: E0120 19:19:23.449910 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:19:36 crc kubenswrapper[4773]: I0120 19:19:36.447544 4773 scope.go:117] "RemoveContainer" containerID="11935bc20d3c37ba36a8cfe2c5234ee70ed45d973936da23fb9892ece635d019" Jan 20 19:19:36 crc kubenswrapper[4773]: E0120 19:19:36.448392 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 
20 19:19:51 crc kubenswrapper[4773]: I0120 19:19:51.447492 4773 scope.go:117] "RemoveContainer" containerID="11935bc20d3c37ba36a8cfe2c5234ee70ed45d973936da23fb9892ece635d019" Jan 20 19:19:51 crc kubenswrapper[4773]: E0120 19:19:51.448148 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:19:54 crc kubenswrapper[4773]: I0120 19:19:54.792843 4773 generic.go:334] "Generic (PLEG): container finished" podID="e7bfe1d6-9e6c-4964-9cdf-2204156f14c6" containerID="e84ce546cf91a0456043033e683bd504d9b3778f3cc960ce6ad55eee146736f0" exitCode=0 Jan 20 19:19:54 crc kubenswrapper[4773]: I0120 19:19:54.792973 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d" event={"ID":"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6","Type":"ContainerDied","Data":"e84ce546cf91a0456043033e683bd504d9b3778f3cc960ce6ad55eee146736f0"} Jan 20 19:19:56 crc kubenswrapper[4773]: I0120 19:19:56.188582 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d" Jan 20 19:19:56 crc kubenswrapper[4773]: I0120 19:19:56.244023 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-ssh-key-openstack-edpm-ipam\") pod \"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6\" (UID: \"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6\") " Jan 20 19:19:56 crc kubenswrapper[4773]: I0120 19:19:56.244185 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-ceph-nova-0\") pod \"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6\" (UID: \"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6\") " Jan 20 19:19:56 crc kubenswrapper[4773]: I0120 19:19:56.244215 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dsmd9\" (UniqueName: \"kubernetes.io/projected/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-kube-api-access-dsmd9\") pod \"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6\" (UID: \"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6\") " Jan 20 19:19:56 crc kubenswrapper[4773]: I0120 19:19:56.244250 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-nova-custom-ceph-combined-ca-bundle\") pod \"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6\" (UID: \"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6\") " Jan 20 19:19:56 crc kubenswrapper[4773]: I0120 19:19:56.244283 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-nova-cell1-compute-config-0\") pod \"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6\" (UID: \"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6\") " Jan 20 19:19:56 crc 
kubenswrapper[4773]: I0120 19:19:56.244360 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-nova-migration-ssh-key-0\") pod \"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6\" (UID: \"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6\") " Jan 20 19:19:56 crc kubenswrapper[4773]: I0120 19:19:56.244415 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-nova-migration-ssh-key-1\") pod \"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6\" (UID: \"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6\") " Jan 20 19:19:56 crc kubenswrapper[4773]: I0120 19:19:56.244438 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-inventory\") pod \"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6\" (UID: \"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6\") " Jan 20 19:19:56 crc kubenswrapper[4773]: I0120 19:19:56.244475 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-nova-cell1-compute-config-1\") pod \"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6\" (UID: \"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6\") " Jan 20 19:19:56 crc kubenswrapper[4773]: I0120 19:19:56.244490 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-ceph\") pod \"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6\" (UID: \"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6\") " Jan 20 19:19:56 crc kubenswrapper[4773]: I0120 19:19:56.244532 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: 
\"kubernetes.io/configmap/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-nova-extra-config-0\") pod \"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6\" (UID: \"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6\") " Jan 20 19:19:56 crc kubenswrapper[4773]: I0120 19:19:56.250456 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-kube-api-access-dsmd9" (OuterVolumeSpecName: "kube-api-access-dsmd9") pod "e7bfe1d6-9e6c-4964-9cdf-2204156f14c6" (UID: "e7bfe1d6-9e6c-4964-9cdf-2204156f14c6"). InnerVolumeSpecName "kube-api-access-dsmd9". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 19:19:56 crc kubenswrapper[4773]: I0120 19:19:56.252069 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-nova-custom-ceph-combined-ca-bundle" (OuterVolumeSpecName: "nova-custom-ceph-combined-ca-bundle") pod "e7bfe1d6-9e6c-4964-9cdf-2204156f14c6" (UID: "e7bfe1d6-9e6c-4964-9cdf-2204156f14c6"). InnerVolumeSpecName "nova-custom-ceph-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:19:56 crc kubenswrapper[4773]: I0120 19:19:56.272268 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-ceph" (OuterVolumeSpecName: "ceph") pod "e7bfe1d6-9e6c-4964-9cdf-2204156f14c6" (UID: "e7bfe1d6-9e6c-4964-9cdf-2204156f14c6"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:19:56 crc kubenswrapper[4773]: I0120 19:19:56.275157 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "e7bfe1d6-9e6c-4964-9cdf-2204156f14c6" (UID: "e7bfe1d6-9e6c-4964-9cdf-2204156f14c6"). InnerVolumeSpecName "nova-migration-ssh-key-0". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:19:56 crc kubenswrapper[4773]: I0120 19:19:56.277039 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "e7bfe1d6-9e6c-4964-9cdf-2204156f14c6" (UID: "e7bfe1d6-9e6c-4964-9cdf-2204156f14c6"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:19:56 crc kubenswrapper[4773]: I0120 19:19:56.277501 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "e7bfe1d6-9e6c-4964-9cdf-2204156f14c6" (UID: "e7bfe1d6-9e6c-4964-9cdf-2204156f14c6"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:19:56 crc kubenswrapper[4773]: I0120 19:19:56.279097 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-inventory" (OuterVolumeSpecName: "inventory") pod "e7bfe1d6-9e6c-4964-9cdf-2204156f14c6" (UID: "e7bfe1d6-9e6c-4964-9cdf-2204156f14c6"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:19:56 crc kubenswrapper[4773]: I0120 19:19:56.280245 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "e7bfe1d6-9e6c-4964-9cdf-2204156f14c6" (UID: "e7bfe1d6-9e6c-4964-9cdf-2204156f14c6"). InnerVolumeSpecName "nova-cell1-compute-config-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:19:56 crc kubenswrapper[4773]: I0120 19:19:56.283256 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-ceph-nova-0" (OuterVolumeSpecName: "ceph-nova-0") pod "e7bfe1d6-9e6c-4964-9cdf-2204156f14c6" (UID: "e7bfe1d6-9e6c-4964-9cdf-2204156f14c6"). InnerVolumeSpecName "ceph-nova-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 19:19:56 crc kubenswrapper[4773]: I0120 19:19:56.285624 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "e7bfe1d6-9e6c-4964-9cdf-2204156f14c6" (UID: "e7bfe1d6-9e6c-4964-9cdf-2204156f14c6"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 19:19:56 crc kubenswrapper[4773]: I0120 19:19:56.294577 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "e7bfe1d6-9e6c-4964-9cdf-2204156f14c6" (UID: "e7bfe1d6-9e6c-4964-9cdf-2204156f14c6"). InnerVolumeSpecName "nova-migration-ssh-key-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:19:56 crc kubenswrapper[4773]: I0120 19:19:56.350227 4773 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Jan 20 19:19:56 crc kubenswrapper[4773]: I0120 19:19:56.350583 4773 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-inventory\") on node \"crc\" DevicePath \"\"" Jan 20 19:19:56 crc kubenswrapper[4773]: I0120 19:19:56.350597 4773 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Jan 20 19:19:56 crc kubenswrapper[4773]: I0120 19:19:56.350617 4773 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/secret/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-ceph\") on node \"crc\" DevicePath \"\"" Jan 20 19:19:56 crc kubenswrapper[4773]: I0120 19:19:56.350628 4773 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Jan 20 19:19:56 crc kubenswrapper[4773]: I0120 19:19:56.350641 4773 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 20 19:19:56 crc kubenswrapper[4773]: I0120 19:19:56.350653 4773 reconciler_common.go:293] "Volume detached for volume \"ceph-nova-0\" (UniqueName: \"kubernetes.io/configmap/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-ceph-nova-0\") on node \"crc\" DevicePath \"\"" Jan 20 19:19:56 crc 
kubenswrapper[4773]: I0120 19:19:56.350663 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dsmd9\" (UniqueName: \"kubernetes.io/projected/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-kube-api-access-dsmd9\") on node \"crc\" DevicePath \"\"" Jan 20 19:19:56 crc kubenswrapper[4773]: I0120 19:19:56.350673 4773 reconciler_common.go:293] "Volume detached for volume \"nova-custom-ceph-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-nova-custom-ceph-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 19:19:56 crc kubenswrapper[4773]: I0120 19:19:56.350684 4773 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Jan 20 19:19:56 crc kubenswrapper[4773]: I0120 19:19:56.350699 4773 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/e7bfe1d6-9e6c-4964-9cdf-2204156f14c6-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Jan 20 19:19:56 crc kubenswrapper[4773]: I0120 19:19:56.808191 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d" event={"ID":"e7bfe1d6-9e6c-4964-9cdf-2204156f14c6","Type":"ContainerDied","Data":"53770674c2a213080a854c893c5fad1b4a28a090c6f7a3043311db0e2d863231"} Jan 20 19:19:56 crc kubenswrapper[4773]: I0120 19:19:56.808226 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="53770674c2a213080a854c893c5fad1b4a28a090c6f7a3043311db0e2d863231" Jan 20 19:19:56 crc kubenswrapper[4773]: I0120 19:19:56.808254 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d" Jan 20 19:20:05 crc kubenswrapper[4773]: I0120 19:20:05.447681 4773 scope.go:117] "RemoveContainer" containerID="11935bc20d3c37ba36a8cfe2c5234ee70ed45d973936da23fb9892ece635d019" Jan 20 19:20:05 crc kubenswrapper[4773]: E0120 19:20:05.448513 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.339634 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-backup-0"] Jan 20 19:20:11 crc kubenswrapper[4773]: E0120 19:20:11.341069 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2412fab9-ca39-492f-abb4-14bc806fe535" containerName="extract-content" Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.341091 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="2412fab9-ca39-492f-abb4-14bc806fe535" containerName="extract-content" Jan 20 19:20:11 crc kubenswrapper[4773]: E0120 19:20:11.341120 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2412fab9-ca39-492f-abb4-14bc806fe535" containerName="extract-utilities" Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.341128 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="2412fab9-ca39-492f-abb4-14bc806fe535" containerName="extract-utilities" Jan 20 19:20:11 crc kubenswrapper[4773]: E0120 19:20:11.341141 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2412fab9-ca39-492f-abb4-14bc806fe535" containerName="registry-server" Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.341149 4773 
state_mem.go:107] "Deleted CPUSet assignment" podUID="2412fab9-ca39-492f-abb4-14bc806fe535" containerName="registry-server" Jan 20 19:20:11 crc kubenswrapper[4773]: E0120 19:20:11.341165 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7bfe1d6-9e6c-4964-9cdf-2204156f14c6" containerName="nova-custom-ceph-edpm-deployment-openstack-edpm-ipam" Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.341174 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7bfe1d6-9e6c-4964-9cdf-2204156f14c6" containerName="nova-custom-ceph-edpm-deployment-openstack-edpm-ipam" Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.341384 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="2412fab9-ca39-492f-abb4-14bc806fe535" containerName="registry-server" Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.341414 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7bfe1d6-9e6c-4964-9cdf-2204156f14c6" containerName="nova-custom-ceph-edpm-deployment-openstack-edpm-ipam" Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.342664 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.345109 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-backup-config-data" Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.352335 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-volume-volume1-0"] Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.353414 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceph-conf-files" Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.355699 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-volume-volume1-0" Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.357846 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-volume-volume1-config-data" Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.368392 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.388818 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.436038 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a053127-e129-429c-9a7b-28e084c34269-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"4a053127-e129-429c-9a7b-28e084c34269\") " pod="openstack/cinder-backup-0" Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.436081 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/4a053127-e129-429c-9a7b-28e084c34269-ceph\") pod \"cinder-backup-0\" (UID: \"4a053127-e129-429c-9a7b-28e084c34269\") " pod="openstack/cinder-backup-0" Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.436104 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/466685e0-d49e-4d97-9436-7db7c10062c3-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"466685e0-d49e-4d97-9436-7db7c10062c3\") " pod="openstack/cinder-volume-volume1-0" Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.436129 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/466685e0-d49e-4d97-9436-7db7c10062c3-var-lib-cinder\") pod 
\"cinder-volume-volume1-0\" (UID: \"466685e0-d49e-4d97-9436-7db7c10062c3\") " pod="openstack/cinder-volume-volume1-0" Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.436152 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/466685e0-d49e-4d97-9436-7db7c10062c3-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"466685e0-d49e-4d97-9436-7db7c10062c3\") " pod="openstack/cinder-volume-volume1-0" Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.436173 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/4a053127-e129-429c-9a7b-28e084c34269-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"4a053127-e129-429c-9a7b-28e084c34269\") " pod="openstack/cinder-backup-0" Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.436192 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a053127-e129-429c-9a7b-28e084c34269-config-data\") pod \"cinder-backup-0\" (UID: \"4a053127-e129-429c-9a7b-28e084c34269\") " pod="openstack/cinder-backup-0" Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.436213 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/466685e0-d49e-4d97-9436-7db7c10062c3-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"466685e0-d49e-4d97-9436-7db7c10062c3\") " pod="openstack/cinder-volume-volume1-0" Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.436230 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/466685e0-d49e-4d97-9436-7db7c10062c3-dev\") pod \"cinder-volume-volume1-0\" (UID: 
\"466685e0-d49e-4d97-9436-7db7c10062c3\") " pod="openstack/cinder-volume-volume1-0" Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.436246 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bktj2\" (UniqueName: \"kubernetes.io/projected/4a053127-e129-429c-9a7b-28e084c34269-kube-api-access-bktj2\") pod \"cinder-backup-0\" (UID: \"4a053127-e129-429c-9a7b-28e084c34269\") " pod="openstack/cinder-backup-0" Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.436262 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4a053127-e129-429c-9a7b-28e084c34269-sys\") pod \"cinder-backup-0\" (UID: \"4a053127-e129-429c-9a7b-28e084c34269\") " pod="openstack/cinder-backup-0" Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.436276 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4a053127-e129-429c-9a7b-28e084c34269-config-data-custom\") pod \"cinder-backup-0\" (UID: \"4a053127-e129-429c-9a7b-28e084c34269\") " pod="openstack/cinder-backup-0" Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.436305 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4a053127-e129-429c-9a7b-28e084c34269-run\") pod \"cinder-backup-0\" (UID: \"4a053127-e129-429c-9a7b-28e084c34269\") " pod="openstack/cinder-backup-0" Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.436324 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/4a053127-e129-429c-9a7b-28e084c34269-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"4a053127-e129-429c-9a7b-28e084c34269\") " pod="openstack/cinder-backup-0" Jan 20 19:20:11 crc 
kubenswrapper[4773]: I0120 19:20:11.436349 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9bwc\" (UniqueName: \"kubernetes.io/projected/466685e0-d49e-4d97-9436-7db7c10062c3-kube-api-access-g9bwc\") pod \"cinder-volume-volume1-0\" (UID: \"466685e0-d49e-4d97-9436-7db7c10062c3\") " pod="openstack/cinder-volume-volume1-0" Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.436369 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/4a053127-e129-429c-9a7b-28e084c34269-dev\") pod \"cinder-backup-0\" (UID: \"4a053127-e129-429c-9a7b-28e084c34269\") " pod="openstack/cinder-backup-0" Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.436383 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/466685e0-d49e-4d97-9436-7db7c10062c3-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"466685e0-d49e-4d97-9436-7db7c10062c3\") " pod="openstack/cinder-volume-volume1-0" Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.436414 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/466685e0-d49e-4d97-9436-7db7c10062c3-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"466685e0-d49e-4d97-9436-7db7c10062c3\") " pod="openstack/cinder-volume-volume1-0" Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.436438 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/466685e0-d49e-4d97-9436-7db7c10062c3-run\") pod \"cinder-volume-volume1-0\" (UID: \"466685e0-d49e-4d97-9436-7db7c10062c3\") " pod="openstack/cinder-volume-volume1-0" Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.436453 4773 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/4a053127-e129-429c-9a7b-28e084c34269-etc-nvme\") pod \"cinder-backup-0\" (UID: \"4a053127-e129-429c-9a7b-28e084c34269\") " pod="openstack/cinder-backup-0" Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.436472 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/466685e0-d49e-4d97-9436-7db7c10062c3-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"466685e0-d49e-4d97-9436-7db7c10062c3\") " pod="openstack/cinder-volume-volume1-0" Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.436493 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4a053127-e129-429c-9a7b-28e084c34269-lib-modules\") pod \"cinder-backup-0\" (UID: \"4a053127-e129-429c-9a7b-28e084c34269\") " pod="openstack/cinder-backup-0" Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.436515 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/466685e0-d49e-4d97-9436-7db7c10062c3-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"466685e0-d49e-4d97-9436-7db7c10062c3\") " pod="openstack/cinder-volume-volume1-0" Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.436534 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a053127-e129-429c-9a7b-28e084c34269-scripts\") pod \"cinder-backup-0\" (UID: \"4a053127-e129-429c-9a7b-28e084c34269\") " pod="openstack/cinder-backup-0" Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.436548 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/466685e0-d49e-4d97-9436-7db7c10062c3-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"466685e0-d49e-4d97-9436-7db7c10062c3\") " pod="openstack/cinder-volume-volume1-0" Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.436569 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/466685e0-d49e-4d97-9436-7db7c10062c3-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"466685e0-d49e-4d97-9436-7db7c10062c3\") " pod="openstack/cinder-volume-volume1-0" Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.436585 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/4a053127-e129-429c-9a7b-28e084c34269-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"4a053127-e129-429c-9a7b-28e084c34269\") " pod="openstack/cinder-backup-0" Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.436602 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/466685e0-d49e-4d97-9436-7db7c10062c3-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"466685e0-d49e-4d97-9436-7db7c10062c3\") " pod="openstack/cinder-volume-volume1-0" Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.436620 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/466685e0-d49e-4d97-9436-7db7c10062c3-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"466685e0-d49e-4d97-9436-7db7c10062c3\") " pod="openstack/cinder-volume-volume1-0" Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.436643 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/4a053127-e129-429c-9a7b-28e084c34269-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"4a053127-e129-429c-9a7b-28e084c34269\") " pod="openstack/cinder-backup-0" Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.436663 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/4a053127-e129-429c-9a7b-28e084c34269-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"4a053127-e129-429c-9a7b-28e084c34269\") " pod="openstack/cinder-backup-0" Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.436677 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/466685e0-d49e-4d97-9436-7db7c10062c3-sys\") pod \"cinder-volume-volume1-0\" (UID: \"466685e0-d49e-4d97-9436-7db7c10062c3\") " pod="openstack/cinder-volume-volume1-0" Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.537742 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/466685e0-d49e-4d97-9436-7db7c10062c3-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"466685e0-d49e-4d97-9436-7db7c10062c3\") " pod="openstack/cinder-volume-volume1-0" Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.537798 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/4a053127-e129-429c-9a7b-28e084c34269-etc-nvme\") pod \"cinder-backup-0\" (UID: \"4a053127-e129-429c-9a7b-28e084c34269\") " pod="openstack/cinder-backup-0" Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.537816 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/466685e0-d49e-4d97-9436-7db7c10062c3-run\") pod \"cinder-volume-volume1-0\" (UID: \"466685e0-d49e-4d97-9436-7db7c10062c3\") " 
pod="openstack/cinder-volume-volume1-0" Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.537833 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/466685e0-d49e-4d97-9436-7db7c10062c3-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"466685e0-d49e-4d97-9436-7db7c10062c3\") " pod="openstack/cinder-volume-volume1-0" Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.537852 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4a053127-e129-429c-9a7b-28e084c34269-lib-modules\") pod \"cinder-backup-0\" (UID: \"4a053127-e129-429c-9a7b-28e084c34269\") " pod="openstack/cinder-backup-0" Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.537875 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/466685e0-d49e-4d97-9436-7db7c10062c3-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"466685e0-d49e-4d97-9436-7db7c10062c3\") " pod="openstack/cinder-volume-volume1-0" Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.537894 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/466685e0-d49e-4d97-9436-7db7c10062c3-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"466685e0-d49e-4d97-9436-7db7c10062c3\") " pod="openstack/cinder-volume-volume1-0" Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.537909 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a053127-e129-429c-9a7b-28e084c34269-scripts\") pod \"cinder-backup-0\" (UID: \"4a053127-e129-429c-9a7b-28e084c34269\") " pod="openstack/cinder-backup-0" Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.537944 4773 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/466685e0-d49e-4d97-9436-7db7c10062c3-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"466685e0-d49e-4d97-9436-7db7c10062c3\") " pod="openstack/cinder-volume-volume1-0" Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.537961 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/4a053127-e129-429c-9a7b-28e084c34269-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"4a053127-e129-429c-9a7b-28e084c34269\") " pod="openstack/cinder-backup-0" Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.537980 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/466685e0-d49e-4d97-9436-7db7c10062c3-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"466685e0-d49e-4d97-9436-7db7c10062c3\") " pod="openstack/cinder-volume-volume1-0" Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.538011 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/466685e0-d49e-4d97-9436-7db7c10062c3-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"466685e0-d49e-4d97-9436-7db7c10062c3\") " pod="openstack/cinder-volume-volume1-0" Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.538032 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4a053127-e129-429c-9a7b-28e084c34269-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"4a053127-e129-429c-9a7b-28e084c34269\") " pod="openstack/cinder-backup-0" Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.538057 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/4a053127-e129-429c-9a7b-28e084c34269-etc-iscsi\") pod \"cinder-backup-0\" (UID: 
\"4a053127-e129-429c-9a7b-28e084c34269\") " pod="openstack/cinder-backup-0" Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.538072 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/466685e0-d49e-4d97-9436-7db7c10062c3-sys\") pod \"cinder-volume-volume1-0\" (UID: \"466685e0-d49e-4d97-9436-7db7c10062c3\") " pod="openstack/cinder-volume-volume1-0" Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.538094 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a053127-e129-429c-9a7b-28e084c34269-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"4a053127-e129-429c-9a7b-28e084c34269\") " pod="openstack/cinder-backup-0" Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.538113 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/4a053127-e129-429c-9a7b-28e084c34269-ceph\") pod \"cinder-backup-0\" (UID: \"4a053127-e129-429c-9a7b-28e084c34269\") " pod="openstack/cinder-backup-0" Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.538128 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/466685e0-d49e-4d97-9436-7db7c10062c3-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"466685e0-d49e-4d97-9436-7db7c10062c3\") " pod="openstack/cinder-volume-volume1-0" Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.538155 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/466685e0-d49e-4d97-9436-7db7c10062c3-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"466685e0-d49e-4d97-9436-7db7c10062c3\") " pod="openstack/cinder-volume-volume1-0" Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.538176 4773 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/466685e0-d49e-4d97-9436-7db7c10062c3-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"466685e0-d49e-4d97-9436-7db7c10062c3\") " pod="openstack/cinder-volume-volume1-0" Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.538203 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/4a053127-e129-429c-9a7b-28e084c34269-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"4a053127-e129-429c-9a7b-28e084c34269\") " pod="openstack/cinder-backup-0" Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.538220 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a053127-e129-429c-9a7b-28e084c34269-config-data\") pod \"cinder-backup-0\" (UID: \"4a053127-e129-429c-9a7b-28e084c34269\") " pod="openstack/cinder-backup-0" Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.538241 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/466685e0-d49e-4d97-9436-7db7c10062c3-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"466685e0-d49e-4d97-9436-7db7c10062c3\") " pod="openstack/cinder-volume-volume1-0" Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.538256 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/466685e0-d49e-4d97-9436-7db7c10062c3-dev\") pod \"cinder-volume-volume1-0\" (UID: \"466685e0-d49e-4d97-9436-7db7c10062c3\") " pod="openstack/cinder-volume-volume1-0" Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.538281 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bktj2\" (UniqueName: 
\"kubernetes.io/projected/4a053127-e129-429c-9a7b-28e084c34269-kube-api-access-bktj2\") pod \"cinder-backup-0\" (UID: \"4a053127-e129-429c-9a7b-28e084c34269\") " pod="openstack/cinder-backup-0" Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.538298 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4a053127-e129-429c-9a7b-28e084c34269-sys\") pod \"cinder-backup-0\" (UID: \"4a053127-e129-429c-9a7b-28e084c34269\") " pod="openstack/cinder-backup-0" Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.538312 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4a053127-e129-429c-9a7b-28e084c34269-config-data-custom\") pod \"cinder-backup-0\" (UID: \"4a053127-e129-429c-9a7b-28e084c34269\") " pod="openstack/cinder-backup-0" Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.538341 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4a053127-e129-429c-9a7b-28e084c34269-run\") pod \"cinder-backup-0\" (UID: \"4a053127-e129-429c-9a7b-28e084c34269\") " pod="openstack/cinder-backup-0" Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.538361 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/4a053127-e129-429c-9a7b-28e084c34269-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"4a053127-e129-429c-9a7b-28e084c34269\") " pod="openstack/cinder-backup-0" Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.538382 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9bwc\" (UniqueName: \"kubernetes.io/projected/466685e0-d49e-4d97-9436-7db7c10062c3-kube-api-access-g9bwc\") pod \"cinder-volume-volume1-0\" (UID: \"466685e0-d49e-4d97-9436-7db7c10062c3\") " 
pod="openstack/cinder-volume-volume1-0" Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.538407 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/4a053127-e129-429c-9a7b-28e084c34269-dev\") pod \"cinder-backup-0\" (UID: \"4a053127-e129-429c-9a7b-28e084c34269\") " pod="openstack/cinder-backup-0" Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.538422 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/466685e0-d49e-4d97-9436-7db7c10062c3-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"466685e0-d49e-4d97-9436-7db7c10062c3\") " pod="openstack/cinder-volume-volume1-0" Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.539381 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/466685e0-d49e-4d97-9436-7db7c10062c3-etc-iscsi\") pod \"cinder-volume-volume1-0\" (UID: \"466685e0-d49e-4d97-9436-7db7c10062c3\") " pod="openstack/cinder-volume-volume1-0" Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.539462 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/4a053127-e129-429c-9a7b-28e084c34269-etc-nvme\") pod \"cinder-backup-0\" (UID: \"4a053127-e129-429c-9a7b-28e084c34269\") " pod="openstack/cinder-backup-0" Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.539485 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/466685e0-d49e-4d97-9436-7db7c10062c3-run\") pod \"cinder-volume-volume1-0\" (UID: \"466685e0-d49e-4d97-9436-7db7c10062c3\") " pod="openstack/cinder-volume-volume1-0" Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.539513 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: 
\"kubernetes.io/host-path/466685e0-d49e-4d97-9436-7db7c10062c3-etc-nvme\") pod \"cinder-volume-volume1-0\" (UID: \"466685e0-d49e-4d97-9436-7db7c10062c3\") " pod="openstack/cinder-volume-volume1-0" Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.539547 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4a053127-e129-429c-9a7b-28e084c34269-lib-modules\") pod \"cinder-backup-0\" (UID: \"4a053127-e129-429c-9a7b-28e084c34269\") " pod="openstack/cinder-backup-0" Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.540073 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/466685e0-d49e-4d97-9436-7db7c10062c3-var-lib-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"466685e0-d49e-4d97-9436-7db7c10062c3\") " pod="openstack/cinder-volume-volume1-0" Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.540112 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/4a053127-e129-429c-9a7b-28e084c34269-etc-machine-id\") pod \"cinder-backup-0\" (UID: \"4a053127-e129-429c-9a7b-28e084c34269\") " pod="openstack/cinder-backup-0" Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.540139 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/4a053127-e129-429c-9a7b-28e084c34269-etc-iscsi\") pod \"cinder-backup-0\" (UID: \"4a053127-e129-429c-9a7b-28e084c34269\") " pod="openstack/cinder-backup-0" Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.540162 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/466685e0-d49e-4d97-9436-7db7c10062c3-sys\") pod \"cinder-volume-volume1-0\" (UID: \"466685e0-d49e-4d97-9436-7db7c10062c3\") " pod="openstack/cinder-volume-volume1-0" Jan 20 19:20:11 crc 
kubenswrapper[4773]: I0120 19:20:11.540183 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/466685e0-d49e-4d97-9436-7db7c10062c3-etc-machine-id\") pod \"cinder-volume-volume1-0\" (UID: \"466685e0-d49e-4d97-9436-7db7c10062c3\") " pod="openstack/cinder-volume-volume1-0" Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.540495 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/4a053127-e129-429c-9a7b-28e084c34269-var-locks-cinder\") pod \"cinder-backup-0\" (UID: \"4a053127-e129-429c-9a7b-28e084c34269\") " pod="openstack/cinder-backup-0" Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.540509 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/466685e0-d49e-4d97-9436-7db7c10062c3-var-locks-cinder\") pod \"cinder-volume-volume1-0\" (UID: \"466685e0-d49e-4d97-9436-7db7c10062c3\") " pod="openstack/cinder-volume-volume1-0" Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.540574 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/466685e0-d49e-4d97-9436-7db7c10062c3-dev\") pod \"cinder-volume-volume1-0\" (UID: \"466685e0-d49e-4d97-9436-7db7c10062c3\") " pod="openstack/cinder-volume-volume1-0" Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.540617 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/4a053127-e129-429c-9a7b-28e084c34269-var-lib-cinder\") pod \"cinder-backup-0\" (UID: \"4a053127-e129-429c-9a7b-28e084c34269\") " pod="openstack/cinder-backup-0" Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.540694 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: 
\"kubernetes.io/host-path/466685e0-d49e-4d97-9436-7db7c10062c3-var-locks-brick\") pod \"cinder-volume-volume1-0\" (UID: \"466685e0-d49e-4d97-9436-7db7c10062c3\") " pod="openstack/cinder-volume-volume1-0" Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.540697 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/4a053127-e129-429c-9a7b-28e084c34269-var-locks-brick\") pod \"cinder-backup-0\" (UID: \"4a053127-e129-429c-9a7b-28e084c34269\") " pod="openstack/cinder-backup-0" Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.540832 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4a053127-e129-429c-9a7b-28e084c34269-sys\") pod \"cinder-backup-0\" (UID: \"4a053127-e129-429c-9a7b-28e084c34269\") " pod="openstack/cinder-backup-0" Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.540890 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4a053127-e129-429c-9a7b-28e084c34269-run\") pod \"cinder-backup-0\" (UID: \"4a053127-e129-429c-9a7b-28e084c34269\") " pod="openstack/cinder-backup-0" Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.541005 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/466685e0-d49e-4d97-9436-7db7c10062c3-lib-modules\") pod \"cinder-volume-volume1-0\" (UID: \"466685e0-d49e-4d97-9436-7db7c10062c3\") " pod="openstack/cinder-volume-volume1-0" Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.541265 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/4a053127-e129-429c-9a7b-28e084c34269-dev\") pod \"cinder-backup-0\" (UID: \"4a053127-e129-429c-9a7b-28e084c34269\") " pod="openstack/cinder-backup-0" Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.547667 4773 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/466685e0-d49e-4d97-9436-7db7c10062c3-combined-ca-bundle\") pod \"cinder-volume-volume1-0\" (UID: \"466685e0-d49e-4d97-9436-7db7c10062c3\") " pod="openstack/cinder-volume-volume1-0" Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.548734 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/466685e0-d49e-4d97-9436-7db7c10062c3-scripts\") pod \"cinder-volume-volume1-0\" (UID: \"466685e0-d49e-4d97-9436-7db7c10062c3\") " pod="openstack/cinder-volume-volume1-0" Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.559981 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/466685e0-d49e-4d97-9436-7db7c10062c3-config-data-custom\") pod \"cinder-volume-volume1-0\" (UID: \"466685e0-d49e-4d97-9436-7db7c10062c3\") " pod="openstack/cinder-volume-volume1-0" Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.572496 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/4a053127-e129-429c-9a7b-28e084c34269-ceph\") pod \"cinder-backup-0\" (UID: \"4a053127-e129-429c-9a7b-28e084c34269\") " pod="openstack/cinder-backup-0" Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.578655 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9bwc\" (UniqueName: \"kubernetes.io/projected/466685e0-d49e-4d97-9436-7db7c10062c3-kube-api-access-g9bwc\") pod \"cinder-volume-volume1-0\" (UID: \"466685e0-d49e-4d97-9436-7db7c10062c3\") " pod="openstack/cinder-volume-volume1-0" Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.581691 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bktj2\" (UniqueName: 
\"kubernetes.io/projected/4a053127-e129-429c-9a7b-28e084c34269-kube-api-access-bktj2\") pod \"cinder-backup-0\" (UID: \"4a053127-e129-429c-9a7b-28e084c34269\") " pod="openstack/cinder-backup-0" Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.582298 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/466685e0-d49e-4d97-9436-7db7c10062c3-ceph\") pod \"cinder-volume-volume1-0\" (UID: \"466685e0-d49e-4d97-9436-7db7c10062c3\") " pod="openstack/cinder-volume-volume1-0" Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.582729 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4a053127-e129-429c-9a7b-28e084c34269-scripts\") pod \"cinder-backup-0\" (UID: \"4a053127-e129-429c-9a7b-28e084c34269\") " pod="openstack/cinder-backup-0" Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.582866 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4a053127-e129-429c-9a7b-28e084c34269-combined-ca-bundle\") pod \"cinder-backup-0\" (UID: \"4a053127-e129-429c-9a7b-28e084c34269\") " pod="openstack/cinder-backup-0" Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.583306 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4a053127-e129-429c-9a7b-28e084c34269-config-data\") pod \"cinder-backup-0\" (UID: \"4a053127-e129-429c-9a7b-28e084c34269\") " pod="openstack/cinder-backup-0" Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.583311 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/4a053127-e129-429c-9a7b-28e084c34269-config-data-custom\") pod \"cinder-backup-0\" (UID: \"4a053127-e129-429c-9a7b-28e084c34269\") " pod="openstack/cinder-backup-0" Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 
19:20:11.585081 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/466685e0-d49e-4d97-9436-7db7c10062c3-config-data\") pod \"cinder-volume-volume1-0\" (UID: \"466685e0-d49e-4d97-9436-7db7c10062c3\") " pod="openstack/cinder-volume-volume1-0" Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.672821 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-backup-0" Jan 20 19:20:11 crc kubenswrapper[4773]: I0120 19:20:11.691869 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-volume-volume1-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.071304 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.074434 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.082217 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.082357 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-9vtkh" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.082459 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.082848 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.087063 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.136452 4773 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/glance-default-internal-api-0"] Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.138360 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.146486 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.146805 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.149023 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.157422 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-create-5vk9g"] Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.158564 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-5vk9g" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.162713 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/696e3ee3-25fa-4102-b483-1781d00bb18f-logs\") pod \"glance-default-external-api-0\" (UID: \"696e3ee3-25fa-4102-b483-1781d00bb18f\") " pod="openstack/glance-default-external-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.162816 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/696e3ee3-25fa-4102-b483-1781d00bb18f-ceph\") pod \"glance-default-external-api-0\" (UID: \"696e3ee3-25fa-4102-b483-1781d00bb18f\") " pod="openstack/glance-default-external-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.162881 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"696e3ee3-25fa-4102-b483-1781d00bb18f\") " pod="openstack/glance-default-external-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.162955 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/696e3ee3-25fa-4102-b483-1781d00bb18f-scripts\") pod \"glance-default-external-api-0\" (UID: \"696e3ee3-25fa-4102-b483-1781d00bb18f\") " pod="openstack/glance-default-external-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.163000 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/696e3ee3-25fa-4102-b483-1781d00bb18f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"696e3ee3-25fa-4102-b483-1781d00bb18f\") " 
pod="openstack/glance-default-external-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.163041 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/696e3ee3-25fa-4102-b483-1781d00bb18f-config-data\") pod \"glance-default-external-api-0\" (UID: \"696e3ee3-25fa-4102-b483-1781d00bb18f\") " pod="openstack/glance-default-external-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.163068 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfg77\" (UniqueName: \"kubernetes.io/projected/696e3ee3-25fa-4102-b483-1781d00bb18f-kube-api-access-bfg77\") pod \"glance-default-external-api-0\" (UID: \"696e3ee3-25fa-4102-b483-1781d00bb18f\") " pod="openstack/glance-default-external-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.163107 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/696e3ee3-25fa-4102-b483-1781d00bb18f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"696e3ee3-25fa-4102-b483-1781d00bb18f\") " pod="openstack/glance-default-external-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.163137 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/696e3ee3-25fa-4102-b483-1781d00bb18f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"696e3ee3-25fa-4102-b483-1781d00bb18f\") " pod="openstack/glance-default-external-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.215407 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-5vk9g"] Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.267289 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"696e3ee3-25fa-4102-b483-1781d00bb18f\") " pod="openstack/glance-default-external-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.267361 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/117c4f3b-d438-4f73-966c-378c28f67460-scripts\") pod \"glance-default-internal-api-0\" (UID: \"117c4f3b-d438-4f73-966c-378c28f67460\") " pod="openstack/glance-default-internal-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.267413 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/696e3ee3-25fa-4102-b483-1781d00bb18f-scripts\") pod \"glance-default-external-api-0\" (UID: \"696e3ee3-25fa-4102-b483-1781d00bb18f\") " pod="openstack/glance-default-external-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.267445 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/117c4f3b-d438-4f73-966c-378c28f67460-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"117c4f3b-d438-4f73-966c-378c28f67460\") " pod="openstack/glance-default-internal-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.267494 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/696e3ee3-25fa-4102-b483-1781d00bb18f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"696e3ee3-25fa-4102-b483-1781d00bb18f\") " pod="openstack/glance-default-external-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.267529 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/696e3ee3-25fa-4102-b483-1781d00bb18f-config-data\") pod \"glance-default-external-api-0\" (UID: \"696e3ee3-25fa-4102-b483-1781d00bb18f\") " pod="openstack/glance-default-external-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.267564 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfltt\" (UniqueName: \"kubernetes.io/projected/33fb43c4-3e2a-4034-b8aa-27c2a7e4acb6-kube-api-access-sfltt\") pod \"manila-db-create-5vk9g\" (UID: \"33fb43c4-3e2a-4034-b8aa-27c2a7e4acb6\") " pod="openstack/manila-db-create-5vk9g" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.267662 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bfg77\" (UniqueName: \"kubernetes.io/projected/696e3ee3-25fa-4102-b483-1781d00bb18f-kube-api-access-bfg77\") pod \"glance-default-external-api-0\" (UID: \"696e3ee3-25fa-4102-b483-1781d00bb18f\") " pod="openstack/glance-default-external-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.267701 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/696e3ee3-25fa-4102-b483-1781d00bb18f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"696e3ee3-25fa-4102-b483-1781d00bb18f\") " pod="openstack/glance-default-external-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.267787 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/696e3ee3-25fa-4102-b483-1781d00bb18f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"696e3ee3-25fa-4102-b483-1781d00bb18f\") " pod="openstack/glance-default-external-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.267871 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/33fb43c4-3e2a-4034-b8aa-27c2a7e4acb6-operator-scripts\") pod \"manila-db-create-5vk9g\" (UID: \"33fb43c4-3e2a-4034-b8aa-27c2a7e4acb6\") " pod="openstack/manila-db-create-5vk9g" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.267907 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnsdj\" (UniqueName: \"kubernetes.io/projected/117c4f3b-d438-4f73-966c-378c28f67460-kube-api-access-fnsdj\") pod \"glance-default-internal-api-0\" (UID: \"117c4f3b-d438-4f73-966c-378c28f67460\") " pod="openstack/glance-default-internal-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.267968 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/117c4f3b-d438-4f73-966c-378c28f67460-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"117c4f3b-d438-4f73-966c-378c28f67460\") " pod="openstack/glance-default-internal-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.267993 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/696e3ee3-25fa-4102-b483-1781d00bb18f-logs\") pod \"glance-default-external-api-0\" (UID: \"696e3ee3-25fa-4102-b483-1781d00bb18f\") " pod="openstack/glance-default-external-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.268018 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"117c4f3b-d438-4f73-966c-378c28f67460\") " pod="openstack/glance-default-internal-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.268041 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/117c4f3b-d438-4f73-966c-378c28f67460-config-data\") pod \"glance-default-internal-api-0\" (UID: \"117c4f3b-d438-4f73-966c-378c28f67460\") " pod="openstack/glance-default-internal-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.268081 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/117c4f3b-d438-4f73-966c-378c28f67460-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"117c4f3b-d438-4f73-966c-378c28f67460\") " pod="openstack/glance-default-internal-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.268103 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/117c4f3b-d438-4f73-966c-378c28f67460-logs\") pod \"glance-default-internal-api-0\" (UID: \"117c4f3b-d438-4f73-966c-378c28f67460\") " pod="openstack/glance-default-internal-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.268126 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/117c4f3b-d438-4f73-966c-378c28f67460-ceph\") pod \"glance-default-internal-api-0\" (UID: \"117c4f3b-d438-4f73-966c-378c28f67460\") " pod="openstack/glance-default-internal-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.268161 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/696e3ee3-25fa-4102-b483-1781d00bb18f-ceph\") pod \"glance-default-external-api-0\" (UID: \"696e3ee3-25fa-4102-b483-1781d00bb18f\") " pod="openstack/glance-default-external-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.269279 4773 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"696e3ee3-25fa-4102-b483-1781d00bb18f\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-external-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.269774 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-c4ba-account-create-update-47gmh"] Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.272299 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/696e3ee3-25fa-4102-b483-1781d00bb18f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"696e3ee3-25fa-4102-b483-1781d00bb18f\") " pod="openstack/glance-default-external-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.272979 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/696e3ee3-25fa-4102-b483-1781d00bb18f-logs\") pod \"glance-default-external-api-0\" (UID: \"696e3ee3-25fa-4102-b483-1781d00bb18f\") " pod="openstack/glance-default-external-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.275505 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-c4ba-account-create-update-47gmh" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.279511 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/696e3ee3-25fa-4102-b483-1781d00bb18f-scripts\") pod \"glance-default-external-api-0\" (UID: \"696e3ee3-25fa-4102-b483-1781d00bb18f\") " pod="openstack/glance-default-external-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.280468 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-db-secret" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.282913 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-c4ba-account-create-update-47gmh"] Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.284126 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/696e3ee3-25fa-4102-b483-1781d00bb18f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"696e3ee3-25fa-4102-b483-1781d00bb18f\") " pod="openstack/glance-default-external-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.284353 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/696e3ee3-25fa-4102-b483-1781d00bb18f-ceph\") pod \"glance-default-external-api-0\" (UID: \"696e3ee3-25fa-4102-b483-1781d00bb18f\") " pod="openstack/glance-default-external-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.288538 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/696e3ee3-25fa-4102-b483-1781d00bb18f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"696e3ee3-25fa-4102-b483-1781d00bb18f\") " pod="openstack/glance-default-external-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.289432 4773 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/696e3ee3-25fa-4102-b483-1781d00bb18f-config-data\") pod \"glance-default-external-api-0\" (UID: \"696e3ee3-25fa-4102-b483-1781d00bb18f\") " pod="openstack/glance-default-external-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.303080 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bfg77\" (UniqueName: \"kubernetes.io/projected/696e3ee3-25fa-4102-b483-1781d00bb18f-kube-api-access-bfg77\") pod \"glance-default-external-api-0\" (UID: \"696e3ee3-25fa-4102-b483-1781d00bb18f\") " pod="openstack/glance-default-external-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.371202 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/33fb43c4-3e2a-4034-b8aa-27c2a7e4acb6-operator-scripts\") pod \"manila-db-create-5vk9g\" (UID: \"33fb43c4-3e2a-4034-b8aa-27c2a7e4acb6\") " pod="openstack/manila-db-create-5vk9g" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.371260 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnsdj\" (UniqueName: \"kubernetes.io/projected/117c4f3b-d438-4f73-966c-378c28f67460-kube-api-access-fnsdj\") pod \"glance-default-internal-api-0\" (UID: \"117c4f3b-d438-4f73-966c-378c28f67460\") " pod="openstack/glance-default-internal-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.371284 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/80285eae-2998-47ab-bcd6-e9905e2e71d4-operator-scripts\") pod \"manila-c4ba-account-create-update-47gmh\" (UID: \"80285eae-2998-47ab-bcd6-e9905e2e71d4\") " pod="openstack/manila-c4ba-account-create-update-47gmh" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.371328 4773 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/117c4f3b-d438-4f73-966c-378c28f67460-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"117c4f3b-d438-4f73-966c-378c28f67460\") " pod="openstack/glance-default-internal-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.371358 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"117c4f3b-d438-4f73-966c-378c28f67460\") " pod="openstack/glance-default-internal-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.371386 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/117c4f3b-d438-4f73-966c-378c28f67460-config-data\") pod \"glance-default-internal-api-0\" (UID: \"117c4f3b-d438-4f73-966c-378c28f67460\") " pod="openstack/glance-default-internal-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.371421 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/117c4f3b-d438-4f73-966c-378c28f67460-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"117c4f3b-d438-4f73-966c-378c28f67460\") " pod="openstack/glance-default-internal-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.371442 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/117c4f3b-d438-4f73-966c-378c28f67460-logs\") pod \"glance-default-internal-api-0\" (UID: \"117c4f3b-d438-4f73-966c-378c28f67460\") " pod="openstack/glance-default-internal-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.371462 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: 
\"kubernetes.io/projected/117c4f3b-d438-4f73-966c-378c28f67460-ceph\") pod \"glance-default-internal-api-0\" (UID: \"117c4f3b-d438-4f73-966c-378c28f67460\") " pod="openstack/glance-default-internal-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.371498 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkg9v\" (UniqueName: \"kubernetes.io/projected/80285eae-2998-47ab-bcd6-e9905e2e71d4-kube-api-access-lkg9v\") pod \"manila-c4ba-account-create-update-47gmh\" (UID: \"80285eae-2998-47ab-bcd6-e9905e2e71d4\") " pod="openstack/manila-c4ba-account-create-update-47gmh" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.371548 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/117c4f3b-d438-4f73-966c-378c28f67460-scripts\") pod \"glance-default-internal-api-0\" (UID: \"117c4f3b-d438-4f73-966c-378c28f67460\") " pod="openstack/glance-default-internal-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.371578 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/117c4f3b-d438-4f73-966c-378c28f67460-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"117c4f3b-d438-4f73-966c-378c28f67460\") " pod="openstack/glance-default-internal-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.371620 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfltt\" (UniqueName: \"kubernetes.io/projected/33fb43c4-3e2a-4034-b8aa-27c2a7e4acb6-kube-api-access-sfltt\") pod \"manila-db-create-5vk9g\" (UID: \"33fb43c4-3e2a-4034-b8aa-27c2a7e4acb6\") " pod="openstack/manila-db-create-5vk9g" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.372644 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/117c4f3b-d438-4f73-966c-378c28f67460-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"117c4f3b-d438-4f73-966c-378c28f67460\") " pod="openstack/glance-default-internal-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.373145 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/117c4f3b-d438-4f73-966c-378c28f67460-logs\") pod \"glance-default-internal-api-0\" (UID: \"117c4f3b-d438-4f73-966c-378c28f67460\") " pod="openstack/glance-default-internal-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.373153 4773 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"117c4f3b-d438-4f73-966c-378c28f67460\") device mount path \"/mnt/openstack/pv12\"" pod="openstack/glance-default-internal-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.379310 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-0\" (UID: \"696e3ee3-25fa-4102-b483-1781d00bb18f\") " pod="openstack/glance-default-external-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.383455 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/33fb43c4-3e2a-4034-b8aa-27c2a7e4acb6-operator-scripts\") pod \"manila-db-create-5vk9g\" (UID: \"33fb43c4-3e2a-4034-b8aa-27c2a7e4acb6\") " pod="openstack/manila-db-create-5vk9g" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.386453 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/117c4f3b-d438-4f73-966c-378c28f67460-scripts\") pod \"glance-default-internal-api-0\" (UID: 
\"117c4f3b-d438-4f73-966c-378c28f67460\") " pod="openstack/glance-default-internal-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.387261 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/117c4f3b-d438-4f73-966c-378c28f67460-ceph\") pod \"glance-default-internal-api-0\" (UID: \"117c4f3b-d438-4f73-966c-378c28f67460\") " pod="openstack/glance-default-internal-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.387685 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/117c4f3b-d438-4f73-966c-378c28f67460-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"117c4f3b-d438-4f73-966c-378c28f67460\") " pod="openstack/glance-default-internal-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.388534 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfltt\" (UniqueName: \"kubernetes.io/projected/33fb43c4-3e2a-4034-b8aa-27c2a7e4acb6-kube-api-access-sfltt\") pod \"manila-db-create-5vk9g\" (UID: \"33fb43c4-3e2a-4034-b8aa-27c2a7e4acb6\") " pod="openstack/manila-db-create-5vk9g" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.389127 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/117c4f3b-d438-4f73-966c-378c28f67460-config-data\") pod \"glance-default-internal-api-0\" (UID: \"117c4f3b-d438-4f73-966c-378c28f67460\") " pod="openstack/glance-default-internal-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.390021 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnsdj\" (UniqueName: \"kubernetes.io/projected/117c4f3b-d438-4f73-966c-378c28f67460-kube-api-access-fnsdj\") pod \"glance-default-internal-api-0\" (UID: \"117c4f3b-d438-4f73-966c-378c28f67460\") " pod="openstack/glance-default-internal-api-0" Jan 
20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.391462 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/117c4f3b-d438-4f73-966c-378c28f67460-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"117c4f3b-d438-4f73-966c-378c28f67460\") " pod="openstack/glance-default-internal-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.410815 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.414167 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-internal-api-0\" (UID: \"117c4f3b-d438-4f73-966c-378c28f67460\") " pod="openstack/glance-default-internal-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.471242 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-backup-0"] Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.477039 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/80285eae-2998-47ab-bcd6-e9905e2e71d4-operator-scripts\") pod \"manila-c4ba-account-create-update-47gmh\" (UID: \"80285eae-2998-47ab-bcd6-e9905e2e71d4\") " pod="openstack/manila-c4ba-account-create-update-47gmh" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.477204 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lkg9v\" (UniqueName: \"kubernetes.io/projected/80285eae-2998-47ab-bcd6-e9905e2e71d4-kube-api-access-lkg9v\") pod \"manila-c4ba-account-create-update-47gmh\" (UID: \"80285eae-2998-47ab-bcd6-e9905e2e71d4\") " pod="openstack/manila-c4ba-account-create-update-47gmh" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.478057 4773 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/80285eae-2998-47ab-bcd6-e9905e2e71d4-operator-scripts\") pod \"manila-c4ba-account-create-update-47gmh\" (UID: \"80285eae-2998-47ab-bcd6-e9905e2e71d4\") " pod="openstack/manila-c4ba-account-create-update-47gmh" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.487834 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.497557 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkg9v\" (UniqueName: \"kubernetes.io/projected/80285eae-2998-47ab-bcd6-e9905e2e71d4-kube-api-access-lkg9v\") pod \"manila-c4ba-account-create-update-47gmh\" (UID: \"80285eae-2998-47ab-bcd6-e9905e2e71d4\") " pod="openstack/manila-c4ba-account-create-update-47gmh" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.500496 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-5vk9g" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.512387 4773 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.602732 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-c4ba-account-create-update-47gmh" Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.653473 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-volume-volume1-0"] Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.967977 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"466685e0-d49e-4d97-9436-7db7c10062c3","Type":"ContainerStarted","Data":"20ce3931ba17bb76a699fb39bbe5e116fea9b64e0577ff1ff4b45da69c5bb50e"} Jan 20 19:20:12 crc kubenswrapper[4773]: I0120 19:20:12.969293 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"4a053127-e129-429c-9a7b-28e084c34269","Type":"ContainerStarted","Data":"64cfe8b2895152a3a7522fd5882e79fbdc6059a51ba538930ce3eb42a4ed6b48"} Jan 20 19:20:13 crc kubenswrapper[4773]: I0120 19:20:13.174630 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Jan 20 19:20:13 crc kubenswrapper[4773]: I0120 19:20:13.199958 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-create-5vk9g"] Jan 20 19:20:13 crc kubenswrapper[4773]: I0120 19:20:13.253720 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-c4ba-account-create-update-47gmh"] Jan 20 19:20:13 crc kubenswrapper[4773]: W0120 19:20:13.296128 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod80285eae_2998_47ab_bcd6_e9905e2e71d4.slice/crio-adcb1dcabc420f72bf2c492539c5a7c129965c753d41de4d6421e450e3e9d1da WatchSource:0}: Error finding container adcb1dcabc420f72bf2c492539c5a7c129965c753d41de4d6421e450e3e9d1da: Status 404 returned error can't find the container with id adcb1dcabc420f72bf2c492539c5a7c129965c753d41de4d6421e450e3e9d1da Jan 20 19:20:13 crc kubenswrapper[4773]: I0120 19:20:13.985071 4773 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"466685e0-d49e-4d97-9436-7db7c10062c3","Type":"ContainerStarted","Data":"d055112d14f2e629a8e001df39cd4874e98611d7c257b4025b57d54a9eee8e14"} Jan 20 19:20:13 crc kubenswrapper[4773]: I0120 19:20:13.990744 4773 generic.go:334] "Generic (PLEG): container finished" podID="33fb43c4-3e2a-4034-b8aa-27c2a7e4acb6" containerID="0827af644398a715247b27083b551541f42dae2b0a3150620bd9104ee37e5138" exitCode=0 Jan 20 19:20:13 crc kubenswrapper[4773]: I0120 19:20:13.990865 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-5vk9g" event={"ID":"33fb43c4-3e2a-4034-b8aa-27c2a7e4acb6","Type":"ContainerDied","Data":"0827af644398a715247b27083b551541f42dae2b0a3150620bd9104ee37e5138"} Jan 20 19:20:13 crc kubenswrapper[4773]: I0120 19:20:13.990889 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-5vk9g" event={"ID":"33fb43c4-3e2a-4034-b8aa-27c2a7e4acb6","Type":"ContainerStarted","Data":"b78ed55fd68aa09da51ca8d08d5d7237f7835a008ac40fd73221ce4e306eadee"} Jan 20 19:20:13 crc kubenswrapper[4773]: I0120 19:20:13.999816 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"117c4f3b-d438-4f73-966c-378c28f67460","Type":"ContainerStarted","Data":"c695521bd759e3089142a9939da612b16228606f6c17b308e73ac5f6f71b1916"} Jan 20 19:20:14 crc kubenswrapper[4773]: I0120 19:20:14.003018 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"4a053127-e129-429c-9a7b-28e084c34269","Type":"ContainerStarted","Data":"cd2494809157f0dc32b616d02961ab5a824cd251ab6ff4cf380d06eba77cf01b"} Jan 20 19:20:14 crc kubenswrapper[4773]: I0120 19:20:14.011057 4773 generic.go:334] "Generic (PLEG): container finished" podID="80285eae-2998-47ab-bcd6-e9905e2e71d4" containerID="572888ce06611feefec987174ae3eee5c299fc9038199817efe0a94604e5aae9" exitCode=0 Jan 20 19:20:14 crc kubenswrapper[4773]: I0120 
19:20:14.011105 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-c4ba-account-create-update-47gmh" event={"ID":"80285eae-2998-47ab-bcd6-e9905e2e71d4","Type":"ContainerDied","Data":"572888ce06611feefec987174ae3eee5c299fc9038199817efe0a94604e5aae9"} Jan 20 19:20:14 crc kubenswrapper[4773]: I0120 19:20:14.011136 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-c4ba-account-create-update-47gmh" event={"ID":"80285eae-2998-47ab-bcd6-e9905e2e71d4","Type":"ContainerStarted","Data":"adcb1dcabc420f72bf2c492539c5a7c129965c753d41de4d6421e450e3e9d1da"} Jan 20 19:20:14 crc kubenswrapper[4773]: I0120 19:20:14.104568 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Jan 20 19:20:14 crc kubenswrapper[4773]: W0120 19:20:14.109475 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod696e3ee3_25fa_4102_b483_1781d00bb18f.slice/crio-82121dc2b7e9cd6f30c4e080bd93ac3e6030da3d13efae7c8f28633edf131251 WatchSource:0}: Error finding container 82121dc2b7e9cd6f30c4e080bd93ac3e6030da3d13efae7c8f28633edf131251: Status 404 returned error can't find the container with id 82121dc2b7e9cd6f30c4e080bd93ac3e6030da3d13efae7c8f28633edf131251 Jan 20 19:20:15 crc kubenswrapper[4773]: I0120 19:20:15.032858 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"117c4f3b-d438-4f73-966c-378c28f67460","Type":"ContainerStarted","Data":"19fbddbc9b86f769303d79312c2d030f23e2ed3bedd010b7e3f1f8d4734deb27"} Jan 20 19:20:15 crc kubenswrapper[4773]: I0120 19:20:15.033662 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"117c4f3b-d438-4f73-966c-378c28f67460","Type":"ContainerStarted","Data":"d6094097fb5b6521d0f9eab177f854c1600b0455dcf9c4a951eef9840bbaa33d"} Jan 20 19:20:15 crc kubenswrapper[4773]: I0120 
19:20:15.038739 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-backup-0" event={"ID":"4a053127-e129-429c-9a7b-28e084c34269","Type":"ContainerStarted","Data":"8c943ae399231c0185d23dd80b5c32282626c1d7c00d8630f9220b5d9186232c"} Jan 20 19:20:15 crc kubenswrapper[4773]: I0120 19:20:15.047873 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"696e3ee3-25fa-4102-b483-1781d00bb18f","Type":"ContainerStarted","Data":"3d61afbbdb37be4e5d0f6699b89507fbdbcb2725381c635c03801cd69764d168"} Jan 20 19:20:15 crc kubenswrapper[4773]: I0120 19:20:15.047959 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"696e3ee3-25fa-4102-b483-1781d00bb18f","Type":"ContainerStarted","Data":"82121dc2b7e9cd6f30c4e080bd93ac3e6030da3d13efae7c8f28633edf131251"} Jan 20 19:20:15 crc kubenswrapper[4773]: I0120 19:20:15.050796 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-volume-volume1-0" event={"ID":"466685e0-d49e-4d97-9436-7db7c10062c3","Type":"ContainerStarted","Data":"7920c95131e3c4f69ab4387125c62c826f431aab0589ad7e26f57c9b92057e19"} Jan 20 19:20:15 crc kubenswrapper[4773]: I0120 19:20:15.086689 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=4.08666132 podStartE2EDuration="4.08666132s" podCreationTimestamp="2026-01-20 19:20:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 19:20:15.059584334 +0000 UTC m=+3007.981397378" watchObservedRunningTime="2026-01-20 19:20:15.08666132 +0000 UTC m=+3008.008474334" Jan 20 19:20:15 crc kubenswrapper[4773]: I0120 19:20:15.124584 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-volume-volume1-0" podStartSLOduration=3.149365459 
podStartE2EDuration="4.124530739s" podCreationTimestamp="2026-01-20 19:20:11 +0000 UTC" firstStartedPulling="2026-01-20 19:20:12.677349746 +0000 UTC m=+3005.599162770" lastFinishedPulling="2026-01-20 19:20:13.652515026 +0000 UTC m=+3006.574328050" observedRunningTime="2026-01-20 19:20:15.079716212 +0000 UTC m=+3008.001529256" watchObservedRunningTime="2026-01-20 19:20:15.124530739 +0000 UTC m=+3008.046343763" Jan 20 19:20:15 crc kubenswrapper[4773]: I0120 19:20:15.147485 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-backup-0" podStartSLOduration=3.299791759 podStartE2EDuration="4.147456995s" podCreationTimestamp="2026-01-20 19:20:11 +0000 UTC" firstStartedPulling="2026-01-20 19:20:12.512091357 +0000 UTC m=+3005.433904381" lastFinishedPulling="2026-01-20 19:20:13.359756603 +0000 UTC m=+3006.281569617" observedRunningTime="2026-01-20 19:20:15.129556251 +0000 UTC m=+3008.051369275" watchObservedRunningTime="2026-01-20 19:20:15.147456995 +0000 UTC m=+3008.069270019" Jan 20 19:20:15 crc kubenswrapper[4773]: I0120 19:20:15.469427 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-create-5vk9g" Jan 20 19:20:15 crc kubenswrapper[4773]: I0120 19:20:15.476265 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-c4ba-account-create-update-47gmh" Jan 20 19:20:15 crc kubenswrapper[4773]: I0120 19:20:15.667979 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/80285eae-2998-47ab-bcd6-e9905e2e71d4-operator-scripts\") pod \"80285eae-2998-47ab-bcd6-e9905e2e71d4\" (UID: \"80285eae-2998-47ab-bcd6-e9905e2e71d4\") " Jan 20 19:20:15 crc kubenswrapper[4773]: I0120 19:20:15.668108 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lkg9v\" (UniqueName: \"kubernetes.io/projected/80285eae-2998-47ab-bcd6-e9905e2e71d4-kube-api-access-lkg9v\") pod \"80285eae-2998-47ab-bcd6-e9905e2e71d4\" (UID: \"80285eae-2998-47ab-bcd6-e9905e2e71d4\") " Jan 20 19:20:15 crc kubenswrapper[4773]: I0120 19:20:15.668151 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/33fb43c4-3e2a-4034-b8aa-27c2a7e4acb6-operator-scripts\") pod \"33fb43c4-3e2a-4034-b8aa-27c2a7e4acb6\" (UID: \"33fb43c4-3e2a-4034-b8aa-27c2a7e4acb6\") " Jan 20 19:20:15 crc kubenswrapper[4773]: I0120 19:20:15.668310 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sfltt\" (UniqueName: \"kubernetes.io/projected/33fb43c4-3e2a-4034-b8aa-27c2a7e4acb6-kube-api-access-sfltt\") pod \"33fb43c4-3e2a-4034-b8aa-27c2a7e4acb6\" (UID: \"33fb43c4-3e2a-4034-b8aa-27c2a7e4acb6\") " Jan 20 19:20:15 crc kubenswrapper[4773]: I0120 19:20:15.668827 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33fb43c4-3e2a-4034-b8aa-27c2a7e4acb6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "33fb43c4-3e2a-4034-b8aa-27c2a7e4acb6" (UID: "33fb43c4-3e2a-4034-b8aa-27c2a7e4acb6"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 19:20:15 crc kubenswrapper[4773]: I0120 19:20:15.668827 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/80285eae-2998-47ab-bcd6-e9905e2e71d4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "80285eae-2998-47ab-bcd6-e9905e2e71d4" (UID: "80285eae-2998-47ab-bcd6-e9905e2e71d4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 19:20:15 crc kubenswrapper[4773]: I0120 19:20:15.670647 4773 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/80285eae-2998-47ab-bcd6-e9905e2e71d4-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 19:20:15 crc kubenswrapper[4773]: I0120 19:20:15.671137 4773 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/33fb43c4-3e2a-4034-b8aa-27c2a7e4acb6-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 19:20:15 crc kubenswrapper[4773]: I0120 19:20:15.674828 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33fb43c4-3e2a-4034-b8aa-27c2a7e4acb6-kube-api-access-sfltt" (OuterVolumeSpecName: "kube-api-access-sfltt") pod "33fb43c4-3e2a-4034-b8aa-27c2a7e4acb6" (UID: "33fb43c4-3e2a-4034-b8aa-27c2a7e4acb6"). InnerVolumeSpecName "kube-api-access-sfltt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 19:20:15 crc kubenswrapper[4773]: I0120 19:20:15.685241 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80285eae-2998-47ab-bcd6-e9905e2e71d4-kube-api-access-lkg9v" (OuterVolumeSpecName: "kube-api-access-lkg9v") pod "80285eae-2998-47ab-bcd6-e9905e2e71d4" (UID: "80285eae-2998-47ab-bcd6-e9905e2e71d4"). InnerVolumeSpecName "kube-api-access-lkg9v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 19:20:15 crc kubenswrapper[4773]: I0120 19:20:15.773478 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lkg9v\" (UniqueName: \"kubernetes.io/projected/80285eae-2998-47ab-bcd6-e9905e2e71d4-kube-api-access-lkg9v\") on node \"crc\" DevicePath \"\"" Jan 20 19:20:15 crc kubenswrapper[4773]: I0120 19:20:15.773520 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sfltt\" (UniqueName: \"kubernetes.io/projected/33fb43c4-3e2a-4034-b8aa-27c2a7e4acb6-kube-api-access-sfltt\") on node \"crc\" DevicePath \"\"" Jan 20 19:20:16 crc kubenswrapper[4773]: I0120 19:20:16.060018 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-create-5vk9g" event={"ID":"33fb43c4-3e2a-4034-b8aa-27c2a7e4acb6","Type":"ContainerDied","Data":"b78ed55fd68aa09da51ca8d08d5d7237f7835a008ac40fd73221ce4e306eadee"} Jan 20 19:20:16 crc kubenswrapper[4773]: I0120 19:20:16.060026 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-db-create-5vk9g" Jan 20 19:20:16 crc kubenswrapper[4773]: I0120 19:20:16.060890 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b78ed55fd68aa09da51ca8d08d5d7237f7835a008ac40fd73221ce4e306eadee" Jan 20 19:20:16 crc kubenswrapper[4773]: I0120 19:20:16.061864 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"696e3ee3-25fa-4102-b483-1781d00bb18f","Type":"ContainerStarted","Data":"f919d7cba009bbb481447339fe707497ba9577a667afaeccba3e1156016cc47c"} Jan 20 19:20:16 crc kubenswrapper[4773]: I0120 19:20:16.065373 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-c4ba-account-create-update-47gmh" event={"ID":"80285eae-2998-47ab-bcd6-e9905e2e71d4","Type":"ContainerDied","Data":"adcb1dcabc420f72bf2c492539c5a7c129965c753d41de4d6421e450e3e9d1da"} Jan 20 19:20:16 crc kubenswrapper[4773]: I0120 19:20:16.065470 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="adcb1dcabc420f72bf2c492539c5a7c129965c753d41de4d6421e450e3e9d1da" Jan 20 19:20:16 crc kubenswrapper[4773]: I0120 19:20:16.066350 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-c4ba-account-create-update-47gmh" Jan 20 19:20:16 crc kubenswrapper[4773]: I0120 19:20:16.089651 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.089628323 podStartE2EDuration="5.089628323s" podCreationTimestamp="2026-01-20 19:20:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 19:20:16.085124094 +0000 UTC m=+3009.006937138" watchObservedRunningTime="2026-01-20 19:20:16.089628323 +0000 UTC m=+3009.011441348" Jan 20 19:20:16 crc kubenswrapper[4773]: I0120 19:20:16.446886 4773 scope.go:117] "RemoveContainer" containerID="11935bc20d3c37ba36a8cfe2c5234ee70ed45d973936da23fb9892ece635d019" Jan 20 19:20:16 crc kubenswrapper[4773]: E0120 19:20:16.447453 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:20:16 crc kubenswrapper[4773]: I0120 19:20:16.740829 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-backup-0" Jan 20 19:20:16 crc kubenswrapper[4773]: I0120 19:20:16.740953 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-volume-volume1-0" Jan 20 19:20:17 crc kubenswrapper[4773]: I0120 19:20:17.625494 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-db-sync-r2zvz"] Jan 20 19:20:17 crc kubenswrapper[4773]: E0120 19:20:17.626434 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33fb43c4-3e2a-4034-b8aa-27c2a7e4acb6" 
containerName="mariadb-database-create"
Jan 20 19:20:17 crc kubenswrapper[4773]: I0120 19:20:17.626457 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="33fb43c4-3e2a-4034-b8aa-27c2a7e4acb6" containerName="mariadb-database-create"
Jan 20 19:20:17 crc kubenswrapper[4773]: E0120 19:20:17.626522 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80285eae-2998-47ab-bcd6-e9905e2e71d4" containerName="mariadb-account-create-update"
Jan 20 19:20:17 crc kubenswrapper[4773]: I0120 19:20:17.626536 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="80285eae-2998-47ab-bcd6-e9905e2e71d4" containerName="mariadb-account-create-update"
Jan 20 19:20:17 crc kubenswrapper[4773]: I0120 19:20:17.626841 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="80285eae-2998-47ab-bcd6-e9905e2e71d4" containerName="mariadb-account-create-update"
Jan 20 19:20:17 crc kubenswrapper[4773]: I0120 19:20:17.626876 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="33fb43c4-3e2a-4034-b8aa-27c2a7e4acb6" containerName="mariadb-database-create"
Jan 20 19:20:17 crc kubenswrapper[4773]: I0120 19:20:17.627801 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-r2zvz"
Jan 20 19:20:17 crc kubenswrapper[4773]: I0120 19:20:17.629788 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-qgfd7"
Jan 20 19:20:17 crc kubenswrapper[4773]: I0120 19:20:17.629838 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data"
Jan 20 19:20:17 crc kubenswrapper[4773]: I0120 19:20:17.642473 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-r2zvz"]
Jan 20 19:20:17 crc kubenswrapper[4773]: I0120 19:20:17.710448 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/32b245ce-84e1-4fbc-adef-ebfdd1e88d77-job-config-data\") pod \"manila-db-sync-r2zvz\" (UID: \"32b245ce-84e1-4fbc-adef-ebfdd1e88d77\") " pod="openstack/manila-db-sync-r2zvz"
Jan 20 19:20:17 crc kubenswrapper[4773]: I0120 19:20:17.710599 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32b245ce-84e1-4fbc-adef-ebfdd1e88d77-config-data\") pod \"manila-db-sync-r2zvz\" (UID: \"32b245ce-84e1-4fbc-adef-ebfdd1e88d77\") " pod="openstack/manila-db-sync-r2zvz"
Jan 20 19:20:17 crc kubenswrapper[4773]: I0120 19:20:17.710664 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32b245ce-84e1-4fbc-adef-ebfdd1e88d77-combined-ca-bundle\") pod \"manila-db-sync-r2zvz\" (UID: \"32b245ce-84e1-4fbc-adef-ebfdd1e88d77\") " pod="openstack/manila-db-sync-r2zvz"
Jan 20 19:20:17 crc kubenswrapper[4773]: I0120 19:20:17.710757 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95wx9\" (UniqueName: \"kubernetes.io/projected/32b245ce-84e1-4fbc-adef-ebfdd1e88d77-kube-api-access-95wx9\") pod \"manila-db-sync-r2zvz\" (UID: \"32b245ce-84e1-4fbc-adef-ebfdd1e88d77\") " pod="openstack/manila-db-sync-r2zvz"
Jan 20 19:20:17 crc kubenswrapper[4773]: I0120 19:20:17.812464 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32b245ce-84e1-4fbc-adef-ebfdd1e88d77-config-data\") pod \"manila-db-sync-r2zvz\" (UID: \"32b245ce-84e1-4fbc-adef-ebfdd1e88d77\") " pod="openstack/manila-db-sync-r2zvz"
Jan 20 19:20:17 crc kubenswrapper[4773]: I0120 19:20:17.812537 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32b245ce-84e1-4fbc-adef-ebfdd1e88d77-combined-ca-bundle\") pod \"manila-db-sync-r2zvz\" (UID: \"32b245ce-84e1-4fbc-adef-ebfdd1e88d77\") " pod="openstack/manila-db-sync-r2zvz"
Jan 20 19:20:17 crc kubenswrapper[4773]: I0120 19:20:17.812650 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95wx9\" (UniqueName: \"kubernetes.io/projected/32b245ce-84e1-4fbc-adef-ebfdd1e88d77-kube-api-access-95wx9\") pod \"manila-db-sync-r2zvz\" (UID: \"32b245ce-84e1-4fbc-adef-ebfdd1e88d77\") " pod="openstack/manila-db-sync-r2zvz"
Jan 20 19:20:17 crc kubenswrapper[4773]: I0120 19:20:17.812713 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/32b245ce-84e1-4fbc-adef-ebfdd1e88d77-job-config-data\") pod \"manila-db-sync-r2zvz\" (UID: \"32b245ce-84e1-4fbc-adef-ebfdd1e88d77\") " pod="openstack/manila-db-sync-r2zvz"
Jan 20 19:20:17 crc kubenswrapper[4773]: I0120 19:20:17.821335 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32b245ce-84e1-4fbc-adef-ebfdd1e88d77-config-data\") pod \"manila-db-sync-r2zvz\" (UID: \"32b245ce-84e1-4fbc-adef-ebfdd1e88d77\") " pod="openstack/manila-db-sync-r2zvz"
Jan 20 19:20:17 crc kubenswrapper[4773]: I0120 19:20:17.830423 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/32b245ce-84e1-4fbc-adef-ebfdd1e88d77-job-config-data\") pod \"manila-db-sync-r2zvz\" (UID: \"32b245ce-84e1-4fbc-adef-ebfdd1e88d77\") " pod="openstack/manila-db-sync-r2zvz"
Jan 20 19:20:17 crc kubenswrapper[4773]: I0120 19:20:17.836740 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95wx9\" (UniqueName: \"kubernetes.io/projected/32b245ce-84e1-4fbc-adef-ebfdd1e88d77-kube-api-access-95wx9\") pod \"manila-db-sync-r2zvz\" (UID: \"32b245ce-84e1-4fbc-adef-ebfdd1e88d77\") " pod="openstack/manila-db-sync-r2zvz"
Jan 20 19:20:17 crc kubenswrapper[4773]: I0120 19:20:17.840572 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32b245ce-84e1-4fbc-adef-ebfdd1e88d77-combined-ca-bundle\") pod \"manila-db-sync-r2zvz\" (UID: \"32b245ce-84e1-4fbc-adef-ebfdd1e88d77\") " pod="openstack/manila-db-sync-r2zvz"
Jan 20 19:20:17 crc kubenswrapper[4773]: I0120 19:20:17.946579 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-r2zvz"
Jan 20 19:20:18 crc kubenswrapper[4773]: I0120 19:20:18.525213 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-db-sync-r2zvz"]
Jan 20 19:20:19 crc kubenswrapper[4773]: I0120 19:20:19.091790 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-r2zvz" event={"ID":"32b245ce-84e1-4fbc-adef-ebfdd1e88d77","Type":"ContainerStarted","Data":"0539b6653066f272764ca28da046994c24db6fca84f59c015dd3f0893b721f4a"}
Jan 20 19:20:21 crc kubenswrapper[4773]: I0120 19:20:21.885453 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-backup-0"
Jan 20 19:20:21 crc kubenswrapper[4773]: I0120 19:20:21.911569 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-volume-volume1-0"
Jan 20 19:20:22 crc kubenswrapper[4773]: I0120 19:20:22.412888 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Jan 20 19:20:22 crc kubenswrapper[4773]: I0120 19:20:22.413294 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0"
Jan 20 19:20:22 crc kubenswrapper[4773]: I0120 19:20:22.444532 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Jan 20 19:20:22 crc kubenswrapper[4773]: I0120 19:20:22.462715 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0"
Jan 20 19:20:22 crc kubenswrapper[4773]: I0120 19:20:22.490158 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Jan 20 19:20:22 crc kubenswrapper[4773]: I0120 19:20:22.491960 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0"
Jan 20 19:20:22 crc kubenswrapper[4773]: I0120 19:20:22.527401 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Jan 20 19:20:22 crc kubenswrapper[4773]: I0120 19:20:22.556050 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0"
Jan 20 19:20:23 crc kubenswrapper[4773]: I0120 19:20:23.132738 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-r2zvz" event={"ID":"32b245ce-84e1-4fbc-adef-ebfdd1e88d77","Type":"ContainerStarted","Data":"4557e82493f1d248bc3cabcdf01505516b90cb0e92c1b3e5eff438a70402a241"}
Jan 20 19:20:23 crc kubenswrapper[4773]: I0120 19:20:23.133162 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Jan 20 19:20:23 crc kubenswrapper[4773]: I0120 19:20:23.134203 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Jan 20 19:20:23 crc kubenswrapper[4773]: I0120 19:20:23.134233 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0"
Jan 20 19:20:23 crc kubenswrapper[4773]: I0120 19:20:23.134244 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0"
Jan 20 19:20:23 crc kubenswrapper[4773]: I0120 19:20:23.175869 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-db-sync-r2zvz" podStartSLOduration=2.283586094 podStartE2EDuration="6.175845676s" podCreationTimestamp="2026-01-20 19:20:17 +0000 UTC" firstStartedPulling="2026-01-20 19:20:18.527919281 +0000 UTC m=+3011.449732305" lastFinishedPulling="2026-01-20 19:20:22.420178863 +0000 UTC m=+3015.341991887" observedRunningTime="2026-01-20 19:20:23.162687707 +0000 UTC m=+3016.084500731" watchObservedRunningTime="2026-01-20 19:20:23.175845676 +0000 UTC m=+3016.097658700"
Jan 20 19:20:25 crc kubenswrapper[4773]: I0120 19:20:25.147157 4773 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Jan 20 19:20:25 crc kubenswrapper[4773]: I0120 19:20:25.147461 4773 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Jan 20 19:20:25 crc kubenswrapper[4773]: I0120 19:20:25.147241 4773 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Jan 20 19:20:25 crc kubenswrapper[4773]: I0120 19:20:25.147502 4773 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Jan 20 19:20:25 crc kubenswrapper[4773]: I0120 19:20:25.215735 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Jan 20 19:20:25 crc kubenswrapper[4773]: I0120 19:20:25.220257 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Jan 20 19:20:25 crc kubenswrapper[4773]: I0120 19:20:25.238841 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0"
Jan 20 19:20:25 crc kubenswrapper[4773]: I0120 19:20:25.272982 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0"
Jan 20 19:20:29 crc kubenswrapper[4773]: I0120 19:20:29.446865 4773 scope.go:117] "RemoveContainer" containerID="11935bc20d3c37ba36a8cfe2c5234ee70ed45d973936da23fb9892ece635d019"
Jan 20 19:20:29 crc kubenswrapper[4773]: E0120 19:20:29.447710 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621"
Jan 20 19:20:32 crc kubenswrapper[4773]: I0120 19:20:32.508435 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-6rbwt"]
Jan 20 19:20:32 crc kubenswrapper[4773]: I0120 19:20:32.511413 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6rbwt"
Jan 20 19:20:32 crc kubenswrapper[4773]: I0120 19:20:32.521138 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6rbwt"]
Jan 20 19:20:32 crc kubenswrapper[4773]: I0120 19:20:32.602956 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b74b1a0f-c97d-4491-94a5-4429140cd990-catalog-content\") pod \"certified-operators-6rbwt\" (UID: \"b74b1a0f-c97d-4491-94a5-4429140cd990\") " pod="openshift-marketplace/certified-operators-6rbwt"
Jan 20 19:20:32 crc kubenswrapper[4773]: I0120 19:20:32.603341 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b74b1a0f-c97d-4491-94a5-4429140cd990-utilities\") pod \"certified-operators-6rbwt\" (UID: \"b74b1a0f-c97d-4491-94a5-4429140cd990\") " pod="openshift-marketplace/certified-operators-6rbwt"
Jan 20 19:20:32 crc kubenswrapper[4773]: I0120 19:20:32.603418 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnlgz\" (UniqueName: \"kubernetes.io/projected/b74b1a0f-c97d-4491-94a5-4429140cd990-kube-api-access-rnlgz\") pod \"certified-operators-6rbwt\" (UID: \"b74b1a0f-c97d-4491-94a5-4429140cd990\") " pod="openshift-marketplace/certified-operators-6rbwt"
Jan 20 19:20:32 crc kubenswrapper[4773]: I0120 19:20:32.704864 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b74b1a0f-c97d-4491-94a5-4429140cd990-catalog-content\") pod \"certified-operators-6rbwt\" (UID: \"b74b1a0f-c97d-4491-94a5-4429140cd990\") " pod="openshift-marketplace/certified-operators-6rbwt"
Jan 20 19:20:32 crc kubenswrapper[4773]: I0120 19:20:32.704911 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b74b1a0f-c97d-4491-94a5-4429140cd990-utilities\") pod \"certified-operators-6rbwt\" (UID: \"b74b1a0f-c97d-4491-94a5-4429140cd990\") " pod="openshift-marketplace/certified-operators-6rbwt"
Jan 20 19:20:32 crc kubenswrapper[4773]: I0120 19:20:32.705042 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnlgz\" (UniqueName: \"kubernetes.io/projected/b74b1a0f-c97d-4491-94a5-4429140cd990-kube-api-access-rnlgz\") pod \"certified-operators-6rbwt\" (UID: \"b74b1a0f-c97d-4491-94a5-4429140cd990\") " pod="openshift-marketplace/certified-operators-6rbwt"
Jan 20 19:20:32 crc kubenswrapper[4773]: I0120 19:20:32.705384 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b74b1a0f-c97d-4491-94a5-4429140cd990-catalog-content\") pod \"certified-operators-6rbwt\" (UID: \"b74b1a0f-c97d-4491-94a5-4429140cd990\") " pod="openshift-marketplace/certified-operators-6rbwt"
Jan 20 19:20:32 crc kubenswrapper[4773]: I0120 19:20:32.705471 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b74b1a0f-c97d-4491-94a5-4429140cd990-utilities\") pod \"certified-operators-6rbwt\" (UID: \"b74b1a0f-c97d-4491-94a5-4429140cd990\") " pod="openshift-marketplace/certified-operators-6rbwt"
Jan 20 19:20:32 crc kubenswrapper[4773]: I0120 19:20:32.732341 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnlgz\" (UniqueName: \"kubernetes.io/projected/b74b1a0f-c97d-4491-94a5-4429140cd990-kube-api-access-rnlgz\") pod \"certified-operators-6rbwt\" (UID: \"b74b1a0f-c97d-4491-94a5-4429140cd990\") " pod="openshift-marketplace/certified-operators-6rbwt"
Jan 20 19:20:32 crc kubenswrapper[4773]: I0120 19:20:32.839128 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6rbwt"
Jan 20 19:20:33 crc kubenswrapper[4773]: I0120 19:20:33.294416 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-6rbwt"]
Jan 20 19:20:34 crc kubenswrapper[4773]: I0120 19:20:34.224076 4773 generic.go:334] "Generic (PLEG): container finished" podID="32b245ce-84e1-4fbc-adef-ebfdd1e88d77" containerID="4557e82493f1d248bc3cabcdf01505516b90cb0e92c1b3e5eff438a70402a241" exitCode=0
Jan 20 19:20:34 crc kubenswrapper[4773]: I0120 19:20:34.224156 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-r2zvz" event={"ID":"32b245ce-84e1-4fbc-adef-ebfdd1e88d77","Type":"ContainerDied","Data":"4557e82493f1d248bc3cabcdf01505516b90cb0e92c1b3e5eff438a70402a241"}
Jan 20 19:20:34 crc kubenswrapper[4773]: I0120 19:20:34.226627 4773 generic.go:334] "Generic (PLEG): container finished" podID="b74b1a0f-c97d-4491-94a5-4429140cd990" containerID="0dcbbc8c658f47d7a18832e97e00c7168314f5b7519c1ce7c50ec20d81f74854" exitCode=0
Jan 20 19:20:34 crc kubenswrapper[4773]: I0120 19:20:34.226661 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6rbwt" event={"ID":"b74b1a0f-c97d-4491-94a5-4429140cd990","Type":"ContainerDied","Data":"0dcbbc8c658f47d7a18832e97e00c7168314f5b7519c1ce7c50ec20d81f74854"}
Jan 20 19:20:34 crc kubenswrapper[4773]: I0120 19:20:34.226685 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6rbwt" event={"ID":"b74b1a0f-c97d-4491-94a5-4429140cd990","Type":"ContainerStarted","Data":"310cc51704b18b2aa90c2ac829fa16af6930efbf07c19a5e76754a3192b7d35b"}
Jan 20 19:20:35 crc kubenswrapper[4773]: I0120 19:20:35.237154 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6rbwt" event={"ID":"b74b1a0f-c97d-4491-94a5-4429140cd990","Type":"ContainerStarted","Data":"4e61b2ec36fa13c02a3a29199aed50235c5ded5e007e514d5c73d81221f1e72c"}
Jan 20 19:20:35 crc kubenswrapper[4773]: I0120 19:20:35.637221 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-r2zvz"
Jan 20 19:20:35 crc kubenswrapper[4773]: I0120 19:20:35.767920 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-95wx9\" (UniqueName: \"kubernetes.io/projected/32b245ce-84e1-4fbc-adef-ebfdd1e88d77-kube-api-access-95wx9\") pod \"32b245ce-84e1-4fbc-adef-ebfdd1e88d77\" (UID: \"32b245ce-84e1-4fbc-adef-ebfdd1e88d77\") "
Jan 20 19:20:35 crc kubenswrapper[4773]: I0120 19:20:35.768163 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32b245ce-84e1-4fbc-adef-ebfdd1e88d77-combined-ca-bundle\") pod \"32b245ce-84e1-4fbc-adef-ebfdd1e88d77\" (UID: \"32b245ce-84e1-4fbc-adef-ebfdd1e88d77\") "
Jan 20 19:20:35 crc kubenswrapper[4773]: I0120 19:20:35.769041 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32b245ce-84e1-4fbc-adef-ebfdd1e88d77-config-data\") pod \"32b245ce-84e1-4fbc-adef-ebfdd1e88d77\" (UID: \"32b245ce-84e1-4fbc-adef-ebfdd1e88d77\") "
Jan 20 19:20:35 crc kubenswrapper[4773]: I0120 19:20:35.769315 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/32b245ce-84e1-4fbc-adef-ebfdd1e88d77-job-config-data\") pod \"32b245ce-84e1-4fbc-adef-ebfdd1e88d77\" (UID: \"32b245ce-84e1-4fbc-adef-ebfdd1e88d77\") "
Jan 20 19:20:35 crc kubenswrapper[4773]: I0120 19:20:35.782105 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32b245ce-84e1-4fbc-adef-ebfdd1e88d77-config-data" (OuterVolumeSpecName: "config-data") pod "32b245ce-84e1-4fbc-adef-ebfdd1e88d77" (UID: "32b245ce-84e1-4fbc-adef-ebfdd1e88d77"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 20 19:20:35 crc kubenswrapper[4773]: I0120 19:20:35.782379 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32b245ce-84e1-4fbc-adef-ebfdd1e88d77-kube-api-access-95wx9" (OuterVolumeSpecName: "kube-api-access-95wx9") pod "32b245ce-84e1-4fbc-adef-ebfdd1e88d77" (UID: "32b245ce-84e1-4fbc-adef-ebfdd1e88d77"). InnerVolumeSpecName "kube-api-access-95wx9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 20 19:20:35 crc kubenswrapper[4773]: I0120 19:20:35.783393 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32b245ce-84e1-4fbc-adef-ebfdd1e88d77-job-config-data" (OuterVolumeSpecName: "job-config-data") pod "32b245ce-84e1-4fbc-adef-ebfdd1e88d77" (UID: "32b245ce-84e1-4fbc-adef-ebfdd1e88d77"). InnerVolumeSpecName "job-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 20 19:20:35 crc kubenswrapper[4773]: I0120 19:20:35.806984 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/32b245ce-84e1-4fbc-adef-ebfdd1e88d77-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "32b245ce-84e1-4fbc-adef-ebfdd1e88d77" (UID: "32b245ce-84e1-4fbc-adef-ebfdd1e88d77"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 20 19:20:35 crc kubenswrapper[4773]: I0120 19:20:35.872357 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32b245ce-84e1-4fbc-adef-ebfdd1e88d77-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 20 19:20:35 crc kubenswrapper[4773]: I0120 19:20:35.872390 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/32b245ce-84e1-4fbc-adef-ebfdd1e88d77-config-data\") on node \"crc\" DevicePath \"\""
Jan 20 19:20:35 crc kubenswrapper[4773]: I0120 19:20:35.872400 4773 reconciler_common.go:293] "Volume detached for volume \"job-config-data\" (UniqueName: \"kubernetes.io/secret/32b245ce-84e1-4fbc-adef-ebfdd1e88d77-job-config-data\") on node \"crc\" DevicePath \"\""
Jan 20 19:20:35 crc kubenswrapper[4773]: I0120 19:20:35.872420 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-95wx9\" (UniqueName: \"kubernetes.io/projected/32b245ce-84e1-4fbc-adef-ebfdd1e88d77-kube-api-access-95wx9\") on node \"crc\" DevicePath \"\""
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.247819 4773 generic.go:334] "Generic (PLEG): container finished" podID="b74b1a0f-c97d-4491-94a5-4429140cd990" containerID="4e61b2ec36fa13c02a3a29199aed50235c5ded5e007e514d5c73d81221f1e72c" exitCode=0
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.247873 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6rbwt" event={"ID":"b74b1a0f-c97d-4491-94a5-4429140cd990","Type":"ContainerDied","Data":"4e61b2ec36fa13c02a3a29199aed50235c5ded5e007e514d5c73d81221f1e72c"}
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.249409 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-db-sync-r2zvz" event={"ID":"32b245ce-84e1-4fbc-adef-ebfdd1e88d77","Type":"ContainerDied","Data":"0539b6653066f272764ca28da046994c24db6fca84f59c015dd3f0893b721f4a"}
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.249457 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0539b6653066f272764ca28da046994c24db6fca84f59c015dd3f0893b721f4a"
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.249427 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-db-sync-r2zvz"
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.614094 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"]
Jan 20 19:20:36 crc kubenswrapper[4773]: E0120 19:20:36.614767 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32b245ce-84e1-4fbc-adef-ebfdd1e88d77" containerName="manila-db-sync"
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.614783 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="32b245ce-84e1-4fbc-adef-ebfdd1e88d77" containerName="manila-db-sync"
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.614999 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="32b245ce-84e1-4fbc-adef-ebfdd1e88d77" containerName="manila-db-sync"
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.616051 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0"
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.634463 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-config-data"
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.634674 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data"
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.634782 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scripts"
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.635626 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-manila-dockercfg-qgfd7"
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.646572 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"]
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.648407 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0"
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.654965 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scheduler-config-data"
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.688585 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03fcee75-e9f9-4ff2-a932-ad3cb7395d7d-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"03fcee75-e9f9-4ff2-a932-ad3cb7395d7d\") " pod="openstack/manila-share-share1-0"
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.688639 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ac3cbb7-870d-49e0-b7f2-0996320eeea8-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"8ac3cbb7-870d-49e0-b7f2-0996320eeea8\") " pod="openstack/manila-scheduler-0"
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.688671 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/03fcee75-e9f9-4ff2-a932-ad3cb7395d7d-ceph\") pod \"manila-share-share1-0\" (UID: \"03fcee75-e9f9-4ff2-a932-ad3cb7395d7d\") " pod="openstack/manila-share-share1-0"
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.688687 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ac3cbb7-870d-49e0-b7f2-0996320eeea8-config-data\") pod \"manila-scheduler-0\" (UID: \"8ac3cbb7-870d-49e0-b7f2-0996320eeea8\") " pod="openstack/manila-scheduler-0"
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.688709 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8ac3cbb7-870d-49e0-b7f2-0996320eeea8-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"8ac3cbb7-870d-49e0-b7f2-0996320eeea8\") " pod="openstack/manila-scheduler-0"
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.688730 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckqfk\" (UniqueName: \"kubernetes.io/projected/8ac3cbb7-870d-49e0-b7f2-0996320eeea8-kube-api-access-ckqfk\") pod \"manila-scheduler-0\" (UID: \"8ac3cbb7-870d-49e0-b7f2-0996320eeea8\") " pod="openstack/manila-scheduler-0"
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.688757 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03fcee75-e9f9-4ff2-a932-ad3cb7395d7d-config-data\") pod \"manila-share-share1-0\" (UID: \"03fcee75-e9f9-4ff2-a932-ad3cb7395d7d\") " pod="openstack/manila-share-share1-0"
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.688772 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/03fcee75-e9f9-4ff2-a932-ad3cb7395d7d-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"03fcee75-e9f9-4ff2-a932-ad3cb7395d7d\") " pod="openstack/manila-share-share1-0"
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.688791 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/03fcee75-e9f9-4ff2-a932-ad3cb7395d7d-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"03fcee75-e9f9-4ff2-a932-ad3cb7395d7d\") " pod="openstack/manila-share-share1-0"
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.688819 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8ac3cbb7-870d-49e0-b7f2-0996320eeea8-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"8ac3cbb7-870d-49e0-b7f2-0996320eeea8\") " pod="openstack/manila-scheduler-0"
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.688860 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnctc\" (UniqueName: \"kubernetes.io/projected/03fcee75-e9f9-4ff2-a932-ad3cb7395d7d-kube-api-access-gnctc\") pod \"manila-share-share1-0\" (UID: \"03fcee75-e9f9-4ff2-a932-ad3cb7395d7d\") " pod="openstack/manila-share-share1-0"
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.688900 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/03fcee75-e9f9-4ff2-a932-ad3cb7395d7d-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"03fcee75-e9f9-4ff2-a932-ad3cb7395d7d\") " pod="openstack/manila-share-share1-0"
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.688920 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ac3cbb7-870d-49e0-b7f2-0996320eeea8-scripts\") pod \"manila-scheduler-0\" (UID: \"8ac3cbb7-870d-49e0-b7f2-0996320eeea8\") " pod="openstack/manila-scheduler-0"
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.688955 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03fcee75-e9f9-4ff2-a932-ad3cb7395d7d-scripts\") pod \"manila-share-share1-0\" (UID: \"03fcee75-e9f9-4ff2-a932-ad3cb7395d7d\") " pod="openstack/manila-share-share1-0"
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.689252 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"]
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.707722 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"]
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.747236 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-76b5fdb995-d8qrf"]
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.748877 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76b5fdb995-d8qrf"
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.796037 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76b5fdb995-d8qrf"]
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.796846 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03fcee75-e9f9-4ff2-a932-ad3cb7395d7d-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"03fcee75-e9f9-4ff2-a932-ad3cb7395d7d\") " pod="openstack/manila-share-share1-0"
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.796892 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3b08b301-686b-45e6-9903-5df8a754a16a-ovsdbserver-nb\") pod \"dnsmasq-dns-76b5fdb995-d8qrf\" (UID: \"3b08b301-686b-45e6-9903-5df8a754a16a\") " pod="openstack/dnsmasq-dns-76b5fdb995-d8qrf"
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.796923 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ac3cbb7-870d-49e0-b7f2-0996320eeea8-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"8ac3cbb7-870d-49e0-b7f2-0996320eeea8\") " pod="openstack/manila-scheduler-0"
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.796966 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/03fcee75-e9f9-4ff2-a932-ad3cb7395d7d-ceph\") pod \"manila-share-share1-0\" (UID: \"03fcee75-e9f9-4ff2-a932-ad3cb7395d7d\") " pod="openstack/manila-share-share1-0"
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.796983 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ac3cbb7-870d-49e0-b7f2-0996320eeea8-config-data\") pod \"manila-scheduler-0\" (UID: \"8ac3cbb7-870d-49e0-b7f2-0996320eeea8\") " pod="openstack/manila-scheduler-0"
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.796999 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8ac3cbb7-870d-49e0-b7f2-0996320eeea8-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"8ac3cbb7-870d-49e0-b7f2-0996320eeea8\") " pod="openstack/manila-scheduler-0"
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.797020 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckqfk\" (UniqueName: \"kubernetes.io/projected/8ac3cbb7-870d-49e0-b7f2-0996320eeea8-kube-api-access-ckqfk\") pod \"manila-scheduler-0\" (UID: \"8ac3cbb7-870d-49e0-b7f2-0996320eeea8\") " pod="openstack/manila-scheduler-0"
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.797045 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03fcee75-e9f9-4ff2-a932-ad3cb7395d7d-config-data\") pod \"manila-share-share1-0\" (UID: \"03fcee75-e9f9-4ff2-a932-ad3cb7395d7d\") " pod="openstack/manila-share-share1-0"
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.797060 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/03fcee75-e9f9-4ff2-a932-ad3cb7395d7d-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"03fcee75-e9f9-4ff2-a932-ad3cb7395d7d\") " pod="openstack/manila-share-share1-0"
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.797078 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/03fcee75-e9f9-4ff2-a932-ad3cb7395d7d-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"03fcee75-e9f9-4ff2-a932-ad3cb7395d7d\") " pod="openstack/manila-share-share1-0"
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.797100 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wwlkd\" (UniqueName: \"kubernetes.io/projected/3b08b301-686b-45e6-9903-5df8a754a16a-kube-api-access-wwlkd\") pod \"dnsmasq-dns-76b5fdb995-d8qrf\" (UID: \"3b08b301-686b-45e6-9903-5df8a754a16a\") " pod="openstack/dnsmasq-dns-76b5fdb995-d8qrf"
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.797121 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3b08b301-686b-45e6-9903-5df8a754a16a-ovsdbserver-sb\") pod \"dnsmasq-dns-76b5fdb995-d8qrf\" (UID: \"3b08b301-686b-45e6-9903-5df8a754a16a\") " pod="openstack/dnsmasq-dns-76b5fdb995-d8qrf"
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.797136 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/3b08b301-686b-45e6-9903-5df8a754a16a-openstack-edpm-ipam\") pod \"dnsmasq-dns-76b5fdb995-d8qrf\" (UID: \"3b08b301-686b-45e6-9903-5df8a754a16a\") " pod="openstack/dnsmasq-dns-76b5fdb995-d8qrf"
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.797153 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8ac3cbb7-870d-49e0-b7f2-0996320eeea8-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"8ac3cbb7-870d-49e0-b7f2-0996320eeea8\") " pod="openstack/manila-scheduler-0"
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.797173 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b08b301-686b-45e6-9903-5df8a754a16a-dns-svc\") pod \"dnsmasq-dns-76b5fdb995-d8qrf\" (UID: \"3b08b301-686b-45e6-9903-5df8a754a16a\") " pod="openstack/dnsmasq-dns-76b5fdb995-d8qrf"
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.797208 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gnctc\" (UniqueName: \"kubernetes.io/projected/03fcee75-e9f9-4ff2-a932-ad3cb7395d7d-kube-api-access-gnctc\") pod \"manila-share-share1-0\" (UID: \"03fcee75-e9f9-4ff2-a932-ad3cb7395d7d\") " pod="openstack/manila-share-share1-0"
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.797248 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/03fcee75-e9f9-4ff2-a932-ad3cb7395d7d-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"03fcee75-e9f9-4ff2-a932-ad3cb7395d7d\") " pod="openstack/manila-share-share1-0"
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.797269 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ac3cbb7-870d-49e0-b7f2-0996320eeea8-scripts\") pod \"manila-scheduler-0\" (UID: \"8ac3cbb7-870d-49e0-b7f2-0996320eeea8\") " pod="openstack/manila-scheduler-0"
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.797286 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03fcee75-e9f9-4ff2-a932-ad3cb7395d7d-scripts\") pod \"manila-share-share1-0\" (UID: \"03fcee75-e9f9-4ff2-a932-ad3cb7395d7d\") " pod="openstack/manila-share-share1-0"
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.797303 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b08b301-686b-45e6-9903-5df8a754a16a-config\") pod \"dnsmasq-dns-76b5fdb995-d8qrf\" (UID: \"3b08b301-686b-45e6-9903-5df8a754a16a\") " pod="openstack/dnsmasq-dns-76b5fdb995-d8qrf"
Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.813538 4773
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8ac3cbb7-870d-49e0-b7f2-0996320eeea8-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"8ac3cbb7-870d-49e0-b7f2-0996320eeea8\") " pod="openstack/manila-scheduler-0" Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.813604 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/03fcee75-e9f9-4ff2-a932-ad3cb7395d7d-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"03fcee75-e9f9-4ff2-a932-ad3cb7395d7d\") " pod="openstack/manila-share-share1-0" Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.819984 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ac3cbb7-870d-49e0-b7f2-0996320eeea8-scripts\") pod \"manila-scheduler-0\" (UID: \"8ac3cbb7-870d-49e0-b7f2-0996320eeea8\") " pod="openstack/manila-scheduler-0" Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.820163 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03fcee75-e9f9-4ff2-a932-ad3cb7395d7d-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"03fcee75-e9f9-4ff2-a932-ad3cb7395d7d\") " pod="openstack/manila-share-share1-0" Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.820707 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8ac3cbb7-870d-49e0-b7f2-0996320eeea8-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"8ac3cbb7-870d-49e0-b7f2-0996320eeea8\") " pod="openstack/manila-scheduler-0" Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.821084 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/03fcee75-e9f9-4ff2-a932-ad3cb7395d7d-var-lib-manila\") pod 
\"manila-share-share1-0\" (UID: \"03fcee75-e9f9-4ff2-a932-ad3cb7395d7d\") " pod="openstack/manila-share-share1-0" Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.837471 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03fcee75-e9f9-4ff2-a932-ad3cb7395d7d-scripts\") pod \"manila-share-share1-0\" (UID: \"03fcee75-e9f9-4ff2-a932-ad3cb7395d7d\") " pod="openstack/manila-share-share1-0" Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.839790 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ac3cbb7-870d-49e0-b7f2-0996320eeea8-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"8ac3cbb7-870d-49e0-b7f2-0996320eeea8\") " pod="openstack/manila-scheduler-0" Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.841730 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/03fcee75-e9f9-4ff2-a932-ad3cb7395d7d-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"03fcee75-e9f9-4ff2-a932-ad3cb7395d7d\") " pod="openstack/manila-share-share1-0" Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.852009 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03fcee75-e9f9-4ff2-a932-ad3cb7395d7d-config-data\") pod \"manila-share-share1-0\" (UID: \"03fcee75-e9f9-4ff2-a932-ad3cb7395d7d\") " pod="openstack/manila-share-share1-0" Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.854574 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ac3cbb7-870d-49e0-b7f2-0996320eeea8-config-data\") pod \"manila-scheduler-0\" (UID: \"8ac3cbb7-870d-49e0-b7f2-0996320eeea8\") " pod="openstack/manila-scheduler-0" Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.860761 4773 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckqfk\" (UniqueName: \"kubernetes.io/projected/8ac3cbb7-870d-49e0-b7f2-0996320eeea8-kube-api-access-ckqfk\") pod \"manila-scheduler-0\" (UID: \"8ac3cbb7-870d-49e0-b7f2-0996320eeea8\") " pod="openstack/manila-scheduler-0" Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.865376 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/03fcee75-e9f9-4ff2-a932-ad3cb7395d7d-ceph\") pod \"manila-share-share1-0\" (UID: \"03fcee75-e9f9-4ff2-a932-ad3cb7395d7d\") " pod="openstack/manila-share-share1-0" Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.882484 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnctc\" (UniqueName: \"kubernetes.io/projected/03fcee75-e9f9-4ff2-a932-ad3cb7395d7d-kube-api-access-gnctc\") pod \"manila-share-share1-0\" (UID: \"03fcee75-e9f9-4ff2-a932-ad3cb7395d7d\") " pod="openstack/manila-share-share1-0" Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.900261 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b08b301-686b-45e6-9903-5df8a754a16a-config\") pod \"dnsmasq-dns-76b5fdb995-d8qrf\" (UID: \"3b08b301-686b-45e6-9903-5df8a754a16a\") " pod="openstack/dnsmasq-dns-76b5fdb995-d8qrf" Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.900361 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3b08b301-686b-45e6-9903-5df8a754a16a-ovsdbserver-nb\") pod \"dnsmasq-dns-76b5fdb995-d8qrf\" (UID: \"3b08b301-686b-45e6-9903-5df8a754a16a\") " pod="openstack/dnsmasq-dns-76b5fdb995-d8qrf" Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.900437 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wwlkd\" (UniqueName: 
\"kubernetes.io/projected/3b08b301-686b-45e6-9903-5df8a754a16a-kube-api-access-wwlkd\") pod \"dnsmasq-dns-76b5fdb995-d8qrf\" (UID: \"3b08b301-686b-45e6-9903-5df8a754a16a\") " pod="openstack/dnsmasq-dns-76b5fdb995-d8qrf" Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.900459 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3b08b301-686b-45e6-9903-5df8a754a16a-ovsdbserver-sb\") pod \"dnsmasq-dns-76b5fdb995-d8qrf\" (UID: \"3b08b301-686b-45e6-9903-5df8a754a16a\") " pod="openstack/dnsmasq-dns-76b5fdb995-d8qrf" Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.900484 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/3b08b301-686b-45e6-9903-5df8a754a16a-openstack-edpm-ipam\") pod \"dnsmasq-dns-76b5fdb995-d8qrf\" (UID: \"3b08b301-686b-45e6-9903-5df8a754a16a\") " pod="openstack/dnsmasq-dns-76b5fdb995-d8qrf" Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.900521 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b08b301-686b-45e6-9903-5df8a754a16a-dns-svc\") pod \"dnsmasq-dns-76b5fdb995-d8qrf\" (UID: \"3b08b301-686b-45e6-9903-5df8a754a16a\") " pod="openstack/dnsmasq-dns-76b5fdb995-d8qrf" Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.901957 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/3b08b301-686b-45e6-9903-5df8a754a16a-dns-svc\") pod \"dnsmasq-dns-76b5fdb995-d8qrf\" (UID: \"3b08b301-686b-45e6-9903-5df8a754a16a\") " pod="openstack/dnsmasq-dns-76b5fdb995-d8qrf" Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.904116 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/3b08b301-686b-45e6-9903-5df8a754a16a-ovsdbserver-sb\") 
pod \"dnsmasq-dns-76b5fdb995-d8qrf\" (UID: \"3b08b301-686b-45e6-9903-5df8a754a16a\") " pod="openstack/dnsmasq-dns-76b5fdb995-d8qrf" Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.904127 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b08b301-686b-45e6-9903-5df8a754a16a-config\") pod \"dnsmasq-dns-76b5fdb995-d8qrf\" (UID: \"3b08b301-686b-45e6-9903-5df8a754a16a\") " pod="openstack/dnsmasq-dns-76b5fdb995-d8qrf" Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.907852 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/3b08b301-686b-45e6-9903-5df8a754a16a-ovsdbserver-nb\") pod \"dnsmasq-dns-76b5fdb995-d8qrf\" (UID: \"3b08b301-686b-45e6-9903-5df8a754a16a\") " pod="openstack/dnsmasq-dns-76b5fdb995-d8qrf" Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.912068 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-api-0"] Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.913669 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.917397 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-api-config-data" Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.923537 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/3b08b301-686b-45e6-9903-5df8a754a16a-openstack-edpm-ipam\") pod \"dnsmasq-dns-76b5fdb995-d8qrf\" (UID: \"3b08b301-686b-45e6-9903-5df8a754a16a\") " pod="openstack/dnsmasq-dns-76b5fdb995-d8qrf" Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.931901 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.949898 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-share-share1-0" Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.960466 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wwlkd\" (UniqueName: \"kubernetes.io/projected/3b08b301-686b-45e6-9903-5df8a754a16a-kube-api-access-wwlkd\") pod \"dnsmasq-dns-76b5fdb995-d8qrf\" (UID: \"3b08b301-686b-45e6-9903-5df8a754a16a\") " pod="openstack/dnsmasq-dns-76b5fdb995-d8qrf" Jan 20 19:20:36 crc kubenswrapper[4773]: I0120 19:20:36.998415 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-scheduler-0" Jan 20 19:20:37 crc kubenswrapper[4773]: I0120 19:20:37.001756 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51f938c6-86eb-414f-b3c7-47f8d4c1927d-scripts\") pod \"manila-api-0\" (UID: \"51f938c6-86eb-414f-b3c7-47f8d4c1927d\") " pod="openstack/manila-api-0" Jan 20 19:20:37 crc kubenswrapper[4773]: I0120 19:20:37.001806 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6ljt\" (UniqueName: \"kubernetes.io/projected/51f938c6-86eb-414f-b3c7-47f8d4c1927d-kube-api-access-p6ljt\") pod \"manila-api-0\" (UID: \"51f938c6-86eb-414f-b3c7-47f8d4c1927d\") " pod="openstack/manila-api-0" Jan 20 19:20:37 crc kubenswrapper[4773]: I0120 19:20:37.001856 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51f938c6-86eb-414f-b3c7-47f8d4c1927d-config-data\") pod \"manila-api-0\" (UID: \"51f938c6-86eb-414f-b3c7-47f8d4c1927d\") " pod="openstack/manila-api-0" Jan 20 19:20:37 crc kubenswrapper[4773]: I0120 19:20:37.001893 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/51f938c6-86eb-414f-b3c7-47f8d4c1927d-etc-machine-id\") pod \"manila-api-0\" (UID: \"51f938c6-86eb-414f-b3c7-47f8d4c1927d\") " pod="openstack/manila-api-0" Jan 20 19:20:37 crc kubenswrapper[4773]: I0120 19:20:37.001912 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/51f938c6-86eb-414f-b3c7-47f8d4c1927d-logs\") pod \"manila-api-0\" (UID: \"51f938c6-86eb-414f-b3c7-47f8d4c1927d\") " pod="openstack/manila-api-0" Jan 20 19:20:37 crc kubenswrapper[4773]: I0120 19:20:37.001947 4773 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51f938c6-86eb-414f-b3c7-47f8d4c1927d-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"51f938c6-86eb-414f-b3c7-47f8d4c1927d\") " pod="openstack/manila-api-0" Jan 20 19:20:37 crc kubenswrapper[4773]: I0120 19:20:37.001965 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/51f938c6-86eb-414f-b3c7-47f8d4c1927d-config-data-custom\") pod \"manila-api-0\" (UID: \"51f938c6-86eb-414f-b3c7-47f8d4c1927d\") " pod="openstack/manila-api-0" Jan 20 19:20:37 crc kubenswrapper[4773]: I0120 19:20:37.096191 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76b5fdb995-d8qrf" Jan 20 19:20:37 crc kubenswrapper[4773]: I0120 19:20:37.103392 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51f938c6-86eb-414f-b3c7-47f8d4c1927d-config-data\") pod \"manila-api-0\" (UID: \"51f938c6-86eb-414f-b3c7-47f8d4c1927d\") " pod="openstack/manila-api-0" Jan 20 19:20:37 crc kubenswrapper[4773]: I0120 19:20:37.103454 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/51f938c6-86eb-414f-b3c7-47f8d4c1927d-etc-machine-id\") pod \"manila-api-0\" (UID: \"51f938c6-86eb-414f-b3c7-47f8d4c1927d\") " pod="openstack/manila-api-0" Jan 20 19:20:37 crc kubenswrapper[4773]: I0120 19:20:37.103480 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/51f938c6-86eb-414f-b3c7-47f8d4c1927d-logs\") pod \"manila-api-0\" (UID: \"51f938c6-86eb-414f-b3c7-47f8d4c1927d\") " pod="openstack/manila-api-0" Jan 20 19:20:37 crc kubenswrapper[4773]: I0120 19:20:37.103512 4773 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51f938c6-86eb-414f-b3c7-47f8d4c1927d-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"51f938c6-86eb-414f-b3c7-47f8d4c1927d\") " pod="openstack/manila-api-0" Jan 20 19:20:37 crc kubenswrapper[4773]: I0120 19:20:37.103543 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/51f938c6-86eb-414f-b3c7-47f8d4c1927d-config-data-custom\") pod \"manila-api-0\" (UID: \"51f938c6-86eb-414f-b3c7-47f8d4c1927d\") " pod="openstack/manila-api-0" Jan 20 19:20:37 crc kubenswrapper[4773]: I0120 19:20:37.103650 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51f938c6-86eb-414f-b3c7-47f8d4c1927d-scripts\") pod \"manila-api-0\" (UID: \"51f938c6-86eb-414f-b3c7-47f8d4c1927d\") " pod="openstack/manila-api-0" Jan 20 19:20:37 crc kubenswrapper[4773]: I0120 19:20:37.103706 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6ljt\" (UniqueName: \"kubernetes.io/projected/51f938c6-86eb-414f-b3c7-47f8d4c1927d-kube-api-access-p6ljt\") pod \"manila-api-0\" (UID: \"51f938c6-86eb-414f-b3c7-47f8d4c1927d\") " pod="openstack/manila-api-0" Jan 20 19:20:37 crc kubenswrapper[4773]: I0120 19:20:37.105871 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/51f938c6-86eb-414f-b3c7-47f8d4c1927d-etc-machine-id\") pod \"manila-api-0\" (UID: \"51f938c6-86eb-414f-b3c7-47f8d4c1927d\") " pod="openstack/manila-api-0" Jan 20 19:20:37 crc kubenswrapper[4773]: I0120 19:20:37.109520 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/51f938c6-86eb-414f-b3c7-47f8d4c1927d-logs\") pod \"manila-api-0\" (UID: 
\"51f938c6-86eb-414f-b3c7-47f8d4c1927d\") " pod="openstack/manila-api-0" Jan 20 19:20:37 crc kubenswrapper[4773]: I0120 19:20:37.112327 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/51f938c6-86eb-414f-b3c7-47f8d4c1927d-config-data-custom\") pod \"manila-api-0\" (UID: \"51f938c6-86eb-414f-b3c7-47f8d4c1927d\") " pod="openstack/manila-api-0" Jan 20 19:20:37 crc kubenswrapper[4773]: I0120 19:20:37.113658 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51f938c6-86eb-414f-b3c7-47f8d4c1927d-scripts\") pod \"manila-api-0\" (UID: \"51f938c6-86eb-414f-b3c7-47f8d4c1927d\") " pod="openstack/manila-api-0" Jan 20 19:20:37 crc kubenswrapper[4773]: I0120 19:20:37.115677 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51f938c6-86eb-414f-b3c7-47f8d4c1927d-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"51f938c6-86eb-414f-b3c7-47f8d4c1927d\") " pod="openstack/manila-api-0" Jan 20 19:20:37 crc kubenswrapper[4773]: I0120 19:20:37.116547 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51f938c6-86eb-414f-b3c7-47f8d4c1927d-config-data\") pod \"manila-api-0\" (UID: \"51f938c6-86eb-414f-b3c7-47f8d4c1927d\") " pod="openstack/manila-api-0" Jan 20 19:20:37 crc kubenswrapper[4773]: I0120 19:20:37.125246 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6ljt\" (UniqueName: \"kubernetes.io/projected/51f938c6-86eb-414f-b3c7-47f8d4c1927d-kube-api-access-p6ljt\") pod \"manila-api-0\" (UID: \"51f938c6-86eb-414f-b3c7-47f8d4c1927d\") " pod="openstack/manila-api-0" Jan 20 19:20:37 crc kubenswrapper[4773]: I0120 19:20:37.266359 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6rbwt" 
event={"ID":"b74b1a0f-c97d-4491-94a5-4429140cd990","Type":"ContainerStarted","Data":"b8c0fb2789b63e3dd31b5f19a75f566cae4590359ccc3804cd2893038fbb867c"} Jan 20 19:20:37 crc kubenswrapper[4773]: I0120 19:20:37.412679 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Jan 20 19:20:37 crc kubenswrapper[4773]: I0120 19:20:37.649440 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-6rbwt" podStartSLOduration=3.044691933 podStartE2EDuration="5.649418858s" podCreationTimestamp="2026-01-20 19:20:32 +0000 UTC" firstStartedPulling="2026-01-20 19:20:34.228415268 +0000 UTC m=+3027.150228292" lastFinishedPulling="2026-01-20 19:20:36.833142193 +0000 UTC m=+3029.754955217" observedRunningTime="2026-01-20 19:20:37.307610864 +0000 UTC m=+3030.229423888" watchObservedRunningTime="2026-01-20 19:20:37.649418858 +0000 UTC m=+3030.571231882" Jan 20 19:20:37 crc kubenswrapper[4773]: I0120 19:20:37.649775 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Jan 20 19:20:37 crc kubenswrapper[4773]: I0120 19:20:37.677211 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Jan 20 19:20:37 crc kubenswrapper[4773]: I0120 19:20:37.769629 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76b5fdb995-d8qrf"] Jan 20 19:20:38 crc kubenswrapper[4773]: I0120 19:20:38.058410 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Jan 20 19:20:38 crc kubenswrapper[4773]: I0120 19:20:38.287493 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"8ac3cbb7-870d-49e0-b7f2-0996320eeea8","Type":"ContainerStarted","Data":"e3d14db8a2dd3006ff781195948b0be8cc655d85ea68cd28813883eb46dead06"} Jan 20 19:20:38 crc kubenswrapper[4773]: I0120 19:20:38.294105 4773 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/manila-api-0" event={"ID":"51f938c6-86eb-414f-b3c7-47f8d4c1927d","Type":"ContainerStarted","Data":"bfadeca77795b07ca8716eb0dc10af286da6021714d107d79cc84f7669939957"} Jan 20 19:20:38 crc kubenswrapper[4773]: I0120 19:20:38.295552 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"03fcee75-e9f9-4ff2-a932-ad3cb7395d7d","Type":"ContainerStarted","Data":"e678d9c3a7cacb6b1935cbb52e080eed45938d042984c3bc8838c3bad5e5d7f5"} Jan 20 19:20:38 crc kubenswrapper[4773]: I0120 19:20:38.298125 4773 generic.go:334] "Generic (PLEG): container finished" podID="3b08b301-686b-45e6-9903-5df8a754a16a" containerID="2684a6d31019468fc8fe4bf2534508370b31e404ba23f254ba4e185118fd0f68" exitCode=0 Jan 20 19:20:38 crc kubenswrapper[4773]: I0120 19:20:38.298215 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76b5fdb995-d8qrf" event={"ID":"3b08b301-686b-45e6-9903-5df8a754a16a","Type":"ContainerDied","Data":"2684a6d31019468fc8fe4bf2534508370b31e404ba23f254ba4e185118fd0f68"} Jan 20 19:20:38 crc kubenswrapper[4773]: I0120 19:20:38.298266 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76b5fdb995-d8qrf" event={"ID":"3b08b301-686b-45e6-9903-5df8a754a16a","Type":"ContainerStarted","Data":"a09f039bcd20d122d0d482943b9f6fb2ffb12284b2f35c88fb90e38ec1eb6193"} Jan 20 19:20:39 crc kubenswrapper[4773]: I0120 19:20:39.325515 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76b5fdb995-d8qrf" event={"ID":"3b08b301-686b-45e6-9903-5df8a754a16a","Type":"ContainerStarted","Data":"ef515e1fbbe927a18099f0424c7979916924865c785a079cedb358635604e9ee"} Jan 20 19:20:39 crc kubenswrapper[4773]: I0120 19:20:39.326169 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-76b5fdb995-d8qrf" Jan 20 19:20:39 crc kubenswrapper[4773]: I0120 19:20:39.331492 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/manila-scheduler-0" event={"ID":"8ac3cbb7-870d-49e0-b7f2-0996320eeea8","Type":"ContainerStarted","Data":"e9b62705ca1a66cbca043ba3d1110685eab3ba1068bed7cdd0d78e68260c6d65"} Jan 20 19:20:39 crc kubenswrapper[4773]: I0120 19:20:39.334725 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"51f938c6-86eb-414f-b3c7-47f8d4c1927d","Type":"ContainerStarted","Data":"e0b8bde6cba9595031b27c27fed37e6b7596e4359e7a163c349b38ba90a54a74"} Jan 20 19:20:39 crc kubenswrapper[4773]: I0120 19:20:39.334777 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"51f938c6-86eb-414f-b3c7-47f8d4c1927d","Type":"ContainerStarted","Data":"e7259183f58f03fbfa73300bc9dacdce72c8d47fef262a5f11e886177e1e4baa"} Jan 20 19:20:39 crc kubenswrapper[4773]: I0120 19:20:39.335754 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/manila-api-0" Jan 20 19:20:39 crc kubenswrapper[4773]: I0120 19:20:39.362163 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-76b5fdb995-d8qrf" podStartSLOduration=3.362142821 podStartE2EDuration="3.362142821s" podCreationTimestamp="2026-01-20 19:20:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 19:20:39.357845766 +0000 UTC m=+3032.279658810" watchObservedRunningTime="2026-01-20 19:20:39.362142821 +0000 UTC m=+3032.283955845" Jan 20 19:20:39 crc kubenswrapper[4773]: I0120 19:20:39.385789 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-api-0" podStartSLOduration=3.385768784 podStartE2EDuration="3.385768784s" podCreationTimestamp="2026-01-20 19:20:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 19:20:39.378291112 +0000 UTC m=+3032.300104136" 
watchObservedRunningTime="2026-01-20 19:20:39.385768784 +0000 UTC m=+3032.307581808" Jan 20 19:20:40 crc kubenswrapper[4773]: I0120 19:20:40.074415 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-api-0"] Jan 20 19:20:41 crc kubenswrapper[4773]: I0120 19:20:41.357359 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"8ac3cbb7-870d-49e0-b7f2-0996320eeea8","Type":"ContainerStarted","Data":"a8a377ee9c38ba3729bf7b56c91b6b6c2ff5e50fe5b486abb78d39d1eac11d4e"} Jan 20 19:20:41 crc kubenswrapper[4773]: I0120 19:20:41.357584 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-api-0" podUID="51f938c6-86eb-414f-b3c7-47f8d4c1927d" containerName="manila-api-log" containerID="cri-o://e7259183f58f03fbfa73300bc9dacdce72c8d47fef262a5f11e886177e1e4baa" gracePeriod=30 Jan 20 19:20:41 crc kubenswrapper[4773]: I0120 19:20:41.357828 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-api-0" podUID="51f938c6-86eb-414f-b3c7-47f8d4c1927d" containerName="manila-api" containerID="cri-o://e0b8bde6cba9595031b27c27fed37e6b7596e4359e7a163c349b38ba90a54a74" gracePeriod=30 Jan 20 19:20:41 crc kubenswrapper[4773]: I0120 19:20:41.389399 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-scheduler-0" podStartSLOduration=4.580657595 podStartE2EDuration="5.389377885s" podCreationTimestamp="2026-01-20 19:20:36 +0000 UTC" firstStartedPulling="2026-01-20 19:20:37.683296389 +0000 UTC m=+3030.605109413" lastFinishedPulling="2026-01-20 19:20:38.492016679 +0000 UTC m=+3031.413829703" observedRunningTime="2026-01-20 19:20:41.381863042 +0000 UTC m=+3034.303676086" watchObservedRunningTime="2026-01-20 19:20:41.389377885 +0000 UTC m=+3034.311190909" Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.128526 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.255889 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/51f938c6-86eb-414f-b3c7-47f8d4c1927d-config-data-custom\") pod \"51f938c6-86eb-414f-b3c7-47f8d4c1927d\" (UID: \"51f938c6-86eb-414f-b3c7-47f8d4c1927d\") " Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.255968 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51f938c6-86eb-414f-b3c7-47f8d4c1927d-config-data\") pod \"51f938c6-86eb-414f-b3c7-47f8d4c1927d\" (UID: \"51f938c6-86eb-414f-b3c7-47f8d4c1927d\") " Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.256064 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/51f938c6-86eb-414f-b3c7-47f8d4c1927d-logs\") pod \"51f938c6-86eb-414f-b3c7-47f8d4c1927d\" (UID: \"51f938c6-86eb-414f-b3c7-47f8d4c1927d\") " Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.256099 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p6ljt\" (UniqueName: \"kubernetes.io/projected/51f938c6-86eb-414f-b3c7-47f8d4c1927d-kube-api-access-p6ljt\") pod \"51f938c6-86eb-414f-b3c7-47f8d4c1927d\" (UID: \"51f938c6-86eb-414f-b3c7-47f8d4c1927d\") " Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.256172 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/51f938c6-86eb-414f-b3c7-47f8d4c1927d-etc-machine-id\") pod \"51f938c6-86eb-414f-b3c7-47f8d4c1927d\" (UID: \"51f938c6-86eb-414f-b3c7-47f8d4c1927d\") " Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.256195 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/51f938c6-86eb-414f-b3c7-47f8d4c1927d-scripts\") pod \"51f938c6-86eb-414f-b3c7-47f8d4c1927d\" (UID: \"51f938c6-86eb-414f-b3c7-47f8d4c1927d\") " Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.256277 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51f938c6-86eb-414f-b3c7-47f8d4c1927d-combined-ca-bundle\") pod \"51f938c6-86eb-414f-b3c7-47f8d4c1927d\" (UID: \"51f938c6-86eb-414f-b3c7-47f8d4c1927d\") " Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.256669 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/51f938c6-86eb-414f-b3c7-47f8d4c1927d-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "51f938c6-86eb-414f-b3c7-47f8d4c1927d" (UID: "51f938c6-86eb-414f-b3c7-47f8d4c1927d"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.257197 4773 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/51f938c6-86eb-414f-b3c7-47f8d4c1927d-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.257459 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51f938c6-86eb-414f-b3c7-47f8d4c1927d-logs" (OuterVolumeSpecName: "logs") pod "51f938c6-86eb-414f-b3c7-47f8d4c1927d" (UID: "51f938c6-86eb-414f-b3c7-47f8d4c1927d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.262207 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51f938c6-86eb-414f-b3c7-47f8d4c1927d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "51f938c6-86eb-414f-b3c7-47f8d4c1927d" (UID: "51f938c6-86eb-414f-b3c7-47f8d4c1927d"). 
InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.265064 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51f938c6-86eb-414f-b3c7-47f8d4c1927d-kube-api-access-p6ljt" (OuterVolumeSpecName: "kube-api-access-p6ljt") pod "51f938c6-86eb-414f-b3c7-47f8d4c1927d" (UID: "51f938c6-86eb-414f-b3c7-47f8d4c1927d"). InnerVolumeSpecName "kube-api-access-p6ljt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.284087 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51f938c6-86eb-414f-b3c7-47f8d4c1927d-scripts" (OuterVolumeSpecName: "scripts") pod "51f938c6-86eb-414f-b3c7-47f8d4c1927d" (UID: "51f938c6-86eb-414f-b3c7-47f8d4c1927d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.289195 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51f938c6-86eb-414f-b3c7-47f8d4c1927d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "51f938c6-86eb-414f-b3c7-47f8d4c1927d" (UID: "51f938c6-86eb-414f-b3c7-47f8d4c1927d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.319456 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51f938c6-86eb-414f-b3c7-47f8d4c1927d-config-data" (OuterVolumeSpecName: "config-data") pod "51f938c6-86eb-414f-b3c7-47f8d4c1927d" (UID: "51f938c6-86eb-414f-b3c7-47f8d4c1927d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.359644 4773 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/51f938c6-86eb-414f-b3c7-47f8d4c1927d-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.359690 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/51f938c6-86eb-414f-b3c7-47f8d4c1927d-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.359703 4773 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/51f938c6-86eb-414f-b3c7-47f8d4c1927d-logs\") on node \"crc\" DevicePath \"\"" Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.359713 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p6ljt\" (UniqueName: \"kubernetes.io/projected/51f938c6-86eb-414f-b3c7-47f8d4c1927d-kube-api-access-p6ljt\") on node \"crc\" DevicePath \"\"" Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.359727 4773 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/51f938c6-86eb-414f-b3c7-47f8d4c1927d-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.359737 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/51f938c6-86eb-414f-b3c7-47f8d4c1927d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.378230 4773 generic.go:334] "Generic (PLEG): container finished" podID="51f938c6-86eb-414f-b3c7-47f8d4c1927d" containerID="e0b8bde6cba9595031b27c27fed37e6b7596e4359e7a163c349b38ba90a54a74" exitCode=0 Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.378267 4773 generic.go:334] "Generic (PLEG): 
container finished" podID="51f938c6-86eb-414f-b3c7-47f8d4c1927d" containerID="e7259183f58f03fbfa73300bc9dacdce72c8d47fef262a5f11e886177e1e4baa" exitCode=143 Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.379446 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.379517 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"51f938c6-86eb-414f-b3c7-47f8d4c1927d","Type":"ContainerDied","Data":"e0b8bde6cba9595031b27c27fed37e6b7596e4359e7a163c349b38ba90a54a74"} Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.379611 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"51f938c6-86eb-414f-b3c7-47f8d4c1927d","Type":"ContainerDied","Data":"e7259183f58f03fbfa73300bc9dacdce72c8d47fef262a5f11e886177e1e4baa"} Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.379647 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"51f938c6-86eb-414f-b3c7-47f8d4c1927d","Type":"ContainerDied","Data":"bfadeca77795b07ca8716eb0dc10af286da6021714d107d79cc84f7669939957"} Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.379667 4773 scope.go:117] "RemoveContainer" containerID="e0b8bde6cba9595031b27c27fed37e6b7596e4359e7a163c349b38ba90a54a74" Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.411495 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-api-0"] Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.429851 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-api-0"] Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.443227 4773 scope.go:117] "RemoveContainer" containerID="e7259183f58f03fbfa73300bc9dacdce72c8d47fef262a5f11e886177e1e4baa" Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.443972 4773 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack/manila-api-0"] Jan 20 19:20:42 crc kubenswrapper[4773]: E0120 19:20:42.444387 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51f938c6-86eb-414f-b3c7-47f8d4c1927d" containerName="manila-api" Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.444407 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="51f938c6-86eb-414f-b3c7-47f8d4c1927d" containerName="manila-api" Jan 20 19:20:42 crc kubenswrapper[4773]: E0120 19:20:42.444423 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51f938c6-86eb-414f-b3c7-47f8d4c1927d" containerName="manila-api-log" Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.444431 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="51f938c6-86eb-414f-b3c7-47f8d4c1927d" containerName="manila-api-log" Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.444651 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="51f938c6-86eb-414f-b3c7-47f8d4c1927d" containerName="manila-api-log" Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.444672 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="51f938c6-86eb-414f-b3c7-47f8d4c1927d" containerName="manila-api" Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.445711 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-api-0" Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.448213 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-manila-public-svc" Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.448448 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-api-config-data" Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.448610 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-manila-internal-svc" Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.463849 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.562719 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e180830-62c9-4473-9d6b-197fbe92af49-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"6e180830-62c9-4473-9d6b-197fbe92af49\") " pod="openstack/manila-api-0" Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.563120 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e180830-62c9-4473-9d6b-197fbe92af49-config-data\") pod \"manila-api-0\" (UID: \"6e180830-62c9-4473-9d6b-197fbe92af49\") " pod="openstack/manila-api-0" Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.563250 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e180830-62c9-4473-9d6b-197fbe92af49-internal-tls-certs\") pod \"manila-api-0\" (UID: \"6e180830-62c9-4473-9d6b-197fbe92af49\") " pod="openstack/manila-api-0" Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.563334 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-z9wz4\" (UniqueName: \"kubernetes.io/projected/6e180830-62c9-4473-9d6b-197fbe92af49-kube-api-access-z9wz4\") pod \"manila-api-0\" (UID: \"6e180830-62c9-4473-9d6b-197fbe92af49\") " pod="openstack/manila-api-0" Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.563420 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e180830-62c9-4473-9d6b-197fbe92af49-scripts\") pod \"manila-api-0\" (UID: \"6e180830-62c9-4473-9d6b-197fbe92af49\") " pod="openstack/manila-api-0" Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.563464 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6e180830-62c9-4473-9d6b-197fbe92af49-etc-machine-id\") pod \"manila-api-0\" (UID: \"6e180830-62c9-4473-9d6b-197fbe92af49\") " pod="openstack/manila-api-0" Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.563524 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6e180830-62c9-4473-9d6b-197fbe92af49-logs\") pod \"manila-api-0\" (UID: \"6e180830-62c9-4473-9d6b-197fbe92af49\") " pod="openstack/manila-api-0" Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.563592 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6e180830-62c9-4473-9d6b-197fbe92af49-config-data-custom\") pod \"manila-api-0\" (UID: \"6e180830-62c9-4473-9d6b-197fbe92af49\") " pod="openstack/manila-api-0" Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.563624 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e180830-62c9-4473-9d6b-197fbe92af49-public-tls-certs\") pod \"manila-api-0\" 
(UID: \"6e180830-62c9-4473-9d6b-197fbe92af49\") " pod="openstack/manila-api-0" Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.666481 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6e180830-62c9-4473-9d6b-197fbe92af49-config-data-custom\") pod \"manila-api-0\" (UID: \"6e180830-62c9-4473-9d6b-197fbe92af49\") " pod="openstack/manila-api-0" Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.666518 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e180830-62c9-4473-9d6b-197fbe92af49-public-tls-certs\") pod \"manila-api-0\" (UID: \"6e180830-62c9-4473-9d6b-197fbe92af49\") " pod="openstack/manila-api-0" Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.666593 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e180830-62c9-4473-9d6b-197fbe92af49-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"6e180830-62c9-4473-9d6b-197fbe92af49\") " pod="openstack/manila-api-0" Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.666620 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e180830-62c9-4473-9d6b-197fbe92af49-config-data\") pod \"manila-api-0\" (UID: \"6e180830-62c9-4473-9d6b-197fbe92af49\") " pod="openstack/manila-api-0" Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.666664 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e180830-62c9-4473-9d6b-197fbe92af49-internal-tls-certs\") pod \"manila-api-0\" (UID: \"6e180830-62c9-4473-9d6b-197fbe92af49\") " pod="openstack/manila-api-0" Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.666701 4773 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-z9wz4\" (UniqueName: \"kubernetes.io/projected/6e180830-62c9-4473-9d6b-197fbe92af49-kube-api-access-z9wz4\") pod \"manila-api-0\" (UID: \"6e180830-62c9-4473-9d6b-197fbe92af49\") " pod="openstack/manila-api-0" Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.666727 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e180830-62c9-4473-9d6b-197fbe92af49-scripts\") pod \"manila-api-0\" (UID: \"6e180830-62c9-4473-9d6b-197fbe92af49\") " pod="openstack/manila-api-0" Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.666748 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6e180830-62c9-4473-9d6b-197fbe92af49-etc-machine-id\") pod \"manila-api-0\" (UID: \"6e180830-62c9-4473-9d6b-197fbe92af49\") " pod="openstack/manila-api-0" Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.666770 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6e180830-62c9-4473-9d6b-197fbe92af49-logs\") pod \"manila-api-0\" (UID: \"6e180830-62c9-4473-9d6b-197fbe92af49\") " pod="openstack/manila-api-0" Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.667198 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6e180830-62c9-4473-9d6b-197fbe92af49-logs\") pod \"manila-api-0\" (UID: \"6e180830-62c9-4473-9d6b-197fbe92af49\") " pod="openstack/manila-api-0" Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.670793 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/6e180830-62c9-4473-9d6b-197fbe92af49-etc-machine-id\") pod \"manila-api-0\" (UID: \"6e180830-62c9-4473-9d6b-197fbe92af49\") " pod="openstack/manila-api-0" Jan 20 19:20:42 crc kubenswrapper[4773]: 
I0120 19:20:42.672637 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/6e180830-62c9-4473-9d6b-197fbe92af49-config-data-custom\") pod \"manila-api-0\" (UID: \"6e180830-62c9-4473-9d6b-197fbe92af49\") " pod="openstack/manila-api-0" Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.673021 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e180830-62c9-4473-9d6b-197fbe92af49-internal-tls-certs\") pod \"manila-api-0\" (UID: \"6e180830-62c9-4473-9d6b-197fbe92af49\") " pod="openstack/manila-api-0" Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.674516 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e180830-62c9-4473-9d6b-197fbe92af49-public-tls-certs\") pod \"manila-api-0\" (UID: \"6e180830-62c9-4473-9d6b-197fbe92af49\") " pod="openstack/manila-api-0" Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.675087 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e180830-62c9-4473-9d6b-197fbe92af49-combined-ca-bundle\") pod \"manila-api-0\" (UID: \"6e180830-62c9-4473-9d6b-197fbe92af49\") " pod="openstack/manila-api-0" Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.678108 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e180830-62c9-4473-9d6b-197fbe92af49-scripts\") pod \"manila-api-0\" (UID: \"6e180830-62c9-4473-9d6b-197fbe92af49\") " pod="openstack/manila-api-0" Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.687399 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6e180830-62c9-4473-9d6b-197fbe92af49-config-data\") pod \"manila-api-0\" (UID: 
\"6e180830-62c9-4473-9d6b-197fbe92af49\") " pod="openstack/manila-api-0" Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.691145 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9wz4\" (UniqueName: \"kubernetes.io/projected/6e180830-62c9-4473-9d6b-197fbe92af49-kube-api-access-z9wz4\") pod \"manila-api-0\" (UID: \"6e180830-62c9-4473-9d6b-197fbe92af49\") " pod="openstack/manila-api-0" Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.767704 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-api-0" Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.842564 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-6rbwt" Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.842683 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-6rbwt" Jan 20 19:20:42 crc kubenswrapper[4773]: I0120 19:20:42.905038 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-6rbwt" Jan 20 19:20:43 crc kubenswrapper[4773]: I0120 19:20:43.061843 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 20 19:20:43 crc kubenswrapper[4773]: I0120 19:20:43.062470 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="020e7117-149f-4a0d-aa81-a324df9db850" containerName="sg-core" containerID="cri-o://f09054a284067e5b25f62a6df3ceace720d880ef94266ed85a89680db580bec0" gracePeriod=30 Jan 20 19:20:43 crc kubenswrapper[4773]: I0120 19:20:43.062579 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="020e7117-149f-4a0d-aa81-a324df9db850" containerName="proxy-httpd" containerID="cri-o://6a115b4e7aa3521d1dcf6d1b5cdf12eb1355072e7aae73459e22f14c9325fda6" 
gracePeriod=30 Jan 20 19:20:43 crc kubenswrapper[4773]: I0120 19:20:43.062638 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="020e7117-149f-4a0d-aa81-a324df9db850" containerName="ceilometer-notification-agent" containerID="cri-o://52e37d56e55649463d21a5ee03a0affc59f9b6f7acea87d0e99f08561b5bb305" gracePeriod=30 Jan 20 19:20:43 crc kubenswrapper[4773]: I0120 19:20:43.062162 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="020e7117-149f-4a0d-aa81-a324df9db850" containerName="ceilometer-central-agent" containerID="cri-o://37639be04fa67337c4343ed582b7cba817862498c8f653737abe8b0ad324ee80" gracePeriod=30 Jan 20 19:20:43 crc kubenswrapper[4773]: I0120 19:20:43.393255 4773 generic.go:334] "Generic (PLEG): container finished" podID="020e7117-149f-4a0d-aa81-a324df9db850" containerID="6a115b4e7aa3521d1dcf6d1b5cdf12eb1355072e7aae73459e22f14c9325fda6" exitCode=0 Jan 20 19:20:43 crc kubenswrapper[4773]: I0120 19:20:43.393666 4773 generic.go:334] "Generic (PLEG): container finished" podID="020e7117-149f-4a0d-aa81-a324df9db850" containerID="f09054a284067e5b25f62a6df3ceace720d880ef94266ed85a89680db580bec0" exitCode=2 Jan 20 19:20:43 crc kubenswrapper[4773]: I0120 19:20:43.393712 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"020e7117-149f-4a0d-aa81-a324df9db850","Type":"ContainerDied","Data":"6a115b4e7aa3521d1dcf6d1b5cdf12eb1355072e7aae73459e22f14c9325fda6"} Jan 20 19:20:43 crc kubenswrapper[4773]: I0120 19:20:43.393801 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"020e7117-149f-4a0d-aa81-a324df9db850","Type":"ContainerDied","Data":"f09054a284067e5b25f62a6df3ceace720d880ef94266ed85a89680db580bec0"} Jan 20 19:20:43 crc kubenswrapper[4773]: I0120 19:20:43.447538 4773 scope.go:117] "RemoveContainer" 
containerID="11935bc20d3c37ba36a8cfe2c5234ee70ed45d973936da23fb9892ece635d019" Jan 20 19:20:43 crc kubenswrapper[4773]: E0120 19:20:43.447854 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:20:43 crc kubenswrapper[4773]: I0120 19:20:43.466300 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51f938c6-86eb-414f-b3c7-47f8d4c1927d" path="/var/lib/kubelet/pods/51f938c6-86eb-414f-b3c7-47f8d4c1927d/volumes" Jan 20 19:20:43 crc kubenswrapper[4773]: I0120 19:20:43.466974 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-6rbwt" Jan 20 19:20:44 crc kubenswrapper[4773]: I0120 19:20:44.430521 4773 generic.go:334] "Generic (PLEG): container finished" podID="020e7117-149f-4a0d-aa81-a324df9db850" containerID="37639be04fa67337c4343ed582b7cba817862498c8f653737abe8b0ad324ee80" exitCode=0 Jan 20 19:20:44 crc kubenswrapper[4773]: I0120 19:20:44.430561 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"020e7117-149f-4a0d-aa81-a324df9db850","Type":"ContainerDied","Data":"37639be04fa67337c4343ed582b7cba817862498c8f653737abe8b0ad324ee80"} Jan 20 19:20:45 crc kubenswrapper[4773]: I0120 19:20:45.099844 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6rbwt"] Jan 20 19:20:45 crc kubenswrapper[4773]: I0120 19:20:45.438794 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-6rbwt" podUID="b74b1a0f-c97d-4491-94a5-4429140cd990" 
containerName="registry-server" containerID="cri-o://b8c0fb2789b63e3dd31b5f19a75f566cae4590359ccc3804cd2893038fbb867c" gracePeriod=2 Jan 20 19:20:45 crc kubenswrapper[4773]: I0120 19:20:45.785406 4773 scope.go:117] "RemoveContainer" containerID="e0b8bde6cba9595031b27c27fed37e6b7596e4359e7a163c349b38ba90a54a74" Jan 20 19:20:45 crc kubenswrapper[4773]: E0120 19:20:45.786119 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0b8bde6cba9595031b27c27fed37e6b7596e4359e7a163c349b38ba90a54a74\": container with ID starting with e0b8bde6cba9595031b27c27fed37e6b7596e4359e7a163c349b38ba90a54a74 not found: ID does not exist" containerID="e0b8bde6cba9595031b27c27fed37e6b7596e4359e7a163c349b38ba90a54a74" Jan 20 19:20:45 crc kubenswrapper[4773]: I0120 19:20:45.786151 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0b8bde6cba9595031b27c27fed37e6b7596e4359e7a163c349b38ba90a54a74"} err="failed to get container status \"e0b8bde6cba9595031b27c27fed37e6b7596e4359e7a163c349b38ba90a54a74\": rpc error: code = NotFound desc = could not find container \"e0b8bde6cba9595031b27c27fed37e6b7596e4359e7a163c349b38ba90a54a74\": container with ID starting with e0b8bde6cba9595031b27c27fed37e6b7596e4359e7a163c349b38ba90a54a74 not found: ID does not exist" Jan 20 19:20:45 crc kubenswrapper[4773]: I0120 19:20:45.786175 4773 scope.go:117] "RemoveContainer" containerID="e7259183f58f03fbfa73300bc9dacdce72c8d47fef262a5f11e886177e1e4baa" Jan 20 19:20:45 crc kubenswrapper[4773]: E0120 19:20:45.786405 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7259183f58f03fbfa73300bc9dacdce72c8d47fef262a5f11e886177e1e4baa\": container with ID starting with e7259183f58f03fbfa73300bc9dacdce72c8d47fef262a5f11e886177e1e4baa not found: ID does not exist" 
containerID="e7259183f58f03fbfa73300bc9dacdce72c8d47fef262a5f11e886177e1e4baa" Jan 20 19:20:45 crc kubenswrapper[4773]: I0120 19:20:45.786433 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7259183f58f03fbfa73300bc9dacdce72c8d47fef262a5f11e886177e1e4baa"} err="failed to get container status \"e7259183f58f03fbfa73300bc9dacdce72c8d47fef262a5f11e886177e1e4baa\": rpc error: code = NotFound desc = could not find container \"e7259183f58f03fbfa73300bc9dacdce72c8d47fef262a5f11e886177e1e4baa\": container with ID starting with e7259183f58f03fbfa73300bc9dacdce72c8d47fef262a5f11e886177e1e4baa not found: ID does not exist" Jan 20 19:20:45 crc kubenswrapper[4773]: I0120 19:20:45.786446 4773 scope.go:117] "RemoveContainer" containerID="e0b8bde6cba9595031b27c27fed37e6b7596e4359e7a163c349b38ba90a54a74" Jan 20 19:20:45 crc kubenswrapper[4773]: I0120 19:20:45.786725 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0b8bde6cba9595031b27c27fed37e6b7596e4359e7a163c349b38ba90a54a74"} err="failed to get container status \"e0b8bde6cba9595031b27c27fed37e6b7596e4359e7a163c349b38ba90a54a74\": rpc error: code = NotFound desc = could not find container \"e0b8bde6cba9595031b27c27fed37e6b7596e4359e7a163c349b38ba90a54a74\": container with ID starting with e0b8bde6cba9595031b27c27fed37e6b7596e4359e7a163c349b38ba90a54a74 not found: ID does not exist" Jan 20 19:20:45 crc kubenswrapper[4773]: I0120 19:20:45.786773 4773 scope.go:117] "RemoveContainer" containerID="e7259183f58f03fbfa73300bc9dacdce72c8d47fef262a5f11e886177e1e4baa" Jan 20 19:20:45 crc kubenswrapper[4773]: I0120 19:20:45.790559 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7259183f58f03fbfa73300bc9dacdce72c8d47fef262a5f11e886177e1e4baa"} err="failed to get container status \"e7259183f58f03fbfa73300bc9dacdce72c8d47fef262a5f11e886177e1e4baa\": rpc error: code = NotFound desc = could 
not find container \"e7259183f58f03fbfa73300bc9dacdce72c8d47fef262a5f11e886177e1e4baa\": container with ID starting with e7259183f58f03fbfa73300bc9dacdce72c8d47fef262a5f11e886177e1e4baa not found: ID does not exist" Jan 20 19:20:46 crc kubenswrapper[4773]: I0120 19:20:46.449007 4773 generic.go:334] "Generic (PLEG): container finished" podID="b74b1a0f-c97d-4491-94a5-4429140cd990" containerID="b8c0fb2789b63e3dd31b5f19a75f566cae4590359ccc3804cd2893038fbb867c" exitCode=0 Jan 20 19:20:46 crc kubenswrapper[4773]: I0120 19:20:46.449105 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6rbwt" event={"ID":"b74b1a0f-c97d-4491-94a5-4429140cd990","Type":"ContainerDied","Data":"b8c0fb2789b63e3dd31b5f19a75f566cae4590359ccc3804cd2893038fbb867c"} Jan 20 19:20:46 crc kubenswrapper[4773]: I0120 19:20:46.477700 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-api-0"] Jan 20 19:20:46 crc kubenswrapper[4773]: I0120 19:20:46.952913 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-6rbwt" Jan 20 19:20:47 crc kubenswrapper[4773]: I0120 19:20:46.999366 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-scheduler-0" Jan 20 19:20:47 crc kubenswrapper[4773]: I0120 19:20:47.065017 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnlgz\" (UniqueName: \"kubernetes.io/projected/b74b1a0f-c97d-4491-94a5-4429140cd990-kube-api-access-rnlgz\") pod \"b74b1a0f-c97d-4491-94a5-4429140cd990\" (UID: \"b74b1a0f-c97d-4491-94a5-4429140cd990\") " Jan 20 19:20:47 crc kubenswrapper[4773]: I0120 19:20:47.065133 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b74b1a0f-c97d-4491-94a5-4429140cd990-catalog-content\") pod \"b74b1a0f-c97d-4491-94a5-4429140cd990\" (UID: \"b74b1a0f-c97d-4491-94a5-4429140cd990\") " Jan 20 19:20:47 crc kubenswrapper[4773]: I0120 19:20:47.065153 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b74b1a0f-c97d-4491-94a5-4429140cd990-utilities\") pod \"b74b1a0f-c97d-4491-94a5-4429140cd990\" (UID: \"b74b1a0f-c97d-4491-94a5-4429140cd990\") " Jan 20 19:20:47 crc kubenswrapper[4773]: I0120 19:20:47.067062 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b74b1a0f-c97d-4491-94a5-4429140cd990-utilities" (OuterVolumeSpecName: "utilities") pod "b74b1a0f-c97d-4491-94a5-4429140cd990" (UID: "b74b1a0f-c97d-4491-94a5-4429140cd990"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 19:20:47 crc kubenswrapper[4773]: I0120 19:20:47.087071 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b74b1a0f-c97d-4491-94a5-4429140cd990-kube-api-access-rnlgz" (OuterVolumeSpecName: "kube-api-access-rnlgz") pod "b74b1a0f-c97d-4491-94a5-4429140cd990" (UID: "b74b1a0f-c97d-4491-94a5-4429140cd990"). InnerVolumeSpecName "kube-api-access-rnlgz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 19:20:47 crc kubenswrapper[4773]: I0120 19:20:47.098359 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-76b5fdb995-d8qrf" Jan 20 19:20:47 crc kubenswrapper[4773]: I0120 19:20:47.136147 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b74b1a0f-c97d-4491-94a5-4429140cd990-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b74b1a0f-c97d-4491-94a5-4429140cd990" (UID: "b74b1a0f-c97d-4491-94a5-4429140cd990"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 19:20:47 crc kubenswrapper[4773]: I0120 19:20:47.169562 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnlgz\" (UniqueName: \"kubernetes.io/projected/b74b1a0f-c97d-4491-94a5-4429140cd990-kube-api-access-rnlgz\") on node \"crc\" DevicePath \"\"" Jan 20 19:20:47 crc kubenswrapper[4773]: I0120 19:20:47.169604 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b74b1a0f-c97d-4491-94a5-4429140cd990-utilities\") on node \"crc\" DevicePath \"\"" Jan 20 19:20:47 crc kubenswrapper[4773]: I0120 19:20:47.169617 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b74b1a0f-c97d-4491-94a5-4429140cd990-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 20 19:20:47 crc kubenswrapper[4773]: I0120 19:20:47.184946 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-864d5fc68c-42g4p"] Jan 20 19:20:47 crc kubenswrapper[4773]: I0120 19:20:47.185172 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-864d5fc68c-42g4p" podUID="ba4ab073-f712-41fb-9b44-d83a19b72973" containerName="dnsmasq-dns" containerID="cri-o://0db8dcf4f0eb56d128fff23942f9569af545eaf303471ed6ae63a9dec8023480" gracePeriod=10 Jan 20 19:20:47 crc kubenswrapper[4773]: I0120 19:20:47.507509 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-6rbwt" event={"ID":"b74b1a0f-c97d-4491-94a5-4429140cd990","Type":"ContainerDied","Data":"310cc51704b18b2aa90c2ac829fa16af6930efbf07c19a5e76754a3192b7d35b"} Jan 20 19:20:47 crc kubenswrapper[4773]: I0120 19:20:47.509306 4773 scope.go:117] "RemoveContainer" containerID="b8c0fb2789b63e3dd31b5f19a75f566cae4590359ccc3804cd2893038fbb867c" Jan 20 19:20:47 crc kubenswrapper[4773]: I0120 19:20:47.508190 4773 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/certified-operators-6rbwt" Jan 20 19:20:47 crc kubenswrapper[4773]: I0120 19:20:47.522225 4773 generic.go:334] "Generic (PLEG): container finished" podID="ba4ab073-f712-41fb-9b44-d83a19b72973" containerID="0db8dcf4f0eb56d128fff23942f9569af545eaf303471ed6ae63a9dec8023480" exitCode=0 Jan 20 19:20:47 crc kubenswrapper[4773]: I0120 19:20:47.524339 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-864d5fc68c-42g4p" event={"ID":"ba4ab073-f712-41fb-9b44-d83a19b72973","Type":"ContainerDied","Data":"0db8dcf4f0eb56d128fff23942f9569af545eaf303471ed6ae63a9dec8023480"} Jan 20 19:20:47 crc kubenswrapper[4773]: I0120 19:20:47.550382 4773 generic.go:334] "Generic (PLEG): container finished" podID="020e7117-149f-4a0d-aa81-a324df9db850" containerID="52e37d56e55649463d21a5ee03a0affc59f9b6f7acea87d0e99f08561b5bb305" exitCode=0 Jan 20 19:20:47 crc kubenswrapper[4773]: I0120 19:20:47.550459 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"020e7117-149f-4a0d-aa81-a324df9db850","Type":"ContainerDied","Data":"52e37d56e55649463d21a5ee03a0affc59f9b6f7acea87d0e99f08561b5bb305"} Jan 20 19:20:47 crc kubenswrapper[4773]: I0120 19:20:47.563528 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"6e180830-62c9-4473-9d6b-197fbe92af49","Type":"ContainerStarted","Data":"752f309be630a7024a48e3c7c1ac9dee313ddeae32bc1565e6b3ce4581064236"} Jan 20 19:20:47 crc kubenswrapper[4773]: I0120 19:20:47.563844 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"6e180830-62c9-4473-9d6b-197fbe92af49","Type":"ContainerStarted","Data":"c6753e6bfb6aac80cba0285abf73928e3b7f36843312135f77bb09ae13cf0264"} Jan 20 19:20:47 crc kubenswrapper[4773]: I0120 19:20:47.564305 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 20 19:20:47 crc kubenswrapper[4773]: I0120 19:20:47.572072 4773 scope.go:117] "RemoveContainer" containerID="4e61b2ec36fa13c02a3a29199aed50235c5ded5e007e514d5c73d81221f1e72c" Jan 20 19:20:47 crc kubenswrapper[4773]: I0120 19:20:47.603391 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-6rbwt"] Jan 20 19:20:47 crc kubenswrapper[4773]: I0120 19:20:47.617180 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-6rbwt"] Jan 20 19:20:47 crc kubenswrapper[4773]: I0120 19:20:47.631269 4773 scope.go:117] "RemoveContainer" containerID="0dcbbc8c658f47d7a18832e97e00c7168314f5b7519c1ce7c50ec20d81f74854" Jan 20 19:20:47 crc kubenswrapper[4773]: I0120 19:20:47.698208 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/020e7117-149f-4a0d-aa81-a324df9db850-config-data\") pod \"020e7117-149f-4a0d-aa81-a324df9db850\" (UID: \"020e7117-149f-4a0d-aa81-a324df9db850\") " Jan 20 19:20:47 crc kubenswrapper[4773]: I0120 19:20:47.698315 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qhs6b\" (UniqueName: \"kubernetes.io/projected/020e7117-149f-4a0d-aa81-a324df9db850-kube-api-access-qhs6b\") pod \"020e7117-149f-4a0d-aa81-a324df9db850\" (UID: \"020e7117-149f-4a0d-aa81-a324df9db850\") " Jan 20 19:20:47 crc kubenswrapper[4773]: I0120 19:20:47.698346 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/020e7117-149f-4a0d-aa81-a324df9db850-log-httpd\") pod \"020e7117-149f-4a0d-aa81-a324df9db850\" (UID: \"020e7117-149f-4a0d-aa81-a324df9db850\") " Jan 20 19:20:47 crc kubenswrapper[4773]: I0120 19:20:47.698376 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/020e7117-149f-4a0d-aa81-a324df9db850-run-httpd\") pod \"020e7117-149f-4a0d-aa81-a324df9db850\" (UID: \"020e7117-149f-4a0d-aa81-a324df9db850\") " Jan 20 19:20:47 crc kubenswrapper[4773]: I0120 19:20:47.698420 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/020e7117-149f-4a0d-aa81-a324df9db850-combined-ca-bundle\") pod \"020e7117-149f-4a0d-aa81-a324df9db850\" (UID: \"020e7117-149f-4a0d-aa81-a324df9db850\") " Jan 20 19:20:47 crc kubenswrapper[4773]: I0120 19:20:47.698447 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/020e7117-149f-4a0d-aa81-a324df9db850-ceilometer-tls-certs\") pod \"020e7117-149f-4a0d-aa81-a324df9db850\" (UID: \"020e7117-149f-4a0d-aa81-a324df9db850\") " Jan 20 19:20:47 crc kubenswrapper[4773]: I0120 19:20:47.698511 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/020e7117-149f-4a0d-aa81-a324df9db850-scripts\") pod \"020e7117-149f-4a0d-aa81-a324df9db850\" (UID: \"020e7117-149f-4a0d-aa81-a324df9db850\") " Jan 20 19:20:47 crc kubenswrapper[4773]: I0120 19:20:47.698564 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/020e7117-149f-4a0d-aa81-a324df9db850-sg-core-conf-yaml\") pod \"020e7117-149f-4a0d-aa81-a324df9db850\" (UID: \"020e7117-149f-4a0d-aa81-a324df9db850\") " Jan 20 19:20:47 crc kubenswrapper[4773]: I0120 19:20:47.702109 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/020e7117-149f-4a0d-aa81-a324df9db850-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "020e7117-149f-4a0d-aa81-a324df9db850" (UID: "020e7117-149f-4a0d-aa81-a324df9db850"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 19:20:47 crc kubenswrapper[4773]: I0120 19:20:47.708641 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/020e7117-149f-4a0d-aa81-a324df9db850-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "020e7117-149f-4a0d-aa81-a324df9db850" (UID: "020e7117-149f-4a0d-aa81-a324df9db850"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 19:20:47 crc kubenswrapper[4773]: I0120 19:20:47.720515 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/020e7117-149f-4a0d-aa81-a324df9db850-scripts" (OuterVolumeSpecName: "scripts") pod "020e7117-149f-4a0d-aa81-a324df9db850" (UID: "020e7117-149f-4a0d-aa81-a324df9db850"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:20:47 crc kubenswrapper[4773]: I0120 19:20:47.721153 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/020e7117-149f-4a0d-aa81-a324df9db850-kube-api-access-qhs6b" (OuterVolumeSpecName: "kube-api-access-qhs6b") pod "020e7117-149f-4a0d-aa81-a324df9db850" (UID: "020e7117-149f-4a0d-aa81-a324df9db850"). InnerVolumeSpecName "kube-api-access-qhs6b". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 19:20:47 crc kubenswrapper[4773]: I0120 19:20:47.739356 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/020e7117-149f-4a0d-aa81-a324df9db850-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "020e7117-149f-4a0d-aa81-a324df9db850" (UID: "020e7117-149f-4a0d-aa81-a324df9db850"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:20:47 crc kubenswrapper[4773]: I0120 19:20:47.800508 4773 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/020e7117-149f-4a0d-aa81-a324df9db850-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 19:20:47 crc kubenswrapper[4773]: I0120 19:20:47.800538 4773 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/020e7117-149f-4a0d-aa81-a324df9db850-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 19:20:47 crc kubenswrapper[4773]: I0120 19:20:47.800547 4773 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/020e7117-149f-4a0d-aa81-a324df9db850-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 19:20:47 crc kubenswrapper[4773]: I0120 19:20:47.800555 4773 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/020e7117-149f-4a0d-aa81-a324df9db850-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 20 19:20:47 crc kubenswrapper[4773]: I0120 19:20:47.800565 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qhs6b\" (UniqueName: \"kubernetes.io/projected/020e7117-149f-4a0d-aa81-a324df9db850-kube-api-access-qhs6b\") on node \"crc\" DevicePath \"\"" Jan 20 19:20:47 crc kubenswrapper[4773]: I0120 19:20:47.829360 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/020e7117-149f-4a0d-aa81-a324df9db850-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "020e7117-149f-4a0d-aa81-a324df9db850" (UID: "020e7117-149f-4a0d-aa81-a324df9db850"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:20:47 crc kubenswrapper[4773]: I0120 19:20:47.881342 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/020e7117-149f-4a0d-aa81-a324df9db850-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "020e7117-149f-4a0d-aa81-a324df9db850" (UID: "020e7117-149f-4a0d-aa81-a324df9db850"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:20:47 crc kubenswrapper[4773]: I0120 19:20:47.903223 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/020e7117-149f-4a0d-aa81-a324df9db850-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 19:20:47 crc kubenswrapper[4773]: I0120 19:20:47.903254 4773 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/020e7117-149f-4a0d-aa81-a324df9db850-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 19:20:47 crc kubenswrapper[4773]: I0120 19:20:47.943126 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/020e7117-149f-4a0d-aa81-a324df9db850-config-data" (OuterVolumeSpecName: "config-data") pod "020e7117-149f-4a0d-aa81-a324df9db850" (UID: "020e7117-149f-4a0d-aa81-a324df9db850"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.005209 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/020e7117-149f-4a0d-aa81-a324df9db850-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.008029 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-864d5fc68c-42g4p" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.106998 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m92nd\" (UniqueName: \"kubernetes.io/projected/ba4ab073-f712-41fb-9b44-d83a19b72973-kube-api-access-m92nd\") pod \"ba4ab073-f712-41fb-9b44-d83a19b72973\" (UID: \"ba4ab073-f712-41fb-9b44-d83a19b72973\") " Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.107183 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ba4ab073-f712-41fb-9b44-d83a19b72973-ovsdbserver-sb\") pod \"ba4ab073-f712-41fb-9b44-d83a19b72973\" (UID: \"ba4ab073-f712-41fb-9b44-d83a19b72973\") " Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.107300 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba4ab073-f712-41fb-9b44-d83a19b72973-config\") pod \"ba4ab073-f712-41fb-9b44-d83a19b72973\" (UID: \"ba4ab073-f712-41fb-9b44-d83a19b72973\") " Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.107340 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/ba4ab073-f712-41fb-9b44-d83a19b72973-openstack-edpm-ipam\") pod \"ba4ab073-f712-41fb-9b44-d83a19b72973\" (UID: \"ba4ab073-f712-41fb-9b44-d83a19b72973\") " Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.107365 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ba4ab073-f712-41fb-9b44-d83a19b72973-ovsdbserver-nb\") pod \"ba4ab073-f712-41fb-9b44-d83a19b72973\" (UID: \"ba4ab073-f712-41fb-9b44-d83a19b72973\") " Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.107420 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ba4ab073-f712-41fb-9b44-d83a19b72973-dns-svc\") pod \"ba4ab073-f712-41fb-9b44-d83a19b72973\" (UID: \"ba4ab073-f712-41fb-9b44-d83a19b72973\") " Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.133901 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ba4ab073-f712-41fb-9b44-d83a19b72973-kube-api-access-m92nd" (OuterVolumeSpecName: "kube-api-access-m92nd") pod "ba4ab073-f712-41fb-9b44-d83a19b72973" (UID: "ba4ab073-f712-41fb-9b44-d83a19b72973"). InnerVolumeSpecName "kube-api-access-m92nd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.181495 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba4ab073-f712-41fb-9b44-d83a19b72973-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "ba4ab073-f712-41fb-9b44-d83a19b72973" (UID: "ba4ab073-f712-41fb-9b44-d83a19b72973"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.187839 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba4ab073-f712-41fb-9b44-d83a19b72973-config" (OuterVolumeSpecName: "config") pod "ba4ab073-f712-41fb-9b44-d83a19b72973" (UID: "ba4ab073-f712-41fb-9b44-d83a19b72973"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.210017 4773 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba4ab073-f712-41fb-9b44-d83a19b72973-config\") on node \"crc\" DevicePath \"\"" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.210047 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m92nd\" (UniqueName: \"kubernetes.io/projected/ba4ab073-f712-41fb-9b44-d83a19b72973-kube-api-access-m92nd\") on node \"crc\" DevicePath \"\"" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.210059 4773 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/ba4ab073-f712-41fb-9b44-d83a19b72973-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.211622 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba4ab073-f712-41fb-9b44-d83a19b72973-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "ba4ab073-f712-41fb-9b44-d83a19b72973" (UID: "ba4ab073-f712-41fb-9b44-d83a19b72973"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.215163 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba4ab073-f712-41fb-9b44-d83a19b72973-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "ba4ab073-f712-41fb-9b44-d83a19b72973" (UID: "ba4ab073-f712-41fb-9b44-d83a19b72973"). InnerVolumeSpecName "openstack-edpm-ipam". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.234185 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ba4ab073-f712-41fb-9b44-d83a19b72973-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "ba4ab073-f712-41fb-9b44-d83a19b72973" (UID: "ba4ab073-f712-41fb-9b44-d83a19b72973"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.312493 4773 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/ba4ab073-f712-41fb-9b44-d83a19b72973-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.312528 4773 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/ba4ab073-f712-41fb-9b44-d83a19b72973-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.312539 4773 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/ba4ab073-f712-41fb-9b44-d83a19b72973-dns-svc\") on node \"crc\" DevicePath \"\"" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.576176 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"020e7117-149f-4a0d-aa81-a324df9db850","Type":"ContainerDied","Data":"69446a8d0d2a42de6e148590cfbcb0a1f5f08dfbfef8edbc94698b1b5257bf49"} Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.576543 4773 scope.go:117] "RemoveContainer" containerID="6a115b4e7aa3521d1dcf6d1b5cdf12eb1355072e7aae73459e22f14c9325fda6" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.576199 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.578583 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-api-0" event={"ID":"6e180830-62c9-4473-9d6b-197fbe92af49","Type":"ContainerStarted","Data":"e792910914167205c550decfabf0eb54ec47edaa41002fc235e15f1e3092cfe1"} Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.578782 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/manila-api-0" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.584166 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"03fcee75-e9f9-4ff2-a932-ad3cb7395d7d","Type":"ContainerStarted","Data":"5218faf8859144732559d39cbfc5f8bf6a362ff0d193e4e56cf27aa6333c88a2"} Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.584234 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"03fcee75-e9f9-4ff2-a932-ad3cb7395d7d","Type":"ContainerStarted","Data":"14aacb800b101e1a1fb0b9dfaf22056f26452eb3f76d7412ae0f62db3b1dfb0b"} Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.599924 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-864d5fc68c-42g4p" event={"ID":"ba4ab073-f712-41fb-9b44-d83a19b72973","Type":"ContainerDied","Data":"22400a6c79ad5ab061f9a63ebc6202a7ab7e2454b3e31cdcb85e885543c42eb5"} Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.600041 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-864d5fc68c-42g4p" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.609076 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-api-0" podStartSLOduration=6.609050824 podStartE2EDuration="6.609050824s" podCreationTimestamp="2026-01-20 19:20:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 19:20:48.604663158 +0000 UTC m=+3041.526476212" watchObservedRunningTime="2026-01-20 19:20:48.609050824 +0000 UTC m=+3041.530863848" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.633822 4773 scope.go:117] "RemoveContainer" containerID="f09054a284067e5b25f62a6df3ceace720d880ef94266ed85a89680db580bec0" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.650941 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-share-share1-0" podStartSLOduration=4.404382468 podStartE2EDuration="12.65090948s" podCreationTimestamp="2026-01-20 19:20:36 +0000 UTC" firstStartedPulling="2026-01-20 19:20:37.659294777 +0000 UTC m=+3030.581107801" lastFinishedPulling="2026-01-20 19:20:45.905821789 +0000 UTC m=+3038.827634813" observedRunningTime="2026-01-20 19:20:48.63980197 +0000 UTC m=+3041.561614994" watchObservedRunningTime="2026-01-20 19:20:48.65090948 +0000 UTC m=+3041.572722504" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.677173 4773 scope.go:117] "RemoveContainer" containerID="52e37d56e55649463d21a5ee03a0affc59f9b6f7acea87d0e99f08561b5bb305" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.705119 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-864d5fc68c-42g4p"] Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.722466 4773 scope.go:117] "RemoveContainer" containerID="37639be04fa67337c4343ed582b7cba817862498c8f653737abe8b0ad324ee80" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 
19:20:48.724227 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-864d5fc68c-42g4p"] Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.732683 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.742966 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.748674 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 20 19:20:48 crc kubenswrapper[4773]: E0120 19:20:48.749183 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="020e7117-149f-4a0d-aa81-a324df9db850" containerName="sg-core" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.749284 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="020e7117-149f-4a0d-aa81-a324df9db850" containerName="sg-core" Jan 20 19:20:48 crc kubenswrapper[4773]: E0120 19:20:48.749345 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba4ab073-f712-41fb-9b44-d83a19b72973" containerName="dnsmasq-dns" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.749402 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba4ab073-f712-41fb-9b44-d83a19b72973" containerName="dnsmasq-dns" Jan 20 19:20:48 crc kubenswrapper[4773]: E0120 19:20:48.749461 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ba4ab073-f712-41fb-9b44-d83a19b72973" containerName="init" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.749510 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="ba4ab073-f712-41fb-9b44-d83a19b72973" containerName="init" Jan 20 19:20:48 crc kubenswrapper[4773]: E0120 19:20:48.749565 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b74b1a0f-c97d-4491-94a5-4429140cd990" containerName="extract-utilities" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.749619 4773 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="b74b1a0f-c97d-4491-94a5-4429140cd990" containerName="extract-utilities" Jan 20 19:20:48 crc kubenswrapper[4773]: E0120 19:20:48.749685 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b74b1a0f-c97d-4491-94a5-4429140cd990" containerName="registry-server" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.749740 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="b74b1a0f-c97d-4491-94a5-4429140cd990" containerName="registry-server" Jan 20 19:20:48 crc kubenswrapper[4773]: E0120 19:20:48.749798 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b74b1a0f-c97d-4491-94a5-4429140cd990" containerName="extract-content" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.749867 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="b74b1a0f-c97d-4491-94a5-4429140cd990" containerName="extract-content" Jan 20 19:20:48 crc kubenswrapper[4773]: E0120 19:20:48.749944 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="020e7117-149f-4a0d-aa81-a324df9db850" containerName="ceilometer-central-agent" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.749998 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="020e7117-149f-4a0d-aa81-a324df9db850" containerName="ceilometer-central-agent" Jan 20 19:20:48 crc kubenswrapper[4773]: E0120 19:20:48.750070 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="020e7117-149f-4a0d-aa81-a324df9db850" containerName="proxy-httpd" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.750125 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="020e7117-149f-4a0d-aa81-a324df9db850" containerName="proxy-httpd" Jan 20 19:20:48 crc kubenswrapper[4773]: E0120 19:20:48.750183 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="020e7117-149f-4a0d-aa81-a324df9db850" containerName="ceilometer-notification-agent" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.750234 4773 
state_mem.go:107] "Deleted CPUSet assignment" podUID="020e7117-149f-4a0d-aa81-a324df9db850" containerName="ceilometer-notification-agent" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.750447 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="020e7117-149f-4a0d-aa81-a324df9db850" containerName="ceilometer-notification-agent" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.750529 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="020e7117-149f-4a0d-aa81-a324df9db850" containerName="proxy-httpd" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.750596 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="ba4ab073-f712-41fb-9b44-d83a19b72973" containerName="dnsmasq-dns" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.750653 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="b74b1a0f-c97d-4491-94a5-4429140cd990" containerName="registry-server" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.750707 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="020e7117-149f-4a0d-aa81-a324df9db850" containerName="sg-core" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.750770 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="020e7117-149f-4a0d-aa81-a324df9db850" containerName="ceilometer-central-agent" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.752531 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.756759 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.757882 4773 scope.go:117] "RemoveContainer" containerID="0db8dcf4f0eb56d128fff23942f9569af545eaf303471ed6ae63a9dec8023480" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.758169 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.761070 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.761663 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.797648 4773 scope.go:117] "RemoveContainer" containerID="f7aec563576030ac1c13e7cfb223eea2c4098b2ad34114c7a7b21eb120d4d273" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.827928 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a597841-2c16-4b79-8e39-a24ff2d90b49-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"1a597841-2c16-4b79-8e39-a24ff2d90b49\") " pod="openstack/ceilometer-0" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.828006 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a597841-2c16-4b79-8e39-a24ff2d90b49-log-httpd\") pod \"ceilometer-0\" (UID: \"1a597841-2c16-4b79-8e39-a24ff2d90b49\") " pod="openstack/ceilometer-0" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.828078 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" 
(UniqueName: \"kubernetes.io/empty-dir/1a597841-2c16-4b79-8e39-a24ff2d90b49-run-httpd\") pod \"ceilometer-0\" (UID: \"1a597841-2c16-4b79-8e39-a24ff2d90b49\") " pod="openstack/ceilometer-0" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.828209 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1a597841-2c16-4b79-8e39-a24ff2d90b49-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1a597841-2c16-4b79-8e39-a24ff2d90b49\") " pod="openstack/ceilometer-0" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.828243 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a597841-2c16-4b79-8e39-a24ff2d90b49-scripts\") pod \"ceilometer-0\" (UID: \"1a597841-2c16-4b79-8e39-a24ff2d90b49\") " pod="openstack/ceilometer-0" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.828345 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a597841-2c16-4b79-8e39-a24ff2d90b49-config-data\") pod \"ceilometer-0\" (UID: \"1a597841-2c16-4b79-8e39-a24ff2d90b49\") " pod="openstack/ceilometer-0" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.828416 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zs9t7\" (UniqueName: \"kubernetes.io/projected/1a597841-2c16-4b79-8e39-a24ff2d90b49-kube-api-access-zs9t7\") pod \"ceilometer-0\" (UID: \"1a597841-2c16-4b79-8e39-a24ff2d90b49\") " pod="openstack/ceilometer-0" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.828455 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a597841-2c16-4b79-8e39-a24ff2d90b49-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"1a597841-2c16-4b79-8e39-a24ff2d90b49\") " pod="openstack/ceilometer-0" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.930187 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a597841-2c16-4b79-8e39-a24ff2d90b49-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"1a597841-2c16-4b79-8e39-a24ff2d90b49\") " pod="openstack/ceilometer-0" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.930288 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a597841-2c16-4b79-8e39-a24ff2d90b49-log-httpd\") pod \"ceilometer-0\" (UID: \"1a597841-2c16-4b79-8e39-a24ff2d90b49\") " pod="openstack/ceilometer-0" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.930342 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a597841-2c16-4b79-8e39-a24ff2d90b49-run-httpd\") pod \"ceilometer-0\" (UID: \"1a597841-2c16-4b79-8e39-a24ff2d90b49\") " pod="openstack/ceilometer-0" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.930413 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1a597841-2c16-4b79-8e39-a24ff2d90b49-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1a597841-2c16-4b79-8e39-a24ff2d90b49\") " pod="openstack/ceilometer-0" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.930446 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a597841-2c16-4b79-8e39-a24ff2d90b49-scripts\") pod \"ceilometer-0\" (UID: \"1a597841-2c16-4b79-8e39-a24ff2d90b49\") " pod="openstack/ceilometer-0" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.930495 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/1a597841-2c16-4b79-8e39-a24ff2d90b49-config-data\") pod \"ceilometer-0\" (UID: \"1a597841-2c16-4b79-8e39-a24ff2d90b49\") " pod="openstack/ceilometer-0" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.930535 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zs9t7\" (UniqueName: \"kubernetes.io/projected/1a597841-2c16-4b79-8e39-a24ff2d90b49-kube-api-access-zs9t7\") pod \"ceilometer-0\" (UID: \"1a597841-2c16-4b79-8e39-a24ff2d90b49\") " pod="openstack/ceilometer-0" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.930565 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a597841-2c16-4b79-8e39-a24ff2d90b49-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1a597841-2c16-4b79-8e39-a24ff2d90b49\") " pod="openstack/ceilometer-0" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.931131 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a597841-2c16-4b79-8e39-a24ff2d90b49-log-httpd\") pod \"ceilometer-0\" (UID: \"1a597841-2c16-4b79-8e39-a24ff2d90b49\") " pod="openstack/ceilometer-0" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.931424 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a597841-2c16-4b79-8e39-a24ff2d90b49-run-httpd\") pod \"ceilometer-0\" (UID: \"1a597841-2c16-4b79-8e39-a24ff2d90b49\") " pod="openstack/ceilometer-0" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.936139 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1a597841-2c16-4b79-8e39-a24ff2d90b49-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"1a597841-2c16-4b79-8e39-a24ff2d90b49\") " pod="openstack/ceilometer-0" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 
19:20:48.936516 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a597841-2c16-4b79-8e39-a24ff2d90b49-config-data\") pod \"ceilometer-0\" (UID: \"1a597841-2c16-4b79-8e39-a24ff2d90b49\") " pod="openstack/ceilometer-0" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.936980 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a597841-2c16-4b79-8e39-a24ff2d90b49-scripts\") pod \"ceilometer-0\" (UID: \"1a597841-2c16-4b79-8e39-a24ff2d90b49\") " pod="openstack/ceilometer-0" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.937980 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a597841-2c16-4b79-8e39-a24ff2d90b49-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"1a597841-2c16-4b79-8e39-a24ff2d90b49\") " pod="openstack/ceilometer-0" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.947945 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a597841-2c16-4b79-8e39-a24ff2d90b49-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"1a597841-2c16-4b79-8e39-a24ff2d90b49\") " pod="openstack/ceilometer-0" Jan 20 19:20:48 crc kubenswrapper[4773]: I0120 19:20:48.952527 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zs9t7\" (UniqueName: \"kubernetes.io/projected/1a597841-2c16-4b79-8e39-a24ff2d90b49-kube-api-access-zs9t7\") pod \"ceilometer-0\" (UID: \"1a597841-2c16-4b79-8e39-a24ff2d90b49\") " pod="openstack/ceilometer-0" Jan 20 19:20:49 crc kubenswrapper[4773]: I0120 19:20:49.078661 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 20 19:20:49 crc kubenswrapper[4773]: I0120 19:20:49.460727 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="020e7117-149f-4a0d-aa81-a324df9db850" path="/var/lib/kubelet/pods/020e7117-149f-4a0d-aa81-a324df9db850/volumes" Jan 20 19:20:49 crc kubenswrapper[4773]: I0120 19:20:49.462164 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b74b1a0f-c97d-4491-94a5-4429140cd990" path="/var/lib/kubelet/pods/b74b1a0f-c97d-4491-94a5-4429140cd990/volumes" Jan 20 19:20:49 crc kubenswrapper[4773]: I0120 19:20:49.463477 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ba4ab073-f712-41fb-9b44-d83a19b72973" path="/var/lib/kubelet/pods/ba4ab073-f712-41fb-9b44-d83a19b72973/volumes" Jan 20 19:20:49 crc kubenswrapper[4773]: W0120 19:20:49.567834 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1a597841_2c16_4b79_8e39_a24ff2d90b49.slice/crio-6774a73a4d951c79a43b7e4562a94a61886a4da460da3e6bed28b7ceb8bb86b1 WatchSource:0}: Error finding container 6774a73a4d951c79a43b7e4562a94a61886a4da460da3e6bed28b7ceb8bb86b1: Status 404 returned error can't find the container with id 6774a73a4d951c79a43b7e4562a94a61886a4da460da3e6bed28b7ceb8bb86b1 Jan 20 19:20:49 crc kubenswrapper[4773]: I0120 19:20:49.570389 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 20 19:20:49 crc kubenswrapper[4773]: I0120 19:20:49.610597 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1a597841-2c16-4b79-8e39-a24ff2d90b49","Type":"ContainerStarted","Data":"6774a73a4d951c79a43b7e4562a94a61886a4da460da3e6bed28b7ceb8bb86b1"} Jan 20 19:20:50 crc kubenswrapper[4773]: I0120 19:20:50.073802 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 20 19:20:50 crc kubenswrapper[4773]: I0120 
19:20:50.620137 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1a597841-2c16-4b79-8e39-a24ff2d90b49","Type":"ContainerStarted","Data":"44bfa4d5ee73c6110abe9d9ffa062115b661f9392f73893cfdf5ef024aa827b7"} Jan 20 19:20:51 crc kubenswrapper[4773]: I0120 19:20:51.629589 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1a597841-2c16-4b79-8e39-a24ff2d90b49","Type":"ContainerStarted","Data":"747d09b90e2ab57cad81c237ca27387dacfc078b498ae7776c8c22a03bb31eaa"} Jan 20 19:20:52 crc kubenswrapper[4773]: I0120 19:20:52.640579 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1a597841-2c16-4b79-8e39-a24ff2d90b49","Type":"ContainerStarted","Data":"19c82c253df944f670efa2fc1a28cafae34dbbb9d05d10020f22864b289378b2"} Jan 20 19:20:54 crc kubenswrapper[4773]: I0120 19:20:54.447258 4773 scope.go:117] "RemoveContainer" containerID="11935bc20d3c37ba36a8cfe2c5234ee70ed45d973936da23fb9892ece635d019" Jan 20 19:20:54 crc kubenswrapper[4773]: E0120 19:20:54.447757 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:20:56 crc kubenswrapper[4773]: I0120 19:20:56.950828 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0" Jan 20 19:20:58 crc kubenswrapper[4773]: I0120 19:20:58.489957 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0" Jan 20 19:20:58 crc kubenswrapper[4773]: I0120 19:20:58.549357 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/manila-scheduler-0"] Jan 20 19:20:58 crc kubenswrapper[4773]: I0120 19:20:58.695950 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-scheduler-0" podUID="8ac3cbb7-870d-49e0-b7f2-0996320eeea8" containerName="probe" containerID="cri-o://a8a377ee9c38ba3729bf7b56c91b6b6c2ff5e50fe5b486abb78d39d1eac11d4e" gracePeriod=30 Jan 20 19:20:58 crc kubenswrapper[4773]: I0120 19:20:58.695815 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-scheduler-0" podUID="8ac3cbb7-870d-49e0-b7f2-0996320eeea8" containerName="manila-scheduler" containerID="cri-o://e9b62705ca1a66cbca043ba3d1110685eab3ba1068bed7cdd0d78e68260c6d65" gracePeriod=30 Jan 20 19:20:59 crc kubenswrapper[4773]: I0120 19:20:59.706882 4773 generic.go:334] "Generic (PLEG): container finished" podID="8ac3cbb7-870d-49e0-b7f2-0996320eeea8" containerID="a8a377ee9c38ba3729bf7b56c91b6b6c2ff5e50fe5b486abb78d39d1eac11d4e" exitCode=0 Jan 20 19:20:59 crc kubenswrapper[4773]: I0120 19:20:59.706963 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"8ac3cbb7-870d-49e0-b7f2-0996320eeea8","Type":"ContainerDied","Data":"a8a377ee9c38ba3729bf7b56c91b6b6c2ff5e50fe5b486abb78d39d1eac11d4e"} Jan 20 19:21:02 crc kubenswrapper[4773]: I0120 19:21:02.734346 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1a597841-2c16-4b79-8e39-a24ff2d90b49","Type":"ContainerStarted","Data":"d454b4e0cc1684b94a32c76b0cffae3b33d8086e5f83434f186841707ec4e2f5"} Jan 20 19:21:02 crc kubenswrapper[4773]: I0120 19:21:02.734970 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 20 19:21:02 crc kubenswrapper[4773]: I0120 19:21:02.734483 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1a597841-2c16-4b79-8e39-a24ff2d90b49" 
containerName="ceilometer-central-agent" containerID="cri-o://44bfa4d5ee73c6110abe9d9ffa062115b661f9392f73893cfdf5ef024aa827b7" gracePeriod=30 Jan 20 19:21:02 crc kubenswrapper[4773]: I0120 19:21:02.734611 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1a597841-2c16-4b79-8e39-a24ff2d90b49" containerName="ceilometer-notification-agent" containerID="cri-o://747d09b90e2ab57cad81c237ca27387dacfc078b498ae7776c8c22a03bb31eaa" gracePeriod=30 Jan 20 19:21:02 crc kubenswrapper[4773]: I0120 19:21:02.734627 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1a597841-2c16-4b79-8e39-a24ff2d90b49" containerName="proxy-httpd" containerID="cri-o://d454b4e0cc1684b94a32c76b0cffae3b33d8086e5f83434f186841707ec4e2f5" gracePeriod=30 Jan 20 19:21:02 crc kubenswrapper[4773]: I0120 19:21:02.734598 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="1a597841-2c16-4b79-8e39-a24ff2d90b49" containerName="sg-core" containerID="cri-o://19c82c253df944f670efa2fc1a28cafae34dbbb9d05d10020f22864b289378b2" gracePeriod=30 Jan 20 19:21:02 crc kubenswrapper[4773]: I0120 19:21:02.769308 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.507948975 podStartE2EDuration="14.769284443s" podCreationTimestamp="2026-01-20 19:20:48 +0000 UTC" firstStartedPulling="2026-01-20 19:20:49.570529041 +0000 UTC m=+3042.492342065" lastFinishedPulling="2026-01-20 19:21:01.831864509 +0000 UTC m=+3054.753677533" observedRunningTime="2026-01-20 19:21:02.762476978 +0000 UTC m=+3055.684290002" watchObservedRunningTime="2026-01-20 19:21:02.769284443 +0000 UTC m=+3055.691097477" Jan 20 19:21:03 crc kubenswrapper[4773]: I0120 19:21:03.753634 4773 generic.go:334] "Generic (PLEG): container finished" podID="8ac3cbb7-870d-49e0-b7f2-0996320eeea8" 
containerID="e9b62705ca1a66cbca043ba3d1110685eab3ba1068bed7cdd0d78e68260c6d65" exitCode=0 Jan 20 19:21:03 crc kubenswrapper[4773]: I0120 19:21:03.753724 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"8ac3cbb7-870d-49e0-b7f2-0996320eeea8","Type":"ContainerDied","Data":"e9b62705ca1a66cbca043ba3d1110685eab3ba1068bed7cdd0d78e68260c6d65"} Jan 20 19:21:03 crc kubenswrapper[4773]: I0120 19:21:03.757107 4773 generic.go:334] "Generic (PLEG): container finished" podID="1a597841-2c16-4b79-8e39-a24ff2d90b49" containerID="d454b4e0cc1684b94a32c76b0cffae3b33d8086e5f83434f186841707ec4e2f5" exitCode=0 Jan 20 19:21:03 crc kubenswrapper[4773]: I0120 19:21:03.757126 4773 generic.go:334] "Generic (PLEG): container finished" podID="1a597841-2c16-4b79-8e39-a24ff2d90b49" containerID="19c82c253df944f670efa2fc1a28cafae34dbbb9d05d10020f22864b289378b2" exitCode=2 Jan 20 19:21:03 crc kubenswrapper[4773]: I0120 19:21:03.757134 4773 generic.go:334] "Generic (PLEG): container finished" podID="1a597841-2c16-4b79-8e39-a24ff2d90b49" containerID="747d09b90e2ab57cad81c237ca27387dacfc078b498ae7776c8c22a03bb31eaa" exitCode=0 Jan 20 19:21:03 crc kubenswrapper[4773]: I0120 19:21:03.757141 4773 generic.go:334] "Generic (PLEG): container finished" podID="1a597841-2c16-4b79-8e39-a24ff2d90b49" containerID="44bfa4d5ee73c6110abe9d9ffa062115b661f9392f73893cfdf5ef024aa827b7" exitCode=0 Jan 20 19:21:03 crc kubenswrapper[4773]: I0120 19:21:03.757158 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1a597841-2c16-4b79-8e39-a24ff2d90b49","Type":"ContainerDied","Data":"d454b4e0cc1684b94a32c76b0cffae3b33d8086e5f83434f186841707ec4e2f5"} Jan 20 19:21:03 crc kubenswrapper[4773]: I0120 19:21:03.757175 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"1a597841-2c16-4b79-8e39-a24ff2d90b49","Type":"ContainerDied","Data":"19c82c253df944f670efa2fc1a28cafae34dbbb9d05d10020f22864b289378b2"} Jan 20 19:21:03 crc kubenswrapper[4773]: I0120 19:21:03.757185 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1a597841-2c16-4b79-8e39-a24ff2d90b49","Type":"ContainerDied","Data":"747d09b90e2ab57cad81c237ca27387dacfc078b498ae7776c8c22a03bb31eaa"} Jan 20 19:21:03 crc kubenswrapper[4773]: I0120 19:21:03.757194 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1a597841-2c16-4b79-8e39-a24ff2d90b49","Type":"ContainerDied","Data":"44bfa4d5ee73c6110abe9d9ffa062115b661f9392f73893cfdf5ef024aa827b7"} Jan 20 19:21:03 crc kubenswrapper[4773]: I0120 19:21:03.994210 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.000237 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.165649 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/manila-api-0" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.167118 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zs9t7\" (UniqueName: \"kubernetes.io/projected/1a597841-2c16-4b79-8e39-a24ff2d90b49-kube-api-access-zs9t7\") pod \"1a597841-2c16-4b79-8e39-a24ff2d90b49\" (UID: \"1a597841-2c16-4b79-8e39-a24ff2d90b49\") " Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.167353 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a597841-2c16-4b79-8e39-a24ff2d90b49-scripts\") pod \"1a597841-2c16-4b79-8e39-a24ff2d90b49\" (UID: \"1a597841-2c16-4b79-8e39-a24ff2d90b49\") " Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.167469 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ckqfk\" (UniqueName: \"kubernetes.io/projected/8ac3cbb7-870d-49e0-b7f2-0996320eeea8-kube-api-access-ckqfk\") pod \"8ac3cbb7-870d-49e0-b7f2-0996320eeea8\" (UID: \"8ac3cbb7-870d-49e0-b7f2-0996320eeea8\") " Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.167955 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ac3cbb7-870d-49e0-b7f2-0996320eeea8-combined-ca-bundle\") pod \"8ac3cbb7-870d-49e0-b7f2-0996320eeea8\" (UID: \"8ac3cbb7-870d-49e0-b7f2-0996320eeea8\") " Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.168006 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1a597841-2c16-4b79-8e39-a24ff2d90b49-sg-core-conf-yaml\") pod \"1a597841-2c16-4b79-8e39-a24ff2d90b49\" (UID: 
\"1a597841-2c16-4b79-8e39-a24ff2d90b49\") " Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.168149 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a597841-2c16-4b79-8e39-a24ff2d90b49-run-httpd\") pod \"1a597841-2c16-4b79-8e39-a24ff2d90b49\" (UID: \"1a597841-2c16-4b79-8e39-a24ff2d90b49\") " Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.168183 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a597841-2c16-4b79-8e39-a24ff2d90b49-config-data\") pod \"1a597841-2c16-4b79-8e39-a24ff2d90b49\" (UID: \"1a597841-2c16-4b79-8e39-a24ff2d90b49\") " Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.168234 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a597841-2c16-4b79-8e39-a24ff2d90b49-combined-ca-bundle\") pod \"1a597841-2c16-4b79-8e39-a24ff2d90b49\" (UID: \"1a597841-2c16-4b79-8e39-a24ff2d90b49\") " Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.168260 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ac3cbb7-870d-49e0-b7f2-0996320eeea8-scripts\") pod \"8ac3cbb7-870d-49e0-b7f2-0996320eeea8\" (UID: \"8ac3cbb7-870d-49e0-b7f2-0996320eeea8\") " Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.168327 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8ac3cbb7-870d-49e0-b7f2-0996320eeea8-etc-machine-id\") pod \"8ac3cbb7-870d-49e0-b7f2-0996320eeea8\" (UID: \"8ac3cbb7-870d-49e0-b7f2-0996320eeea8\") " Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.168377 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/1a597841-2c16-4b79-8e39-a24ff2d90b49-ceilometer-tls-certs\") pod \"1a597841-2c16-4b79-8e39-a24ff2d90b49\" (UID: \"1a597841-2c16-4b79-8e39-a24ff2d90b49\") " Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.168420 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ac3cbb7-870d-49e0-b7f2-0996320eeea8-config-data\") pod \"8ac3cbb7-870d-49e0-b7f2-0996320eeea8\" (UID: \"8ac3cbb7-870d-49e0-b7f2-0996320eeea8\") " Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.168462 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8ac3cbb7-870d-49e0-b7f2-0996320eeea8-config-data-custom\") pod \"8ac3cbb7-870d-49e0-b7f2-0996320eeea8\" (UID: \"8ac3cbb7-870d-49e0-b7f2-0996320eeea8\") " Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.168498 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a597841-2c16-4b79-8e39-a24ff2d90b49-log-httpd\") pod \"1a597841-2c16-4b79-8e39-a24ff2d90b49\" (UID: \"1a597841-2c16-4b79-8e39-a24ff2d90b49\") " Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.171245 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8ac3cbb7-870d-49e0-b7f2-0996320eeea8-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "8ac3cbb7-870d-49e0-b7f2-0996320eeea8" (UID: "8ac3cbb7-870d-49e0-b7f2-0996320eeea8"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.172174 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a597841-2c16-4b79-8e39-a24ff2d90b49-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "1a597841-2c16-4b79-8e39-a24ff2d90b49" (UID: "1a597841-2c16-4b79-8e39-a24ff2d90b49"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.172417 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a597841-2c16-4b79-8e39-a24ff2d90b49-scripts" (OuterVolumeSpecName: "scripts") pod "1a597841-2c16-4b79-8e39-a24ff2d90b49" (UID: "1a597841-2c16-4b79-8e39-a24ff2d90b49"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.173363 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1a597841-2c16-4b79-8e39-a24ff2d90b49-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "1a597841-2c16-4b79-8e39-a24ff2d90b49" (UID: "1a597841-2c16-4b79-8e39-a24ff2d90b49"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.173642 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a597841-2c16-4b79-8e39-a24ff2d90b49-kube-api-access-zs9t7" (OuterVolumeSpecName: "kube-api-access-zs9t7") pod "1a597841-2c16-4b79-8e39-a24ff2d90b49" (UID: "1a597841-2c16-4b79-8e39-a24ff2d90b49"). InnerVolumeSpecName "kube-api-access-zs9t7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.176113 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ac3cbb7-870d-49e0-b7f2-0996320eeea8-kube-api-access-ckqfk" (OuterVolumeSpecName: "kube-api-access-ckqfk") pod "8ac3cbb7-870d-49e0-b7f2-0996320eeea8" (UID: "8ac3cbb7-870d-49e0-b7f2-0996320eeea8"). InnerVolumeSpecName "kube-api-access-ckqfk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.186605 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ac3cbb7-870d-49e0-b7f2-0996320eeea8-scripts" (OuterVolumeSpecName: "scripts") pod "8ac3cbb7-870d-49e0-b7f2-0996320eeea8" (UID: "8ac3cbb7-870d-49e0-b7f2-0996320eeea8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.202983 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ac3cbb7-870d-49e0-b7f2-0996320eeea8-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "8ac3cbb7-870d-49e0-b7f2-0996320eeea8" (UID: "8ac3cbb7-870d-49e0-b7f2-0996320eeea8"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.228076 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a597841-2c16-4b79-8e39-a24ff2d90b49-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "1a597841-2c16-4b79-8e39-a24ff2d90b49" (UID: "1a597841-2c16-4b79-8e39-a24ff2d90b49"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.273063 4773 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a597841-2c16-4b79-8e39-a24ff2d90b49-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.273092 4773 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8ac3cbb7-870d-49e0-b7f2-0996320eeea8-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.273100 4773 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/8ac3cbb7-870d-49e0-b7f2-0996320eeea8-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.273110 4773 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/8ac3cbb7-870d-49e0-b7f2-0996320eeea8-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.273118 4773 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/1a597841-2c16-4b79-8e39-a24ff2d90b49-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.273126 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zs9t7\" (UniqueName: \"kubernetes.io/projected/1a597841-2c16-4b79-8e39-a24ff2d90b49-kube-api-access-zs9t7\") on node \"crc\" DevicePath \"\"" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.273134 4773 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1a597841-2c16-4b79-8e39-a24ff2d90b49-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.273142 4773 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-ckqfk\" (UniqueName: \"kubernetes.io/projected/8ac3cbb7-870d-49e0-b7f2-0996320eeea8-kube-api-access-ckqfk\") on node \"crc\" DevicePath \"\"" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.273150 4773 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/1a597841-2c16-4b79-8e39-a24ff2d90b49-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.281374 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ac3cbb7-870d-49e0-b7f2-0996320eeea8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8ac3cbb7-870d-49e0-b7f2-0996320eeea8" (UID: "8ac3cbb7-870d-49e0-b7f2-0996320eeea8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.293791 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a597841-2c16-4b79-8e39-a24ff2d90b49-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "1a597841-2c16-4b79-8e39-a24ff2d90b49" (UID: "1a597841-2c16-4b79-8e39-a24ff2d90b49"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.309558 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a597841-2c16-4b79-8e39-a24ff2d90b49-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1a597841-2c16-4b79-8e39-a24ff2d90b49" (UID: "1a597841-2c16-4b79-8e39-a24ff2d90b49"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.331350 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a597841-2c16-4b79-8e39-a24ff2d90b49-config-data" (OuterVolumeSpecName: "config-data") pod "1a597841-2c16-4b79-8e39-a24ff2d90b49" (UID: "1a597841-2c16-4b79-8e39-a24ff2d90b49"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.347117 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ac3cbb7-870d-49e0-b7f2-0996320eeea8-config-data" (OuterVolumeSpecName: "config-data") pod "8ac3cbb7-870d-49e0-b7f2-0996320eeea8" (UID: "8ac3cbb7-870d-49e0-b7f2-0996320eeea8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.375135 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8ac3cbb7-870d-49e0-b7f2-0996320eeea8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.375173 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1a597841-2c16-4b79-8e39-a24ff2d90b49-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.375183 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1a597841-2c16-4b79-8e39-a24ff2d90b49-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.375190 4773 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/1a597841-2c16-4b79-8e39-a24ff2d90b49-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 20 
19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.375199 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8ac3cbb7-870d-49e0-b7f2-0996320eeea8-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.765600 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"8ac3cbb7-870d-49e0-b7f2-0996320eeea8","Type":"ContainerDied","Data":"e3d14db8a2dd3006ff781195948b0be8cc655d85ea68cd28813883eb46dead06"} Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.765623 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.765657 4773 scope.go:117] "RemoveContainer" containerID="a8a377ee9c38ba3729bf7b56c91b6b6c2ff5e50fe5b486abb78d39d1eac11d4e" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.769517 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"1a597841-2c16-4b79-8e39-a24ff2d90b49","Type":"ContainerDied","Data":"6774a73a4d951c79a43b7e4562a94a61886a4da460da3e6bed28b7ceb8bb86b1"} Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.769600 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.802827 4773 scope.go:117] "RemoveContainer" containerID="e9b62705ca1a66cbca043ba3d1110685eab3ba1068bed7cdd0d78e68260c6d65" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.816514 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-scheduler-0"] Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.827759 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-scheduler-0"] Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.835891 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.845099 4773 scope.go:117] "RemoveContainer" containerID="d454b4e0cc1684b94a32c76b0cffae3b33d8086e5f83434f186841707ec4e2f5" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.851286 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.862992 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-scheduler-0"] Jan 20 19:21:04 crc kubenswrapper[4773]: E0120 19:21:04.863501 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a597841-2c16-4b79-8e39-a24ff2d90b49" containerName="proxy-httpd" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.863516 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a597841-2c16-4b79-8e39-a24ff2d90b49" containerName="proxy-httpd" Jan 20 19:21:04 crc kubenswrapper[4773]: E0120 19:21:04.863527 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ac3cbb7-870d-49e0-b7f2-0996320eeea8" containerName="probe" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.863533 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ac3cbb7-870d-49e0-b7f2-0996320eeea8" containerName="probe" Jan 20 19:21:04 crc kubenswrapper[4773]: E0120 19:21:04.863553 
4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ac3cbb7-870d-49e0-b7f2-0996320eeea8" containerName="manila-scheduler" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.863559 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ac3cbb7-870d-49e0-b7f2-0996320eeea8" containerName="manila-scheduler" Jan 20 19:21:04 crc kubenswrapper[4773]: E0120 19:21:04.863572 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a597841-2c16-4b79-8e39-a24ff2d90b49" containerName="sg-core" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.863578 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a597841-2c16-4b79-8e39-a24ff2d90b49" containerName="sg-core" Jan 20 19:21:04 crc kubenswrapper[4773]: E0120 19:21:04.863598 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a597841-2c16-4b79-8e39-a24ff2d90b49" containerName="ceilometer-notification-agent" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.863605 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a597841-2c16-4b79-8e39-a24ff2d90b49" containerName="ceilometer-notification-agent" Jan 20 19:21:04 crc kubenswrapper[4773]: E0120 19:21:04.863628 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a597841-2c16-4b79-8e39-a24ff2d90b49" containerName="ceilometer-central-agent" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.863634 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a597841-2c16-4b79-8e39-a24ff2d90b49" containerName="ceilometer-central-agent" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.863833 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a597841-2c16-4b79-8e39-a24ff2d90b49" containerName="ceilometer-notification-agent" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.863846 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a597841-2c16-4b79-8e39-a24ff2d90b49" containerName="sg-core" Jan 20 19:21:04 crc 
kubenswrapper[4773]: I0120 19:21:04.863862 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a597841-2c16-4b79-8e39-a24ff2d90b49" containerName="ceilometer-central-agent" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.863875 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a597841-2c16-4b79-8e39-a24ff2d90b49" containerName="proxy-httpd" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.863886 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ac3cbb7-870d-49e0-b7f2-0996320eeea8" containerName="manila-scheduler" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.863893 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ac3cbb7-870d-49e0-b7f2-0996320eeea8" containerName="probe" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.865149 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.876907 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.885436 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.889667 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.894615 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.901205 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.901465 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.901599 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.901775 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-scheduler-config-data" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.980011 4773 scope.go:117] "RemoveContainer" containerID="19c82c253df944f670efa2fc1a28cafae34dbbb9d05d10020f22864b289378b2" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.990535 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6165543b-5cc4-4a1c-bbba-ed4621838073-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6165543b-5cc4-4a1c-bbba-ed4621838073\") " pod="openstack/ceilometer-0" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.990578 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfd9s\" (UniqueName: \"kubernetes.io/projected/6165543b-5cc4-4a1c-bbba-ed4621838073-kube-api-access-dfd9s\") pod \"ceilometer-0\" (UID: \"6165543b-5cc4-4a1c-bbba-ed4621838073\") " pod="openstack/ceilometer-0" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.990601 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6165543b-5cc4-4a1c-bbba-ed4621838073-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6165543b-5cc4-4a1c-bbba-ed4621838073\") " pod="openstack/ceilometer-0" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.990638 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6165543b-5cc4-4a1c-bbba-ed4621838073-config-data\") pod \"ceilometer-0\" (UID: \"6165543b-5cc4-4a1c-bbba-ed4621838073\") " pod="openstack/ceilometer-0" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.990663 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e42acb5b-abbc-4f06-918d-2e886b50146e-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"e42acb5b-abbc-4f06-918d-2e886b50146e\") " pod="openstack/manila-scheduler-0" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.990680 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cmk5\" (UniqueName: \"kubernetes.io/projected/e42acb5b-abbc-4f06-918d-2e886b50146e-kube-api-access-5cmk5\") pod \"manila-scheduler-0\" (UID: \"e42acb5b-abbc-4f06-918d-2e886b50146e\") " pod="openstack/manila-scheduler-0" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.990716 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6165543b-5cc4-4a1c-bbba-ed4621838073-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6165543b-5cc4-4a1c-bbba-ed4621838073\") " pod="openstack/ceilometer-0" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.990741 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/6165543b-5cc4-4a1c-bbba-ed4621838073-scripts\") pod \"ceilometer-0\" (UID: \"6165543b-5cc4-4a1c-bbba-ed4621838073\") " pod="openstack/ceilometer-0" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.990759 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e42acb5b-abbc-4f06-918d-2e886b50146e-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"e42acb5b-abbc-4f06-918d-2e886b50146e\") " pod="openstack/manila-scheduler-0" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.990779 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e42acb5b-abbc-4f06-918d-2e886b50146e-scripts\") pod \"manila-scheduler-0\" (UID: \"e42acb5b-abbc-4f06-918d-2e886b50146e\") " pod="openstack/manila-scheduler-0" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.990809 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6165543b-5cc4-4a1c-bbba-ed4621838073-run-httpd\") pod \"ceilometer-0\" (UID: \"6165543b-5cc4-4a1c-bbba-ed4621838073\") " pod="openstack/ceilometer-0" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.990851 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6165543b-5cc4-4a1c-bbba-ed4621838073-log-httpd\") pod \"ceilometer-0\" (UID: \"6165543b-5cc4-4a1c-bbba-ed4621838073\") " pod="openstack/ceilometer-0" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.990869 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e42acb5b-abbc-4f06-918d-2e886b50146e-config-data\") pod \"manila-scheduler-0\" (UID: 
\"e42acb5b-abbc-4f06-918d-2e886b50146e\") " pod="openstack/manila-scheduler-0" Jan 20 19:21:04 crc kubenswrapper[4773]: I0120 19:21:04.990905 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e42acb5b-abbc-4f06-918d-2e886b50146e-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"e42acb5b-abbc-4f06-918d-2e886b50146e\") " pod="openstack/manila-scheduler-0" Jan 20 19:21:05 crc kubenswrapper[4773]: I0120 19:21:05.009702 4773 scope.go:117] "RemoveContainer" containerID="747d09b90e2ab57cad81c237ca27387dacfc078b498ae7776c8c22a03bb31eaa" Jan 20 19:21:05 crc kubenswrapper[4773]: I0120 19:21:05.031874 4773 scope.go:117] "RemoveContainer" containerID="44bfa4d5ee73c6110abe9d9ffa062115b661f9392f73893cfdf5ef024aa827b7" Jan 20 19:21:05 crc kubenswrapper[4773]: I0120 19:21:05.093089 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6165543b-5cc4-4a1c-bbba-ed4621838073-log-httpd\") pod \"ceilometer-0\" (UID: \"6165543b-5cc4-4a1c-bbba-ed4621838073\") " pod="openstack/ceilometer-0" Jan 20 19:21:05 crc kubenswrapper[4773]: I0120 19:21:05.093143 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e42acb5b-abbc-4f06-918d-2e886b50146e-config-data\") pod \"manila-scheduler-0\" (UID: \"e42acb5b-abbc-4f06-918d-2e886b50146e\") " pod="openstack/manila-scheduler-0" Jan 20 19:21:05 crc kubenswrapper[4773]: I0120 19:21:05.093193 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e42acb5b-abbc-4f06-918d-2e886b50146e-combined-ca-bundle\") pod \"manila-scheduler-0\" (UID: \"e42acb5b-abbc-4f06-918d-2e886b50146e\") " pod="openstack/manila-scheduler-0" Jan 20 19:21:05 crc kubenswrapper[4773]: I0120 19:21:05.093240 4773 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6165543b-5cc4-4a1c-bbba-ed4621838073-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6165543b-5cc4-4a1c-bbba-ed4621838073\") " pod="openstack/ceilometer-0" Jan 20 19:21:05 crc kubenswrapper[4773]: I0120 19:21:05.093275 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfd9s\" (UniqueName: \"kubernetes.io/projected/6165543b-5cc4-4a1c-bbba-ed4621838073-kube-api-access-dfd9s\") pod \"ceilometer-0\" (UID: \"6165543b-5cc4-4a1c-bbba-ed4621838073\") " pod="openstack/ceilometer-0" Jan 20 19:21:05 crc kubenswrapper[4773]: I0120 19:21:05.093308 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6165543b-5cc4-4a1c-bbba-ed4621838073-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6165543b-5cc4-4a1c-bbba-ed4621838073\") " pod="openstack/ceilometer-0" Jan 20 19:21:05 crc kubenswrapper[4773]: I0120 19:21:05.093355 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6165543b-5cc4-4a1c-bbba-ed4621838073-config-data\") pod \"ceilometer-0\" (UID: \"6165543b-5cc4-4a1c-bbba-ed4621838073\") " pod="openstack/ceilometer-0" Jan 20 19:21:05 crc kubenswrapper[4773]: I0120 19:21:05.093393 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e42acb5b-abbc-4f06-918d-2e886b50146e-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"e42acb5b-abbc-4f06-918d-2e886b50146e\") " pod="openstack/manila-scheduler-0" Jan 20 19:21:05 crc kubenswrapper[4773]: I0120 19:21:05.093418 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5cmk5\" (UniqueName: 
\"kubernetes.io/projected/e42acb5b-abbc-4f06-918d-2e886b50146e-kube-api-access-5cmk5\") pod \"manila-scheduler-0\" (UID: \"e42acb5b-abbc-4f06-918d-2e886b50146e\") " pod="openstack/manila-scheduler-0" Jan 20 19:21:05 crc kubenswrapper[4773]: I0120 19:21:05.093458 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6165543b-5cc4-4a1c-bbba-ed4621838073-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6165543b-5cc4-4a1c-bbba-ed4621838073\") " pod="openstack/ceilometer-0" Jan 20 19:21:05 crc kubenswrapper[4773]: I0120 19:21:05.093489 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6165543b-5cc4-4a1c-bbba-ed4621838073-scripts\") pod \"ceilometer-0\" (UID: \"6165543b-5cc4-4a1c-bbba-ed4621838073\") " pod="openstack/ceilometer-0" Jan 20 19:21:05 crc kubenswrapper[4773]: I0120 19:21:05.093515 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e42acb5b-abbc-4f06-918d-2e886b50146e-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"e42acb5b-abbc-4f06-918d-2e886b50146e\") " pod="openstack/manila-scheduler-0" Jan 20 19:21:05 crc kubenswrapper[4773]: I0120 19:21:05.093539 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e42acb5b-abbc-4f06-918d-2e886b50146e-scripts\") pod \"manila-scheduler-0\" (UID: \"e42acb5b-abbc-4f06-918d-2e886b50146e\") " pod="openstack/manila-scheduler-0" Jan 20 19:21:05 crc kubenswrapper[4773]: I0120 19:21:05.093568 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6165543b-5cc4-4a1c-bbba-ed4621838073-run-httpd\") pod \"ceilometer-0\" (UID: \"6165543b-5cc4-4a1c-bbba-ed4621838073\") " pod="openstack/ceilometer-0" Jan 20 19:21:05 crc 
kubenswrapper[4773]: I0120 19:21:05.094177 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6165543b-5cc4-4a1c-bbba-ed4621838073-run-httpd\") pod \"ceilometer-0\" (UID: \"6165543b-5cc4-4a1c-bbba-ed4621838073\") " pod="openstack/ceilometer-0" Jan 20 19:21:05 crc kubenswrapper[4773]: I0120 19:21:05.095402 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/6165543b-5cc4-4a1c-bbba-ed4621838073-log-httpd\") pod \"ceilometer-0\" (UID: \"6165543b-5cc4-4a1c-bbba-ed4621838073\") " pod="openstack/ceilometer-0" Jan 20 19:21:05 crc kubenswrapper[4773]: I0120 19:21:05.095554 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e42acb5b-abbc-4f06-918d-2e886b50146e-etc-machine-id\") pod \"manila-scheduler-0\" (UID: \"e42acb5b-abbc-4f06-918d-2e886b50146e\") " pod="openstack/manila-scheduler-0" Jan 20 19:21:05 crc kubenswrapper[4773]: I0120 19:21:05.099806 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e42acb5b-abbc-4f06-918d-2e886b50146e-config-data\") pod \"manila-scheduler-0\" (UID: \"e42acb5b-abbc-4f06-918d-2e886b50146e\") " pod="openstack/manila-scheduler-0" Jan 20 19:21:05 crc kubenswrapper[4773]: I0120 19:21:05.100018 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e42acb5b-abbc-4f06-918d-2e886b50146e-config-data-custom\") pod \"manila-scheduler-0\" (UID: \"e42acb5b-abbc-4f06-918d-2e886b50146e\") " pod="openstack/manila-scheduler-0" Jan 20 19:21:05 crc kubenswrapper[4773]: I0120 19:21:05.100484 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e42acb5b-abbc-4f06-918d-2e886b50146e-combined-ca-bundle\") pod 
\"manila-scheduler-0\" (UID: \"e42acb5b-abbc-4f06-918d-2e886b50146e\") " pod="openstack/manila-scheduler-0" Jan 20 19:21:05 crc kubenswrapper[4773]: I0120 19:21:05.100592 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6165543b-5cc4-4a1c-bbba-ed4621838073-config-data\") pod \"ceilometer-0\" (UID: \"6165543b-5cc4-4a1c-bbba-ed4621838073\") " pod="openstack/ceilometer-0" Jan 20 19:21:05 crc kubenswrapper[4773]: I0120 19:21:05.101162 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/6165543b-5cc4-4a1c-bbba-ed4621838073-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"6165543b-5cc4-4a1c-bbba-ed4621838073\") " pod="openstack/ceilometer-0" Jan 20 19:21:05 crc kubenswrapper[4773]: I0120 19:21:05.101553 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6165543b-5cc4-4a1c-bbba-ed4621838073-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"6165543b-5cc4-4a1c-bbba-ed4621838073\") " pod="openstack/ceilometer-0" Jan 20 19:21:05 crc kubenswrapper[4773]: I0120 19:21:05.102044 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/6165543b-5cc4-4a1c-bbba-ed4621838073-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"6165543b-5cc4-4a1c-bbba-ed4621838073\") " pod="openstack/ceilometer-0" Jan 20 19:21:05 crc kubenswrapper[4773]: I0120 19:21:05.109503 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6165543b-5cc4-4a1c-bbba-ed4621838073-scripts\") pod \"ceilometer-0\" (UID: \"6165543b-5cc4-4a1c-bbba-ed4621838073\") " pod="openstack/ceilometer-0" Jan 20 19:21:05 crc kubenswrapper[4773]: I0120 19:21:05.110328 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" 
(UniqueName: \"kubernetes.io/secret/e42acb5b-abbc-4f06-918d-2e886b50146e-scripts\") pod \"manila-scheduler-0\" (UID: \"e42acb5b-abbc-4f06-918d-2e886b50146e\") " pod="openstack/manila-scheduler-0" Jan 20 19:21:05 crc kubenswrapper[4773]: I0120 19:21:05.113389 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cmk5\" (UniqueName: \"kubernetes.io/projected/e42acb5b-abbc-4f06-918d-2e886b50146e-kube-api-access-5cmk5\") pod \"manila-scheduler-0\" (UID: \"e42acb5b-abbc-4f06-918d-2e886b50146e\") " pod="openstack/manila-scheduler-0" Jan 20 19:21:05 crc kubenswrapper[4773]: I0120 19:21:05.114275 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfd9s\" (UniqueName: \"kubernetes.io/projected/6165543b-5cc4-4a1c-bbba-ed4621838073-kube-api-access-dfd9s\") pod \"ceilometer-0\" (UID: \"6165543b-5cc4-4a1c-bbba-ed4621838073\") " pod="openstack/ceilometer-0" Jan 20 19:21:05 crc kubenswrapper[4773]: I0120 19:21:05.279194 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/manila-scheduler-0" Jan 20 19:21:05 crc kubenswrapper[4773]: I0120 19:21:05.291211 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Jan 20 19:21:05 crc kubenswrapper[4773]: I0120 19:21:05.474520 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a597841-2c16-4b79-8e39-a24ff2d90b49" path="/var/lib/kubelet/pods/1a597841-2c16-4b79-8e39-a24ff2d90b49/volumes" Jan 20 19:21:05 crc kubenswrapper[4773]: I0120 19:21:05.475510 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ac3cbb7-870d-49e0-b7f2-0996320eeea8" path="/var/lib/kubelet/pods/8ac3cbb7-870d-49e0-b7f2-0996320eeea8/volumes" Jan 20 19:21:05 crc kubenswrapper[4773]: I0120 19:21:05.734156 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-scheduler-0"] Jan 20 19:21:05 crc kubenswrapper[4773]: W0120 19:21:05.745419 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode42acb5b_abbc_4f06_918d_2e886b50146e.slice/crio-c16da8284b31947e976e8295a0500383367b8587c3d053db358bdcd0be73ac6b WatchSource:0}: Error finding container c16da8284b31947e976e8295a0500383367b8587c3d053db358bdcd0be73ac6b: Status 404 returned error can't find the container with id c16da8284b31947e976e8295a0500383367b8587c3d053db358bdcd0be73ac6b Jan 20 19:21:05 crc kubenswrapper[4773]: I0120 19:21:05.747258 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Jan 20 19:21:05 crc kubenswrapper[4773]: W0120 19:21:05.753357 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6165543b_5cc4_4a1c_bbba_ed4621838073.slice/crio-ec1f0a11b09870b70b244f7d4df712e2b689a9da9202a09c52767756210dd66f WatchSource:0}: Error finding container ec1f0a11b09870b70b244f7d4df712e2b689a9da9202a09c52767756210dd66f: Status 404 returned error can't find the container with id ec1f0a11b09870b70b244f7d4df712e2b689a9da9202a09c52767756210dd66f Jan 20 19:21:05 crc kubenswrapper[4773]: I0120 
19:21:05.779989 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6165543b-5cc4-4a1c-bbba-ed4621838073","Type":"ContainerStarted","Data":"ec1f0a11b09870b70b244f7d4df712e2b689a9da9202a09c52767756210dd66f"} Jan 20 19:21:05 crc kubenswrapper[4773]: I0120 19:21:05.782459 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"e42acb5b-abbc-4f06-918d-2e886b50146e","Type":"ContainerStarted","Data":"c16da8284b31947e976e8295a0500383367b8587c3d053db358bdcd0be73ac6b"} Jan 20 19:21:06 crc kubenswrapper[4773]: I0120 19:21:06.791922 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"e42acb5b-abbc-4f06-918d-2e886b50146e","Type":"ContainerStarted","Data":"8a0e690c09f50115173a74d7527e874019872d7d632b6061b6ba2960a4b3952d"} Jan 20 19:21:06 crc kubenswrapper[4773]: I0120 19:21:06.792533 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-scheduler-0" event={"ID":"e42acb5b-abbc-4f06-918d-2e886b50146e","Type":"ContainerStarted","Data":"17b2ab04a9549b2f0546229ffc85d19d8af608a7b04abc6aa2309dd798a369fe"} Jan 20 19:21:06 crc kubenswrapper[4773]: I0120 19:21:06.794541 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6165543b-5cc4-4a1c-bbba-ed4621838073","Type":"ContainerStarted","Data":"083383d1547e587f0b074637cbb9d6d0feba6bb4c700d5fa54a80b4301db7e7d"} Jan 20 19:21:06 crc kubenswrapper[4773]: I0120 19:21:06.828664 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-scheduler-0" podStartSLOduration=2.8286288490000002 podStartE2EDuration="2.828628849s" podCreationTimestamp="2026-01-20 19:21:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 19:21:06.814887175 +0000 UTC m=+3059.736700199" watchObservedRunningTime="2026-01-20 
19:21:06.828628849 +0000 UTC m=+3059.750441873" Jan 20 19:21:07 crc kubenswrapper[4773]: I0120 19:21:07.454973 4773 scope.go:117] "RemoveContainer" containerID="11935bc20d3c37ba36a8cfe2c5234ee70ed45d973936da23fb9892ece635d019" Jan 20 19:21:07 crc kubenswrapper[4773]: E0120 19:21:07.456632 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:21:07 crc kubenswrapper[4773]: I0120 19:21:07.814179 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6165543b-5cc4-4a1c-bbba-ed4621838073","Type":"ContainerStarted","Data":"e4077bc7c096866d5e71987f9c937b260b3efdc756e499d52886e18fefed0f74"} Jan 20 19:21:08 crc kubenswrapper[4773]: I0120 19:21:08.805452 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-share-share1-0" Jan 20 19:21:08 crc kubenswrapper[4773]: I0120 19:21:08.830910 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6165543b-5cc4-4a1c-bbba-ed4621838073","Type":"ContainerStarted","Data":"8630a3365c6d2acfb95c5cab91c85213bdc8aab39e00b97c939992e97f5d8c38"} Jan 20 19:21:08 crc kubenswrapper[4773]: I0120 19:21:08.869951 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-share-share1-0"] Jan 20 19:21:08 crc kubenswrapper[4773]: I0120 19:21:08.870404 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-share-share1-0" podUID="03fcee75-e9f9-4ff2-a932-ad3cb7395d7d" containerName="probe" containerID="cri-o://5218faf8859144732559d39cbfc5f8bf6a362ff0d193e4e56cf27aa6333c88a2" gracePeriod=30 Jan 
20 19:21:08 crc kubenswrapper[4773]: I0120 19:21:08.870302 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/manila-share-share1-0" podUID="03fcee75-e9f9-4ff2-a932-ad3cb7395d7d" containerName="manila-share" containerID="cri-o://14aacb800b101e1a1fb0b9dfaf22056f26452eb3f76d7412ae0f62db3b1dfb0b" gracePeriod=30 Jan 20 19:21:09 crc kubenswrapper[4773]: I0120 19:21:09.854983 4773 generic.go:334] "Generic (PLEG): container finished" podID="03fcee75-e9f9-4ff2-a932-ad3cb7395d7d" containerID="5218faf8859144732559d39cbfc5f8bf6a362ff0d193e4e56cf27aa6333c88a2" exitCode=0 Jan 20 19:21:09 crc kubenswrapper[4773]: I0120 19:21:09.855396 4773 generic.go:334] "Generic (PLEG): container finished" podID="03fcee75-e9f9-4ff2-a932-ad3cb7395d7d" containerID="14aacb800b101e1a1fb0b9dfaf22056f26452eb3f76d7412ae0f62db3b1dfb0b" exitCode=1 Jan 20 19:21:09 crc kubenswrapper[4773]: I0120 19:21:09.855060 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"03fcee75-e9f9-4ff2-a932-ad3cb7395d7d","Type":"ContainerDied","Data":"5218faf8859144732559d39cbfc5f8bf6a362ff0d193e4e56cf27aa6333c88a2"} Jan 20 19:21:09 crc kubenswrapper[4773]: I0120 19:21:09.855454 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"03fcee75-e9f9-4ff2-a932-ad3cb7395d7d","Type":"ContainerDied","Data":"14aacb800b101e1a1fb0b9dfaf22056f26452eb3f76d7412ae0f62db3b1dfb0b"} Jan 20 19:21:10 crc kubenswrapper[4773]: I0120 19:21:10.105655 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Jan 20 19:21:10 crc kubenswrapper[4773]: I0120 19:21:10.199030 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03fcee75-e9f9-4ff2-a932-ad3cb7395d7d-scripts\") pod \"03fcee75-e9f9-4ff2-a932-ad3cb7395d7d\" (UID: \"03fcee75-e9f9-4ff2-a932-ad3cb7395d7d\") " Jan 20 19:21:10 crc kubenswrapper[4773]: I0120 19:21:10.199079 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/03fcee75-e9f9-4ff2-a932-ad3cb7395d7d-ceph\") pod \"03fcee75-e9f9-4ff2-a932-ad3cb7395d7d\" (UID: \"03fcee75-e9f9-4ff2-a932-ad3cb7395d7d\") " Jan 20 19:21:10 crc kubenswrapper[4773]: I0120 19:21:10.199123 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gnctc\" (UniqueName: \"kubernetes.io/projected/03fcee75-e9f9-4ff2-a932-ad3cb7395d7d-kube-api-access-gnctc\") pod \"03fcee75-e9f9-4ff2-a932-ad3cb7395d7d\" (UID: \"03fcee75-e9f9-4ff2-a932-ad3cb7395d7d\") " Jan 20 19:21:10 crc kubenswrapper[4773]: I0120 19:21:10.200323 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03fcee75-e9f9-4ff2-a932-ad3cb7395d7d-config-data\") pod \"03fcee75-e9f9-4ff2-a932-ad3cb7395d7d\" (UID: \"03fcee75-e9f9-4ff2-a932-ad3cb7395d7d\") " Jan 20 19:21:10 crc kubenswrapper[4773]: I0120 19:21:10.200410 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/03fcee75-e9f9-4ff2-a932-ad3cb7395d7d-var-lib-manila\") pod \"03fcee75-e9f9-4ff2-a932-ad3cb7395d7d\" (UID: \"03fcee75-e9f9-4ff2-a932-ad3cb7395d7d\") " Jan 20 19:21:10 crc kubenswrapper[4773]: I0120 19:21:10.200485 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/03fcee75-e9f9-4ff2-a932-ad3cb7395d7d-config-data-custom\") pod \"03fcee75-e9f9-4ff2-a932-ad3cb7395d7d\" (UID: \"03fcee75-e9f9-4ff2-a932-ad3cb7395d7d\") " Jan 20 19:21:10 crc kubenswrapper[4773]: I0120 19:21:10.200582 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/03fcee75-e9f9-4ff2-a932-ad3cb7395d7d-etc-machine-id\") pod \"03fcee75-e9f9-4ff2-a932-ad3cb7395d7d\" (UID: \"03fcee75-e9f9-4ff2-a932-ad3cb7395d7d\") " Jan 20 19:21:10 crc kubenswrapper[4773]: I0120 19:21:10.200615 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03fcee75-e9f9-4ff2-a932-ad3cb7395d7d-combined-ca-bundle\") pod \"03fcee75-e9f9-4ff2-a932-ad3cb7395d7d\" (UID: \"03fcee75-e9f9-4ff2-a932-ad3cb7395d7d\") " Jan 20 19:21:10 crc kubenswrapper[4773]: I0120 19:21:10.200756 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/03fcee75-e9f9-4ff2-a932-ad3cb7395d7d-var-lib-manila" (OuterVolumeSpecName: "var-lib-manila") pod "03fcee75-e9f9-4ff2-a932-ad3cb7395d7d" (UID: "03fcee75-e9f9-4ff2-a932-ad3cb7395d7d"). InnerVolumeSpecName "var-lib-manila". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 19:21:10 crc kubenswrapper[4773]: I0120 19:21:10.201084 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/03fcee75-e9f9-4ff2-a932-ad3cb7395d7d-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "03fcee75-e9f9-4ff2-a932-ad3cb7395d7d" (UID: "03fcee75-e9f9-4ff2-a932-ad3cb7395d7d"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 19:21:10 crc kubenswrapper[4773]: I0120 19:21:10.201500 4773 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/03fcee75-e9f9-4ff2-a932-ad3cb7395d7d-etc-machine-id\") on node \"crc\" DevicePath \"\"" Jan 20 19:21:10 crc kubenswrapper[4773]: I0120 19:21:10.201526 4773 reconciler_common.go:293] "Volume detached for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/03fcee75-e9f9-4ff2-a932-ad3cb7395d7d-var-lib-manila\") on node \"crc\" DevicePath \"\"" Jan 20 19:21:10 crc kubenswrapper[4773]: I0120 19:21:10.204560 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03fcee75-e9f9-4ff2-a932-ad3cb7395d7d-ceph" (OuterVolumeSpecName: "ceph") pod "03fcee75-e9f9-4ff2-a932-ad3cb7395d7d" (UID: "03fcee75-e9f9-4ff2-a932-ad3cb7395d7d"). InnerVolumeSpecName "ceph". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 19:21:10 crc kubenswrapper[4773]: I0120 19:21:10.204626 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03fcee75-e9f9-4ff2-a932-ad3cb7395d7d-kube-api-access-gnctc" (OuterVolumeSpecName: "kube-api-access-gnctc") pod "03fcee75-e9f9-4ff2-a932-ad3cb7395d7d" (UID: "03fcee75-e9f9-4ff2-a932-ad3cb7395d7d"). InnerVolumeSpecName "kube-api-access-gnctc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 19:21:10 crc kubenswrapper[4773]: I0120 19:21:10.205513 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03fcee75-e9f9-4ff2-a932-ad3cb7395d7d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "03fcee75-e9f9-4ff2-a932-ad3cb7395d7d" (UID: "03fcee75-e9f9-4ff2-a932-ad3cb7395d7d"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:21:10 crc kubenswrapper[4773]: I0120 19:21:10.209721 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03fcee75-e9f9-4ff2-a932-ad3cb7395d7d-scripts" (OuterVolumeSpecName: "scripts") pod "03fcee75-e9f9-4ff2-a932-ad3cb7395d7d" (UID: "03fcee75-e9f9-4ff2-a932-ad3cb7395d7d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:21:10 crc kubenswrapper[4773]: I0120 19:21:10.264482 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03fcee75-e9f9-4ff2-a932-ad3cb7395d7d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "03fcee75-e9f9-4ff2-a932-ad3cb7395d7d" (UID: "03fcee75-e9f9-4ff2-a932-ad3cb7395d7d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:21:10 crc kubenswrapper[4773]: I0120 19:21:10.303033 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/03fcee75-e9f9-4ff2-a932-ad3cb7395d7d-config-data" (OuterVolumeSpecName: "config-data") pod "03fcee75-e9f9-4ff2-a932-ad3cb7395d7d" (UID: "03fcee75-e9f9-4ff2-a932-ad3cb7395d7d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:21:10 crc kubenswrapper[4773]: I0120 19:21:10.303054 4773 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/03fcee75-e9f9-4ff2-a932-ad3cb7395d7d-scripts\") on node \"crc\" DevicePath \"\"" Jan 20 19:21:10 crc kubenswrapper[4773]: I0120 19:21:10.303133 4773 reconciler_common.go:293] "Volume detached for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/03fcee75-e9f9-4ff2-a932-ad3cb7395d7d-ceph\") on node \"crc\" DevicePath \"\"" Jan 20 19:21:10 crc kubenswrapper[4773]: I0120 19:21:10.303149 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gnctc\" (UniqueName: \"kubernetes.io/projected/03fcee75-e9f9-4ff2-a932-ad3cb7395d7d-kube-api-access-gnctc\") on node \"crc\" DevicePath \"\"" Jan 20 19:21:10 crc kubenswrapper[4773]: I0120 19:21:10.303164 4773 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/03fcee75-e9f9-4ff2-a932-ad3cb7395d7d-config-data-custom\") on node \"crc\" DevicePath \"\"" Jan 20 19:21:10 crc kubenswrapper[4773]: I0120 19:21:10.303175 4773 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/03fcee75-e9f9-4ff2-a932-ad3cb7395d7d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 20 19:21:10 crc kubenswrapper[4773]: I0120 19:21:10.405604 4773 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/03fcee75-e9f9-4ff2-a932-ad3cb7395d7d-config-data\") on node \"crc\" DevicePath \"\"" Jan 20 19:21:10 crc kubenswrapper[4773]: I0120 19:21:10.865145 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Jan 20 19:21:10 crc kubenswrapper[4773]: I0120 19:21:10.865133 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"03fcee75-e9f9-4ff2-a932-ad3cb7395d7d","Type":"ContainerDied","Data":"e678d9c3a7cacb6b1935cbb52e080eed45938d042984c3bc8838c3bad5e5d7f5"} Jan 20 19:21:10 crc kubenswrapper[4773]: I0120 19:21:10.865532 4773 scope.go:117] "RemoveContainer" containerID="5218faf8859144732559d39cbfc5f8bf6a362ff0d193e4e56cf27aa6333c88a2" Jan 20 19:21:10 crc kubenswrapper[4773]: I0120 19:21:10.867676 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"6165543b-5cc4-4a1c-bbba-ed4621838073","Type":"ContainerStarted","Data":"d6f3c97bb51378b787db3a4d41057fb99d5e4aed95bfb5b21d7e5d45a727e697"} Jan 20 19:21:10 crc kubenswrapper[4773]: I0120 19:21:10.867836 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Jan 20 19:21:10 crc kubenswrapper[4773]: I0120 19:21:10.890215 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.736253908 podStartE2EDuration="6.890193439s" podCreationTimestamp="2026-01-20 19:21:04 +0000 UTC" firstStartedPulling="2026-01-20 19:21:05.755599005 +0000 UTC m=+3058.677412029" lastFinishedPulling="2026-01-20 19:21:09.909538536 +0000 UTC m=+3062.831351560" observedRunningTime="2026-01-20 19:21:10.889376668 +0000 UTC m=+3063.811189712" watchObservedRunningTime="2026-01-20 19:21:10.890193439 +0000 UTC m=+3063.812006463" Jan 20 19:21:10 crc kubenswrapper[4773]: I0120 19:21:10.893059 4773 scope.go:117] "RemoveContainer" containerID="14aacb800b101e1a1fb0b9dfaf22056f26452eb3f76d7412ae0f62db3b1dfb0b" Jan 20 19:21:10 crc kubenswrapper[4773]: I0120 19:21:10.915165 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-share-share1-0"] Jan 20 19:21:10 crc kubenswrapper[4773]: I0120 
19:21:10.924149 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-share-share1-0"] Jan 20 19:21:10 crc kubenswrapper[4773]: I0120 19:21:10.943753 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/manila-share-share1-0"] Jan 20 19:21:10 crc kubenswrapper[4773]: E0120 19:21:10.944244 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03fcee75-e9f9-4ff2-a932-ad3cb7395d7d" containerName="probe" Jan 20 19:21:10 crc kubenswrapper[4773]: I0120 19:21:10.945076 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="03fcee75-e9f9-4ff2-a932-ad3cb7395d7d" containerName="probe" Jan 20 19:21:10 crc kubenswrapper[4773]: E0120 19:21:10.945120 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03fcee75-e9f9-4ff2-a932-ad3cb7395d7d" containerName="manila-share" Jan 20 19:21:10 crc kubenswrapper[4773]: I0120 19:21:10.945146 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="03fcee75-e9f9-4ff2-a932-ad3cb7395d7d" containerName="manila-share" Jan 20 19:21:10 crc kubenswrapper[4773]: I0120 19:21:10.945388 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="03fcee75-e9f9-4ff2-a932-ad3cb7395d7d" containerName="probe" Jan 20 19:21:10 crc kubenswrapper[4773]: I0120 19:21:10.945428 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="03fcee75-e9f9-4ff2-a932-ad3cb7395d7d" containerName="manila-share" Jan 20 19:21:10 crc kubenswrapper[4773]: I0120 19:21:10.946736 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Jan 20 19:21:10 crc kubenswrapper[4773]: I0120 19:21:10.952355 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"manila-share-share1-config-data" Jan 20 19:21:10 crc kubenswrapper[4773]: I0120 19:21:10.975452 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Jan 20 19:21:11 crc kubenswrapper[4773]: I0120 19:21:11.013708 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7af9581-1520-466a-8b8f-1b957274273e-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"a7af9581-1520-466a-8b8f-1b957274273e\") " pod="openstack/manila-share-share1-0" Jan 20 19:21:11 crc kubenswrapper[4773]: I0120 19:21:11.013801 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdn68\" (UniqueName: \"kubernetes.io/projected/a7af9581-1520-466a-8b8f-1b957274273e-kube-api-access-fdn68\") pod \"manila-share-share1-0\" (UID: \"a7af9581-1520-466a-8b8f-1b957274273e\") " pod="openstack/manila-share-share1-0" Jan 20 19:21:11 crc kubenswrapper[4773]: I0120 19:21:11.013865 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7af9581-1520-466a-8b8f-1b957274273e-scripts\") pod \"manila-share-share1-0\" (UID: \"a7af9581-1520-466a-8b8f-1b957274273e\") " pod="openstack/manila-share-share1-0" Jan 20 19:21:11 crc kubenswrapper[4773]: I0120 19:21:11.013888 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7af9581-1520-466a-8b8f-1b957274273e-config-data\") pod \"manila-share-share1-0\" (UID: \"a7af9581-1520-466a-8b8f-1b957274273e\") " pod="openstack/manila-share-share1-0" Jan 20 19:21:11 crc 
kubenswrapper[4773]: I0120 19:21:11.014063 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a7af9581-1520-466a-8b8f-1b957274273e-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"a7af9581-1520-466a-8b8f-1b957274273e\") " pod="openstack/manila-share-share1-0" Jan 20 19:21:11 crc kubenswrapper[4773]: I0120 19:21:11.014276 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a7af9581-1520-466a-8b8f-1b957274273e-ceph\") pod \"manila-share-share1-0\" (UID: \"a7af9581-1520-466a-8b8f-1b957274273e\") " pod="openstack/manila-share-share1-0" Jan 20 19:21:11 crc kubenswrapper[4773]: I0120 19:21:11.014361 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a7af9581-1520-466a-8b8f-1b957274273e-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"a7af9581-1520-466a-8b8f-1b957274273e\") " pod="openstack/manila-share-share1-0" Jan 20 19:21:11 crc kubenswrapper[4773]: I0120 19:21:11.014383 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/a7af9581-1520-466a-8b8f-1b957274273e-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"a7af9581-1520-466a-8b8f-1b957274273e\") " pod="openstack/manila-share-share1-0" Jan 20 19:21:11 crc kubenswrapper[4773]: I0120 19:21:11.116186 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a7af9581-1520-466a-8b8f-1b957274273e-ceph\") pod \"manila-share-share1-0\" (UID: \"a7af9581-1520-466a-8b8f-1b957274273e\") " pod="openstack/manila-share-share1-0" Jan 20 19:21:11 crc kubenswrapper[4773]: I0120 19:21:11.116269 4773 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a7af9581-1520-466a-8b8f-1b957274273e-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"a7af9581-1520-466a-8b8f-1b957274273e\") " pod="openstack/manila-share-share1-0" Jan 20 19:21:11 crc kubenswrapper[4773]: I0120 19:21:11.116290 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-manila\" (UniqueName: \"kubernetes.io/host-path/a7af9581-1520-466a-8b8f-1b957274273e-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"a7af9581-1520-466a-8b8f-1b957274273e\") " pod="openstack/manila-share-share1-0" Jan 20 19:21:11 crc kubenswrapper[4773]: I0120 19:21:11.116324 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7af9581-1520-466a-8b8f-1b957274273e-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"a7af9581-1520-466a-8b8f-1b957274273e\") " pod="openstack/manila-share-share1-0" Jan 20 19:21:11 crc kubenswrapper[4773]: I0120 19:21:11.116373 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/a7af9581-1520-466a-8b8f-1b957274273e-etc-machine-id\") pod \"manila-share-share1-0\" (UID: \"a7af9581-1520-466a-8b8f-1b957274273e\") " pod="openstack/manila-share-share1-0" Jan 20 19:21:11 crc kubenswrapper[4773]: I0120 19:21:11.116377 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdn68\" (UniqueName: \"kubernetes.io/projected/a7af9581-1520-466a-8b8f-1b957274273e-kube-api-access-fdn68\") pod \"manila-share-share1-0\" (UID: \"a7af9581-1520-466a-8b8f-1b957274273e\") " pod="openstack/manila-share-share1-0" Jan 20 19:21:11 crc kubenswrapper[4773]: I0120 19:21:11.116495 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-manila\" (UniqueName: 
\"kubernetes.io/host-path/a7af9581-1520-466a-8b8f-1b957274273e-var-lib-manila\") pod \"manila-share-share1-0\" (UID: \"a7af9581-1520-466a-8b8f-1b957274273e\") " pod="openstack/manila-share-share1-0" Jan 20 19:21:11 crc kubenswrapper[4773]: I0120 19:21:11.116500 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7af9581-1520-466a-8b8f-1b957274273e-scripts\") pod \"manila-share-share1-0\" (UID: \"a7af9581-1520-466a-8b8f-1b957274273e\") " pod="openstack/manila-share-share1-0" Jan 20 19:21:11 crc kubenswrapper[4773]: I0120 19:21:11.116584 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7af9581-1520-466a-8b8f-1b957274273e-config-data\") pod \"manila-share-share1-0\" (UID: \"a7af9581-1520-466a-8b8f-1b957274273e\") " pod="openstack/manila-share-share1-0" Jan 20 19:21:11 crc kubenswrapper[4773]: I0120 19:21:11.116717 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a7af9581-1520-466a-8b8f-1b957274273e-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"a7af9581-1520-466a-8b8f-1b957274273e\") " pod="openstack/manila-share-share1-0" Jan 20 19:21:11 crc kubenswrapper[4773]: I0120 19:21:11.120298 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a7af9581-1520-466a-8b8f-1b957274273e-scripts\") pod \"manila-share-share1-0\" (UID: \"a7af9581-1520-466a-8b8f-1b957274273e\") " pod="openstack/manila-share-share1-0" Jan 20 19:21:11 crc kubenswrapper[4773]: I0120 19:21:11.120748 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceph\" (UniqueName: \"kubernetes.io/projected/a7af9581-1520-466a-8b8f-1b957274273e-ceph\") pod \"manila-share-share1-0\" (UID: \"a7af9581-1520-466a-8b8f-1b957274273e\") " pod="openstack/manila-share-share1-0" 
Jan 20 19:21:11 crc kubenswrapper[4773]: I0120 19:21:11.120767 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a7af9581-1520-466a-8b8f-1b957274273e-combined-ca-bundle\") pod \"manila-share-share1-0\" (UID: \"a7af9581-1520-466a-8b8f-1b957274273e\") " pod="openstack/manila-share-share1-0" Jan 20 19:21:11 crc kubenswrapper[4773]: I0120 19:21:11.120944 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/a7af9581-1520-466a-8b8f-1b957274273e-config-data-custom\") pod \"manila-share-share1-0\" (UID: \"a7af9581-1520-466a-8b8f-1b957274273e\") " pod="openstack/manila-share-share1-0" Jan 20 19:21:11 crc kubenswrapper[4773]: I0120 19:21:11.121448 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a7af9581-1520-466a-8b8f-1b957274273e-config-data\") pod \"manila-share-share1-0\" (UID: \"a7af9581-1520-466a-8b8f-1b957274273e\") " pod="openstack/manila-share-share1-0" Jan 20 19:21:11 crc kubenswrapper[4773]: I0120 19:21:11.137484 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdn68\" (UniqueName: \"kubernetes.io/projected/a7af9581-1520-466a-8b8f-1b957274273e-kube-api-access-fdn68\") pod \"manila-share-share1-0\" (UID: \"a7af9581-1520-466a-8b8f-1b957274273e\") " pod="openstack/manila-share-share1-0" Jan 20 19:21:11 crc kubenswrapper[4773]: I0120 19:21:11.270464 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/manila-share-share1-0" Jan 20 19:21:11 crc kubenswrapper[4773]: I0120 19:21:11.464871 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03fcee75-e9f9-4ff2-a932-ad3cb7395d7d" path="/var/lib/kubelet/pods/03fcee75-e9f9-4ff2-a932-ad3cb7395d7d/volumes" Jan 20 19:21:11 crc kubenswrapper[4773]: I0120 19:21:11.938329 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/manila-share-share1-0"] Jan 20 19:21:11 crc kubenswrapper[4773]: W0120 19:21:11.939241 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda7af9581_1520_466a_8b8f_1b957274273e.slice/crio-30873d0e8b43e64a4fe9fab35b7ff2527af75b8669c2e7493b0b8c59cbb7d859 WatchSource:0}: Error finding container 30873d0e8b43e64a4fe9fab35b7ff2527af75b8669c2e7493b0b8c59cbb7d859: Status 404 returned error can't find the container with id 30873d0e8b43e64a4fe9fab35b7ff2527af75b8669c2e7493b0b8c59cbb7d859 Jan 20 19:21:12 crc kubenswrapper[4773]: I0120 19:21:12.885784 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"a7af9581-1520-466a-8b8f-1b957274273e","Type":"ContainerStarted","Data":"0acff6ef5aff885d7560993655ab871f967d11c930e2fdb7b062c52768a6a2bb"} Jan 20 19:21:12 crc kubenswrapper[4773]: I0120 19:21:12.886338 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"a7af9581-1520-466a-8b8f-1b957274273e","Type":"ContainerStarted","Data":"b611c12cc3bce8f76792dd71b28081d98ac51bbfa4e690ca1057e86ed0cce8f4"} Jan 20 19:21:12 crc kubenswrapper[4773]: I0120 19:21:12.886350 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/manila-share-share1-0" event={"ID":"a7af9581-1520-466a-8b8f-1b957274273e","Type":"ContainerStarted","Data":"30873d0e8b43e64a4fe9fab35b7ff2527af75b8669c2e7493b0b8c59cbb7d859"} Jan 20 19:21:12 crc kubenswrapper[4773]: I0120 19:21:12.911330 
4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/manila-share-share1-0" podStartSLOduration=2.911312954 podStartE2EDuration="2.911312954s" podCreationTimestamp="2026-01-20 19:21:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-20 19:21:12.906700952 +0000 UTC m=+3065.828513976" watchObservedRunningTime="2026-01-20 19:21:12.911312954 +0000 UTC m=+3065.833125978" Jan 20 19:21:15 crc kubenswrapper[4773]: I0120 19:21:15.280044 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-scheduler-0" Jan 20 19:21:21 crc kubenswrapper[4773]: I0120 19:21:21.270849 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/manila-share-share1-0" Jan 20 19:21:21 crc kubenswrapper[4773]: I0120 19:21:21.447038 4773 scope.go:117] "RemoveContainer" containerID="11935bc20d3c37ba36a8cfe2c5234ee70ed45d973936da23fb9892ece635d019" Jan 20 19:21:21 crc kubenswrapper[4773]: E0120 19:21:21.447452 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:21:26 crc kubenswrapper[4773]: I0120 19:21:26.837706 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-scheduler-0" Jan 20 19:21:32 crc kubenswrapper[4773]: I0120 19:21:32.793620 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/manila-share-share1-0" Jan 20 19:21:35 crc kubenswrapper[4773]: I0120 19:21:35.299231 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack/ceilometer-0" Jan 20 19:21:36 crc kubenswrapper[4773]: I0120 19:21:36.447206 4773 scope.go:117] "RemoveContainer" containerID="11935bc20d3c37ba36a8cfe2c5234ee70ed45d973936da23fb9892ece635d019" Jan 20 19:21:36 crc kubenswrapper[4773]: E0120 19:21:36.447778 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:21:48 crc kubenswrapper[4773]: I0120 19:21:48.447238 4773 scope.go:117] "RemoveContainer" containerID="11935bc20d3c37ba36a8cfe2c5234ee70ed45d973936da23fb9892ece635d019" Jan 20 19:21:48 crc kubenswrapper[4773]: E0120 19:21:48.448098 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:22:02 crc kubenswrapper[4773]: I0120 19:22:02.447175 4773 scope.go:117] "RemoveContainer" containerID="11935bc20d3c37ba36a8cfe2c5234ee70ed45d973936da23fb9892ece635d019" Jan 20 19:22:02 crc kubenswrapper[4773]: E0120 19:22:02.448130 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:22:13 crc kubenswrapper[4773]: I0120 19:22:13.448296 4773 scope.go:117] "RemoveContainer" containerID="11935bc20d3c37ba36a8cfe2c5234ee70ed45d973936da23fb9892ece635d019" Jan 20 19:22:13 crc kubenswrapper[4773]: E0120 19:22:13.449296 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:22:13 crc kubenswrapper[4773]: E0120 19:22:13.739396 4773 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.39:33062->38.102.83.39:34695: write tcp 38.102.83.39:33062->38.102.83.39:34695: write: broken pipe Jan 20 19:22:24 crc kubenswrapper[4773]: I0120 19:22:24.447494 4773 scope.go:117] "RemoveContainer" containerID="11935bc20d3c37ba36a8cfe2c5234ee70ed45d973936da23fb9892ece635d019" Jan 20 19:22:24 crc kubenswrapper[4773]: E0120 19:22:24.448493 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:22:38 crc kubenswrapper[4773]: I0120 19:22:38.448823 4773 scope.go:117] "RemoveContainer" containerID="11935bc20d3c37ba36a8cfe2c5234ee70ed45d973936da23fb9892ece635d019" Jan 20 19:22:38 crc kubenswrapper[4773]: E0120 19:22:38.449529 4773 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:22:52 crc kubenswrapper[4773]: I0120 19:22:52.447346 4773 scope.go:117] "RemoveContainer" containerID="11935bc20d3c37ba36a8cfe2c5234ee70ed45d973936da23fb9892ece635d019" Jan 20 19:22:52 crc kubenswrapper[4773]: E0120 19:22:52.448128 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:23:03 crc kubenswrapper[4773]: I0120 19:23:03.449179 4773 scope.go:117] "RemoveContainer" containerID="11935bc20d3c37ba36a8cfe2c5234ee70ed45d973936da23fb9892ece635d019" Jan 20 19:23:03 crc kubenswrapper[4773]: E0120 19:23:03.449980 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:23:16 crc kubenswrapper[4773]: I0120 19:23:16.447480 4773 scope.go:117] "RemoveContainer" containerID="11935bc20d3c37ba36a8cfe2c5234ee70ed45d973936da23fb9892ece635d019" Jan 20 19:23:16 crc kubenswrapper[4773]: E0120 19:23:16.448174 4773 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:23:30 crc kubenswrapper[4773]: I0120 19:23:30.447433 4773 scope.go:117] "RemoveContainer" containerID="11935bc20d3c37ba36a8cfe2c5234ee70ed45d973936da23fb9892ece635d019" Jan 20 19:23:30 crc kubenswrapper[4773]: E0120 19:23:30.448640 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:23:44 crc kubenswrapper[4773]: I0120 19:23:44.447581 4773 scope.go:117] "RemoveContainer" containerID="11935bc20d3c37ba36a8cfe2c5234ee70ed45d973936da23fb9892ece635d019" Jan 20 19:23:44 crc kubenswrapper[4773]: E0120 19:23:44.448397 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:23:56 crc kubenswrapper[4773]: I0120 19:23:56.447005 4773 scope.go:117] "RemoveContainer" containerID="11935bc20d3c37ba36a8cfe2c5234ee70ed45d973936da23fb9892ece635d019" Jan 20 19:23:56 crc kubenswrapper[4773]: E0120 
19:23:56.447993 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:24:11 crc kubenswrapper[4773]: I0120 19:24:11.448377 4773 scope.go:117] "RemoveContainer" containerID="11935bc20d3c37ba36a8cfe2c5234ee70ed45d973936da23fb9892ece635d019" Jan 20 19:24:12 crc kubenswrapper[4773]: I0120 19:24:12.449634 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" event={"ID":"1ddd934f-f012-4083-b5e6-b99711071621","Type":"ContainerStarted","Data":"f6b3b5c728dd60bac57f1335c6421f37527b148c607e8336779f2aa1fd55ae02"} Jan 20 19:25:06 crc kubenswrapper[4773]: I0120 19:25:06.919890 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-rxnc7"] Jan 20 19:25:06 crc kubenswrapper[4773]: I0120 19:25:06.929235 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rxnc7" Jan 20 19:25:06 crc kubenswrapper[4773]: I0120 19:25:06.936078 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rxnc7"] Jan 20 19:25:07 crc kubenswrapper[4773]: I0120 19:25:07.034032 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/840e8c66-c1a1-423a-ab61-21697ce5f35d-catalog-content\") pod \"redhat-operators-rxnc7\" (UID: \"840e8c66-c1a1-423a-ab61-21697ce5f35d\") " pod="openshift-marketplace/redhat-operators-rxnc7" Jan 20 19:25:07 crc kubenswrapper[4773]: I0120 19:25:07.034082 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dlp4\" (UniqueName: \"kubernetes.io/projected/840e8c66-c1a1-423a-ab61-21697ce5f35d-kube-api-access-6dlp4\") pod \"redhat-operators-rxnc7\" (UID: \"840e8c66-c1a1-423a-ab61-21697ce5f35d\") " pod="openshift-marketplace/redhat-operators-rxnc7" Jan 20 19:25:07 crc kubenswrapper[4773]: I0120 19:25:07.034132 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/840e8c66-c1a1-423a-ab61-21697ce5f35d-utilities\") pod \"redhat-operators-rxnc7\" (UID: \"840e8c66-c1a1-423a-ab61-21697ce5f35d\") " pod="openshift-marketplace/redhat-operators-rxnc7" Jan 20 19:25:07 crc kubenswrapper[4773]: I0120 19:25:07.137260 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/840e8c66-c1a1-423a-ab61-21697ce5f35d-catalog-content\") pod \"redhat-operators-rxnc7\" (UID: \"840e8c66-c1a1-423a-ab61-21697ce5f35d\") " pod="openshift-marketplace/redhat-operators-rxnc7" Jan 20 19:25:07 crc kubenswrapper[4773]: I0120 19:25:07.137389 4773 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"kube-api-access-6dlp4\" (UniqueName: \"kubernetes.io/projected/840e8c66-c1a1-423a-ab61-21697ce5f35d-kube-api-access-6dlp4\") pod \"redhat-operators-rxnc7\" (UID: \"840e8c66-c1a1-423a-ab61-21697ce5f35d\") " pod="openshift-marketplace/redhat-operators-rxnc7" Jan 20 19:25:07 crc kubenswrapper[4773]: I0120 19:25:07.137481 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/840e8c66-c1a1-423a-ab61-21697ce5f35d-utilities\") pod \"redhat-operators-rxnc7\" (UID: \"840e8c66-c1a1-423a-ab61-21697ce5f35d\") " pod="openshift-marketplace/redhat-operators-rxnc7" Jan 20 19:25:07 crc kubenswrapper[4773]: I0120 19:25:07.137769 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/840e8c66-c1a1-423a-ab61-21697ce5f35d-catalog-content\") pod \"redhat-operators-rxnc7\" (UID: \"840e8c66-c1a1-423a-ab61-21697ce5f35d\") " pod="openshift-marketplace/redhat-operators-rxnc7" Jan 20 19:25:07 crc kubenswrapper[4773]: I0120 19:25:07.138290 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/840e8c66-c1a1-423a-ab61-21697ce5f35d-utilities\") pod \"redhat-operators-rxnc7\" (UID: \"840e8c66-c1a1-423a-ab61-21697ce5f35d\") " pod="openshift-marketplace/redhat-operators-rxnc7" Jan 20 19:25:07 crc kubenswrapper[4773]: I0120 19:25:07.158951 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dlp4\" (UniqueName: \"kubernetes.io/projected/840e8c66-c1a1-423a-ab61-21697ce5f35d-kube-api-access-6dlp4\") pod \"redhat-operators-rxnc7\" (UID: \"840e8c66-c1a1-423a-ab61-21697ce5f35d\") " pod="openshift-marketplace/redhat-operators-rxnc7" Jan 20 19:25:07 crc kubenswrapper[4773]: I0120 19:25:07.267028 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rxnc7" Jan 20 19:25:07 crc kubenswrapper[4773]: I0120 19:25:07.798429 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rxnc7"] Jan 20 19:25:07 crc kubenswrapper[4773]: I0120 19:25:07.937945 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rxnc7" event={"ID":"840e8c66-c1a1-423a-ab61-21697ce5f35d","Type":"ContainerStarted","Data":"885ff945d26cc2c0aa231b3527368210271a137dbd07e6bd6c660b02a19cd240"} Jan 20 19:25:08 crc kubenswrapper[4773]: I0120 19:25:08.950461 4773 generic.go:334] "Generic (PLEG): container finished" podID="840e8c66-c1a1-423a-ab61-21697ce5f35d" containerID="8b445389300fb6e6f6eaf6e9f1889a8f674e09be99a83d1e3b3331f2afd85e53" exitCode=0 Jan 20 19:25:08 crc kubenswrapper[4773]: I0120 19:25:08.950509 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rxnc7" event={"ID":"840e8c66-c1a1-423a-ab61-21697ce5f35d","Type":"ContainerDied","Data":"8b445389300fb6e6f6eaf6e9f1889a8f674e09be99a83d1e3b3331f2afd85e53"} Jan 20 19:25:10 crc kubenswrapper[4773]: I0120 19:25:10.984467 4773 generic.go:334] "Generic (PLEG): container finished" podID="840e8c66-c1a1-423a-ab61-21697ce5f35d" containerID="b119796d14651dc99fb6a589eb578dd045c0accf3b41bb46948211bef1247dac" exitCode=0 Jan 20 19:25:10 crc kubenswrapper[4773]: I0120 19:25:10.984530 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rxnc7" event={"ID":"840e8c66-c1a1-423a-ab61-21697ce5f35d","Type":"ContainerDied","Data":"b119796d14651dc99fb6a589eb578dd045c0accf3b41bb46948211bef1247dac"} Jan 20 19:25:13 crc kubenswrapper[4773]: I0120 19:25:13.003009 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rxnc7" 
event={"ID":"840e8c66-c1a1-423a-ab61-21697ce5f35d","Type":"ContainerStarted","Data":"c17ecdb331c24693127e30d7411d2d873a336926f6fe25c8242b5f2f3391a747"} Jan 20 19:25:13 crc kubenswrapper[4773]: I0120 19:25:13.026536 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-rxnc7" podStartSLOduration=4.168866828 podStartE2EDuration="7.026514266s" podCreationTimestamp="2026-01-20 19:25:06 +0000 UTC" firstStartedPulling="2026-01-20 19:25:08.953979269 +0000 UTC m=+3301.875792293" lastFinishedPulling="2026-01-20 19:25:11.811626697 +0000 UTC m=+3304.733439731" observedRunningTime="2026-01-20 19:25:13.020797386 +0000 UTC m=+3305.942610410" watchObservedRunningTime="2026-01-20 19:25:13.026514266 +0000 UTC m=+3305.948327290" Jan 20 19:25:17 crc kubenswrapper[4773]: I0120 19:25:17.269196 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-rxnc7" Jan 20 19:25:17 crc kubenswrapper[4773]: I0120 19:25:17.269816 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-rxnc7" Jan 20 19:25:17 crc kubenswrapper[4773]: I0120 19:25:17.320479 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-rxnc7" Jan 20 19:25:18 crc kubenswrapper[4773]: I0120 19:25:18.089318 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-rxnc7" Jan 20 19:25:18 crc kubenswrapper[4773]: I0120 19:25:18.140245 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rxnc7"] Jan 20 19:25:20 crc kubenswrapper[4773]: I0120 19:25:20.060018 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-rxnc7" podUID="840e8c66-c1a1-423a-ab61-21697ce5f35d" containerName="registry-server" 
containerID="cri-o://c17ecdb331c24693127e30d7411d2d873a336926f6fe25c8242b5f2f3391a747" gracePeriod=2 Jan 20 19:25:20 crc kubenswrapper[4773]: I0120 19:25:20.494415 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rxnc7" Jan 20 19:25:20 crc kubenswrapper[4773]: I0120 19:25:20.614917 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/840e8c66-c1a1-423a-ab61-21697ce5f35d-utilities\") pod \"840e8c66-c1a1-423a-ab61-21697ce5f35d\" (UID: \"840e8c66-c1a1-423a-ab61-21697ce5f35d\") " Jan 20 19:25:20 crc kubenswrapper[4773]: I0120 19:25:20.615067 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6dlp4\" (UniqueName: \"kubernetes.io/projected/840e8c66-c1a1-423a-ab61-21697ce5f35d-kube-api-access-6dlp4\") pod \"840e8c66-c1a1-423a-ab61-21697ce5f35d\" (UID: \"840e8c66-c1a1-423a-ab61-21697ce5f35d\") " Jan 20 19:25:20 crc kubenswrapper[4773]: I0120 19:25:20.615153 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/840e8c66-c1a1-423a-ab61-21697ce5f35d-catalog-content\") pod \"840e8c66-c1a1-423a-ab61-21697ce5f35d\" (UID: \"840e8c66-c1a1-423a-ab61-21697ce5f35d\") " Jan 20 19:25:20 crc kubenswrapper[4773]: I0120 19:25:20.615777 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/840e8c66-c1a1-423a-ab61-21697ce5f35d-utilities" (OuterVolumeSpecName: "utilities") pod "840e8c66-c1a1-423a-ab61-21697ce5f35d" (UID: "840e8c66-c1a1-423a-ab61-21697ce5f35d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 19:25:20 crc kubenswrapper[4773]: I0120 19:25:20.627085 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/840e8c66-c1a1-423a-ab61-21697ce5f35d-kube-api-access-6dlp4" (OuterVolumeSpecName: "kube-api-access-6dlp4") pod "840e8c66-c1a1-423a-ab61-21697ce5f35d" (UID: "840e8c66-c1a1-423a-ab61-21697ce5f35d"). InnerVolumeSpecName "kube-api-access-6dlp4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 19:25:20 crc kubenswrapper[4773]: I0120 19:25:20.718068 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/840e8c66-c1a1-423a-ab61-21697ce5f35d-utilities\") on node \"crc\" DevicePath \"\"" Jan 20 19:25:20 crc kubenswrapper[4773]: I0120 19:25:20.718105 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6dlp4\" (UniqueName: \"kubernetes.io/projected/840e8c66-c1a1-423a-ab61-21697ce5f35d-kube-api-access-6dlp4\") on node \"crc\" DevicePath \"\"" Jan 20 19:25:20 crc kubenswrapper[4773]: I0120 19:25:20.726709 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/840e8c66-c1a1-423a-ab61-21697ce5f35d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "840e8c66-c1a1-423a-ab61-21697ce5f35d" (UID: "840e8c66-c1a1-423a-ab61-21697ce5f35d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 19:25:20 crc kubenswrapper[4773]: I0120 19:25:20.820124 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/840e8c66-c1a1-423a-ab61-21697ce5f35d-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 20 19:25:21 crc kubenswrapper[4773]: I0120 19:25:21.080500 4773 generic.go:334] "Generic (PLEG): container finished" podID="840e8c66-c1a1-423a-ab61-21697ce5f35d" containerID="c17ecdb331c24693127e30d7411d2d873a336926f6fe25c8242b5f2f3391a747" exitCode=0 Jan 20 19:25:21 crc kubenswrapper[4773]: I0120 19:25:21.080820 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rxnc7" event={"ID":"840e8c66-c1a1-423a-ab61-21697ce5f35d","Type":"ContainerDied","Data":"c17ecdb331c24693127e30d7411d2d873a336926f6fe25c8242b5f2f3391a747"} Jan 20 19:25:21 crc kubenswrapper[4773]: I0120 19:25:21.080851 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rxnc7" event={"ID":"840e8c66-c1a1-423a-ab61-21697ce5f35d","Type":"ContainerDied","Data":"885ff945d26cc2c0aa231b3527368210271a137dbd07e6bd6c660b02a19cd240"} Jan 20 19:25:21 crc kubenswrapper[4773]: I0120 19:25:21.080872 4773 scope.go:117] "RemoveContainer" containerID="c17ecdb331c24693127e30d7411d2d873a336926f6fe25c8242b5f2f3391a747" Jan 20 19:25:21 crc kubenswrapper[4773]: I0120 19:25:21.081059 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rxnc7" Jan 20 19:25:21 crc kubenswrapper[4773]: I0120 19:25:21.113223 4773 scope.go:117] "RemoveContainer" containerID="b119796d14651dc99fb6a589eb578dd045c0accf3b41bb46948211bef1247dac" Jan 20 19:25:21 crc kubenswrapper[4773]: I0120 19:25:21.123081 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rxnc7"] Jan 20 19:25:21 crc kubenswrapper[4773]: I0120 19:25:21.130652 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-rxnc7"] Jan 20 19:25:21 crc kubenswrapper[4773]: I0120 19:25:21.154045 4773 scope.go:117] "RemoveContainer" containerID="8b445389300fb6e6f6eaf6e9f1889a8f674e09be99a83d1e3b3331f2afd85e53" Jan 20 19:25:21 crc kubenswrapper[4773]: I0120 19:25:21.189492 4773 scope.go:117] "RemoveContainer" containerID="c17ecdb331c24693127e30d7411d2d873a336926f6fe25c8242b5f2f3391a747" Jan 20 19:25:21 crc kubenswrapper[4773]: E0120 19:25:21.189911 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c17ecdb331c24693127e30d7411d2d873a336926f6fe25c8242b5f2f3391a747\": container with ID starting with c17ecdb331c24693127e30d7411d2d873a336926f6fe25c8242b5f2f3391a747 not found: ID does not exist" containerID="c17ecdb331c24693127e30d7411d2d873a336926f6fe25c8242b5f2f3391a747" Jan 20 19:25:21 crc kubenswrapper[4773]: I0120 19:25:21.190182 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c17ecdb331c24693127e30d7411d2d873a336926f6fe25c8242b5f2f3391a747"} err="failed to get container status \"c17ecdb331c24693127e30d7411d2d873a336926f6fe25c8242b5f2f3391a747\": rpc error: code = NotFound desc = could not find container \"c17ecdb331c24693127e30d7411d2d873a336926f6fe25c8242b5f2f3391a747\": container with ID starting with c17ecdb331c24693127e30d7411d2d873a336926f6fe25c8242b5f2f3391a747 not found: ID does 
not exist" Jan 20 19:25:21 crc kubenswrapper[4773]: I0120 19:25:21.190213 4773 scope.go:117] "RemoveContainer" containerID="b119796d14651dc99fb6a589eb578dd045c0accf3b41bb46948211bef1247dac" Jan 20 19:25:21 crc kubenswrapper[4773]: E0120 19:25:21.190465 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b119796d14651dc99fb6a589eb578dd045c0accf3b41bb46948211bef1247dac\": container with ID starting with b119796d14651dc99fb6a589eb578dd045c0accf3b41bb46948211bef1247dac not found: ID does not exist" containerID="b119796d14651dc99fb6a589eb578dd045c0accf3b41bb46948211bef1247dac" Jan 20 19:25:21 crc kubenswrapper[4773]: I0120 19:25:21.190488 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b119796d14651dc99fb6a589eb578dd045c0accf3b41bb46948211bef1247dac"} err="failed to get container status \"b119796d14651dc99fb6a589eb578dd045c0accf3b41bb46948211bef1247dac\": rpc error: code = NotFound desc = could not find container \"b119796d14651dc99fb6a589eb578dd045c0accf3b41bb46948211bef1247dac\": container with ID starting with b119796d14651dc99fb6a589eb578dd045c0accf3b41bb46948211bef1247dac not found: ID does not exist" Jan 20 19:25:21 crc kubenswrapper[4773]: I0120 19:25:21.190502 4773 scope.go:117] "RemoveContainer" containerID="8b445389300fb6e6f6eaf6e9f1889a8f674e09be99a83d1e3b3331f2afd85e53" Jan 20 19:25:21 crc kubenswrapper[4773]: E0120 19:25:21.191461 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b445389300fb6e6f6eaf6e9f1889a8f674e09be99a83d1e3b3331f2afd85e53\": container with ID starting with 8b445389300fb6e6f6eaf6e9f1889a8f674e09be99a83d1e3b3331f2afd85e53 not found: ID does not exist" containerID="8b445389300fb6e6f6eaf6e9f1889a8f674e09be99a83d1e3b3331f2afd85e53" Jan 20 19:25:21 crc kubenswrapper[4773]: I0120 19:25:21.191525 4773 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b445389300fb6e6f6eaf6e9f1889a8f674e09be99a83d1e3b3331f2afd85e53"} err="failed to get container status \"8b445389300fb6e6f6eaf6e9f1889a8f674e09be99a83d1e3b3331f2afd85e53\": rpc error: code = NotFound desc = could not find container \"8b445389300fb6e6f6eaf6e9f1889a8f674e09be99a83d1e3b3331f2afd85e53\": container with ID starting with 8b445389300fb6e6f6eaf6e9f1889a8f674e09be99a83d1e3b3331f2afd85e53 not found: ID does not exist" Jan 20 19:25:21 crc kubenswrapper[4773]: I0120 19:25:21.462630 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="840e8c66-c1a1-423a-ab61-21697ce5f35d" path="/var/lib/kubelet/pods/840e8c66-c1a1-423a-ab61-21697ce5f35d/volumes" Jan 20 19:26:18 crc kubenswrapper[4773]: I0120 19:26:18.782260 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-wp4jg"] Jan 20 19:26:18 crc kubenswrapper[4773]: E0120 19:26:18.783301 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="840e8c66-c1a1-423a-ab61-21697ce5f35d" containerName="extract-content" Jan 20 19:26:18 crc kubenswrapper[4773]: I0120 19:26:18.783318 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="840e8c66-c1a1-423a-ab61-21697ce5f35d" containerName="extract-content" Jan 20 19:26:18 crc kubenswrapper[4773]: E0120 19:26:18.783345 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="840e8c66-c1a1-423a-ab61-21697ce5f35d" containerName="extract-utilities" Jan 20 19:26:18 crc kubenswrapper[4773]: I0120 19:26:18.783355 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="840e8c66-c1a1-423a-ab61-21697ce5f35d" containerName="extract-utilities" Jan 20 19:26:18 crc kubenswrapper[4773]: E0120 19:26:18.783373 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="840e8c66-c1a1-423a-ab61-21697ce5f35d" containerName="registry-server" Jan 20 19:26:18 crc kubenswrapper[4773]: I0120 19:26:18.783383 4773 
state_mem.go:107] "Deleted CPUSet assignment" podUID="840e8c66-c1a1-423a-ab61-21697ce5f35d" containerName="registry-server" Jan 20 19:26:18 crc kubenswrapper[4773]: I0120 19:26:18.783656 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="840e8c66-c1a1-423a-ab61-21697ce5f35d" containerName="registry-server" Jan 20 19:26:18 crc kubenswrapper[4773]: I0120 19:26:18.785238 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wp4jg" Jan 20 19:26:18 crc kubenswrapper[4773]: I0120 19:26:18.792877 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wp4jg"] Jan 20 19:26:18 crc kubenswrapper[4773]: I0120 19:26:18.945339 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtvsq\" (UniqueName: \"kubernetes.io/projected/b2ea5cf0-e620-4fc2-a3e7-a1efd5af40c2-kube-api-access-mtvsq\") pod \"redhat-marketplace-wp4jg\" (UID: \"b2ea5cf0-e620-4fc2-a3e7-a1efd5af40c2\") " pod="openshift-marketplace/redhat-marketplace-wp4jg" Jan 20 19:26:18 crc kubenswrapper[4773]: I0120 19:26:18.945584 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2ea5cf0-e620-4fc2-a3e7-a1efd5af40c2-utilities\") pod \"redhat-marketplace-wp4jg\" (UID: \"b2ea5cf0-e620-4fc2-a3e7-a1efd5af40c2\") " pod="openshift-marketplace/redhat-marketplace-wp4jg" Jan 20 19:26:18 crc kubenswrapper[4773]: I0120 19:26:18.945748 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2ea5cf0-e620-4fc2-a3e7-a1efd5af40c2-catalog-content\") pod \"redhat-marketplace-wp4jg\" (UID: \"b2ea5cf0-e620-4fc2-a3e7-a1efd5af40c2\") " pod="openshift-marketplace/redhat-marketplace-wp4jg" Jan 20 19:26:19 crc kubenswrapper[4773]: I0120 19:26:19.048233 
4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtvsq\" (UniqueName: \"kubernetes.io/projected/b2ea5cf0-e620-4fc2-a3e7-a1efd5af40c2-kube-api-access-mtvsq\") pod \"redhat-marketplace-wp4jg\" (UID: \"b2ea5cf0-e620-4fc2-a3e7-a1efd5af40c2\") " pod="openshift-marketplace/redhat-marketplace-wp4jg" Jan 20 19:26:19 crc kubenswrapper[4773]: I0120 19:26:19.048333 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2ea5cf0-e620-4fc2-a3e7-a1efd5af40c2-utilities\") pod \"redhat-marketplace-wp4jg\" (UID: \"b2ea5cf0-e620-4fc2-a3e7-a1efd5af40c2\") " pod="openshift-marketplace/redhat-marketplace-wp4jg" Jan 20 19:26:19 crc kubenswrapper[4773]: I0120 19:26:19.048394 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2ea5cf0-e620-4fc2-a3e7-a1efd5af40c2-catalog-content\") pod \"redhat-marketplace-wp4jg\" (UID: \"b2ea5cf0-e620-4fc2-a3e7-a1efd5af40c2\") " pod="openshift-marketplace/redhat-marketplace-wp4jg" Jan 20 19:26:19 crc kubenswrapper[4773]: I0120 19:26:19.048888 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2ea5cf0-e620-4fc2-a3e7-a1efd5af40c2-utilities\") pod \"redhat-marketplace-wp4jg\" (UID: \"b2ea5cf0-e620-4fc2-a3e7-a1efd5af40c2\") " pod="openshift-marketplace/redhat-marketplace-wp4jg" Jan 20 19:26:19 crc kubenswrapper[4773]: I0120 19:26:19.048907 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2ea5cf0-e620-4fc2-a3e7-a1efd5af40c2-catalog-content\") pod \"redhat-marketplace-wp4jg\" (UID: \"b2ea5cf0-e620-4fc2-a3e7-a1efd5af40c2\") " pod="openshift-marketplace/redhat-marketplace-wp4jg" Jan 20 19:26:19 crc kubenswrapper[4773]: I0120 19:26:19.069288 4773 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-mtvsq\" (UniqueName: \"kubernetes.io/projected/b2ea5cf0-e620-4fc2-a3e7-a1efd5af40c2-kube-api-access-mtvsq\") pod \"redhat-marketplace-wp4jg\" (UID: \"b2ea5cf0-e620-4fc2-a3e7-a1efd5af40c2\") " pod="openshift-marketplace/redhat-marketplace-wp4jg" Jan 20 19:26:19 crc kubenswrapper[4773]: I0120 19:26:19.116525 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wp4jg" Jan 20 19:26:19 crc kubenswrapper[4773]: I0120 19:26:19.599354 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wp4jg"] Jan 20 19:26:20 crc kubenswrapper[4773]: I0120 19:26:20.577153 4773 generic.go:334] "Generic (PLEG): container finished" podID="b2ea5cf0-e620-4fc2-a3e7-a1efd5af40c2" containerID="4658ca605a30ceb57bcd235ddfeb6bcf121a777170e7fe82beb0264d99e3a763" exitCode=0 Jan 20 19:26:20 crc kubenswrapper[4773]: I0120 19:26:20.577243 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wp4jg" event={"ID":"b2ea5cf0-e620-4fc2-a3e7-a1efd5af40c2","Type":"ContainerDied","Data":"4658ca605a30ceb57bcd235ddfeb6bcf121a777170e7fe82beb0264d99e3a763"} Jan 20 19:26:20 crc kubenswrapper[4773]: I0120 19:26:20.577463 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wp4jg" event={"ID":"b2ea5cf0-e620-4fc2-a3e7-a1efd5af40c2","Type":"ContainerStarted","Data":"a6a22ad3e980bdfc6cc2b08d94bf77888d412aa3bd57fac6d02fb9d83ab48b57"} Jan 20 19:26:20 crc kubenswrapper[4773]: I0120 19:26:20.578978 4773 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 20 19:26:22 crc kubenswrapper[4773]: I0120 19:26:22.597538 4773 generic.go:334] "Generic (PLEG): container finished" podID="b2ea5cf0-e620-4fc2-a3e7-a1efd5af40c2" containerID="c0cbd1c89aebb12b144202dda59867410c9eaa93e2a817823537ebbdade0ac23" exitCode=0 Jan 20 19:26:22 crc 
kubenswrapper[4773]: I0120 19:26:22.597766 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wp4jg" event={"ID":"b2ea5cf0-e620-4fc2-a3e7-a1efd5af40c2","Type":"ContainerDied","Data":"c0cbd1c89aebb12b144202dda59867410c9eaa93e2a817823537ebbdade0ac23"} Jan 20 19:26:23 crc kubenswrapper[4773]: I0120 19:26:23.611313 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wp4jg" event={"ID":"b2ea5cf0-e620-4fc2-a3e7-a1efd5af40c2","Type":"ContainerStarted","Data":"c8314f3c93b3b45f398fb80c123510da293689d282caf870378f4cc094361470"} Jan 20 19:26:23 crc kubenswrapper[4773]: I0120 19:26:23.632861 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-wp4jg" podStartSLOduration=3.228896666 podStartE2EDuration="5.632842673s" podCreationTimestamp="2026-01-20 19:26:18 +0000 UTC" firstStartedPulling="2026-01-20 19:26:20.57871681 +0000 UTC m=+3373.500529844" lastFinishedPulling="2026-01-20 19:26:22.982662827 +0000 UTC m=+3375.904475851" observedRunningTime="2026-01-20 19:26:23.626760474 +0000 UTC m=+3376.548573498" watchObservedRunningTime="2026-01-20 19:26:23.632842673 +0000 UTC m=+3376.554655697" Jan 20 19:26:28 crc kubenswrapper[4773]: I0120 19:26:28.169706 4773 patch_prober.go:28] interesting pod/machine-config-daemon-sq4x7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 19:26:28 crc kubenswrapper[4773]: I0120 19:26:28.170279 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Jan 20 19:26:29 crc kubenswrapper[4773]: I0120 19:26:29.116673 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-wp4jg" Jan 20 19:26:29 crc kubenswrapper[4773]: I0120 19:26:29.117030 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-wp4jg" Jan 20 19:26:29 crc kubenswrapper[4773]: I0120 19:26:29.176555 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-wp4jg" Jan 20 19:26:29 crc kubenswrapper[4773]: I0120 19:26:29.712043 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-wp4jg" Jan 20 19:26:29 crc kubenswrapper[4773]: I0120 19:26:29.758445 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wp4jg"] Jan 20 19:26:31 crc kubenswrapper[4773]: I0120 19:26:31.685201 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-wp4jg" podUID="b2ea5cf0-e620-4fc2-a3e7-a1efd5af40c2" containerName="registry-server" containerID="cri-o://c8314f3c93b3b45f398fb80c123510da293689d282caf870378f4cc094361470" gracePeriod=2 Jan 20 19:26:32 crc kubenswrapper[4773]: I0120 19:26:32.176568 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wp4jg" Jan 20 19:26:32 crc kubenswrapper[4773]: I0120 19:26:32.295141 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mtvsq\" (UniqueName: \"kubernetes.io/projected/b2ea5cf0-e620-4fc2-a3e7-a1efd5af40c2-kube-api-access-mtvsq\") pod \"b2ea5cf0-e620-4fc2-a3e7-a1efd5af40c2\" (UID: \"b2ea5cf0-e620-4fc2-a3e7-a1efd5af40c2\") " Jan 20 19:26:32 crc kubenswrapper[4773]: I0120 19:26:32.295317 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2ea5cf0-e620-4fc2-a3e7-a1efd5af40c2-utilities\") pod \"b2ea5cf0-e620-4fc2-a3e7-a1efd5af40c2\" (UID: \"b2ea5cf0-e620-4fc2-a3e7-a1efd5af40c2\") " Jan 20 19:26:32 crc kubenswrapper[4773]: I0120 19:26:32.295370 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2ea5cf0-e620-4fc2-a3e7-a1efd5af40c2-catalog-content\") pod \"b2ea5cf0-e620-4fc2-a3e7-a1efd5af40c2\" (UID: \"b2ea5cf0-e620-4fc2-a3e7-a1efd5af40c2\") " Jan 20 19:26:32 crc kubenswrapper[4773]: I0120 19:26:32.296162 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2ea5cf0-e620-4fc2-a3e7-a1efd5af40c2-utilities" (OuterVolumeSpecName: "utilities") pod "b2ea5cf0-e620-4fc2-a3e7-a1efd5af40c2" (UID: "b2ea5cf0-e620-4fc2-a3e7-a1efd5af40c2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 19:26:32 crc kubenswrapper[4773]: I0120 19:26:32.300200 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2ea5cf0-e620-4fc2-a3e7-a1efd5af40c2-kube-api-access-mtvsq" (OuterVolumeSpecName: "kube-api-access-mtvsq") pod "b2ea5cf0-e620-4fc2-a3e7-a1efd5af40c2" (UID: "b2ea5cf0-e620-4fc2-a3e7-a1efd5af40c2"). InnerVolumeSpecName "kube-api-access-mtvsq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 19:26:32 crc kubenswrapper[4773]: I0120 19:26:32.323446 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2ea5cf0-e620-4fc2-a3e7-a1efd5af40c2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b2ea5cf0-e620-4fc2-a3e7-a1efd5af40c2" (UID: "b2ea5cf0-e620-4fc2-a3e7-a1efd5af40c2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 19:26:32 crc kubenswrapper[4773]: I0120 19:26:32.397780 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2ea5cf0-e620-4fc2-a3e7-a1efd5af40c2-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 20 19:26:32 crc kubenswrapper[4773]: I0120 19:26:32.397822 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mtvsq\" (UniqueName: \"kubernetes.io/projected/b2ea5cf0-e620-4fc2-a3e7-a1efd5af40c2-kube-api-access-mtvsq\") on node \"crc\" DevicePath \"\"" Jan 20 19:26:32 crc kubenswrapper[4773]: I0120 19:26:32.397836 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2ea5cf0-e620-4fc2-a3e7-a1efd5af40c2-utilities\") on node \"crc\" DevicePath \"\"" Jan 20 19:26:32 crc kubenswrapper[4773]: I0120 19:26:32.696790 4773 generic.go:334] "Generic (PLEG): container finished" podID="b2ea5cf0-e620-4fc2-a3e7-a1efd5af40c2" containerID="c8314f3c93b3b45f398fb80c123510da293689d282caf870378f4cc094361470" exitCode=0 Jan 20 19:26:32 crc kubenswrapper[4773]: I0120 19:26:32.696882 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wp4jg" Jan 20 19:26:32 crc kubenswrapper[4773]: I0120 19:26:32.696887 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wp4jg" event={"ID":"b2ea5cf0-e620-4fc2-a3e7-a1efd5af40c2","Type":"ContainerDied","Data":"c8314f3c93b3b45f398fb80c123510da293689d282caf870378f4cc094361470"} Jan 20 19:26:32 crc kubenswrapper[4773]: I0120 19:26:32.698023 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wp4jg" event={"ID":"b2ea5cf0-e620-4fc2-a3e7-a1efd5af40c2","Type":"ContainerDied","Data":"a6a22ad3e980bdfc6cc2b08d94bf77888d412aa3bd57fac6d02fb9d83ab48b57"} Jan 20 19:26:32 crc kubenswrapper[4773]: I0120 19:26:32.698070 4773 scope.go:117] "RemoveContainer" containerID="c8314f3c93b3b45f398fb80c123510da293689d282caf870378f4cc094361470" Jan 20 19:26:32 crc kubenswrapper[4773]: I0120 19:26:32.738236 4773 scope.go:117] "RemoveContainer" containerID="c0cbd1c89aebb12b144202dda59867410c9eaa93e2a817823537ebbdade0ac23" Jan 20 19:26:32 crc kubenswrapper[4773]: I0120 19:26:32.768903 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wp4jg"] Jan 20 19:26:32 crc kubenswrapper[4773]: I0120 19:26:32.777817 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-wp4jg"] Jan 20 19:26:32 crc kubenswrapper[4773]: I0120 19:26:32.784224 4773 scope.go:117] "RemoveContainer" containerID="4658ca605a30ceb57bcd235ddfeb6bcf121a777170e7fe82beb0264d99e3a763" Jan 20 19:26:32 crc kubenswrapper[4773]: I0120 19:26:32.817048 4773 scope.go:117] "RemoveContainer" containerID="c8314f3c93b3b45f398fb80c123510da293689d282caf870378f4cc094361470" Jan 20 19:26:32 crc kubenswrapper[4773]: E0120 19:26:32.817543 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"c8314f3c93b3b45f398fb80c123510da293689d282caf870378f4cc094361470\": container with ID starting with c8314f3c93b3b45f398fb80c123510da293689d282caf870378f4cc094361470 not found: ID does not exist" containerID="c8314f3c93b3b45f398fb80c123510da293689d282caf870378f4cc094361470" Jan 20 19:26:32 crc kubenswrapper[4773]: I0120 19:26:32.817577 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8314f3c93b3b45f398fb80c123510da293689d282caf870378f4cc094361470"} err="failed to get container status \"c8314f3c93b3b45f398fb80c123510da293689d282caf870378f4cc094361470\": rpc error: code = NotFound desc = could not find container \"c8314f3c93b3b45f398fb80c123510da293689d282caf870378f4cc094361470\": container with ID starting with c8314f3c93b3b45f398fb80c123510da293689d282caf870378f4cc094361470 not found: ID does not exist" Jan 20 19:26:32 crc kubenswrapper[4773]: I0120 19:26:32.817604 4773 scope.go:117] "RemoveContainer" containerID="c0cbd1c89aebb12b144202dda59867410c9eaa93e2a817823537ebbdade0ac23" Jan 20 19:26:32 crc kubenswrapper[4773]: E0120 19:26:32.818077 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0cbd1c89aebb12b144202dda59867410c9eaa93e2a817823537ebbdade0ac23\": container with ID starting with c0cbd1c89aebb12b144202dda59867410c9eaa93e2a817823537ebbdade0ac23 not found: ID does not exist" containerID="c0cbd1c89aebb12b144202dda59867410c9eaa93e2a817823537ebbdade0ac23" Jan 20 19:26:32 crc kubenswrapper[4773]: I0120 19:26:32.818127 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0cbd1c89aebb12b144202dda59867410c9eaa93e2a817823537ebbdade0ac23"} err="failed to get container status \"c0cbd1c89aebb12b144202dda59867410c9eaa93e2a817823537ebbdade0ac23\": rpc error: code = NotFound desc = could not find container \"c0cbd1c89aebb12b144202dda59867410c9eaa93e2a817823537ebbdade0ac23\": container with ID 
starting with c0cbd1c89aebb12b144202dda59867410c9eaa93e2a817823537ebbdade0ac23 not found: ID does not exist" Jan 20 19:26:32 crc kubenswrapper[4773]: I0120 19:26:32.818159 4773 scope.go:117] "RemoveContainer" containerID="4658ca605a30ceb57bcd235ddfeb6bcf121a777170e7fe82beb0264d99e3a763" Jan 20 19:26:32 crc kubenswrapper[4773]: E0120 19:26:32.819212 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4658ca605a30ceb57bcd235ddfeb6bcf121a777170e7fe82beb0264d99e3a763\": container with ID starting with 4658ca605a30ceb57bcd235ddfeb6bcf121a777170e7fe82beb0264d99e3a763 not found: ID does not exist" containerID="4658ca605a30ceb57bcd235ddfeb6bcf121a777170e7fe82beb0264d99e3a763" Jan 20 19:26:32 crc kubenswrapper[4773]: I0120 19:26:32.819268 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4658ca605a30ceb57bcd235ddfeb6bcf121a777170e7fe82beb0264d99e3a763"} err="failed to get container status \"4658ca605a30ceb57bcd235ddfeb6bcf121a777170e7fe82beb0264d99e3a763\": rpc error: code = NotFound desc = could not find container \"4658ca605a30ceb57bcd235ddfeb6bcf121a777170e7fe82beb0264d99e3a763\": container with ID starting with 4658ca605a30ceb57bcd235ddfeb6bcf121a777170e7fe82beb0264d99e3a763 not found: ID does not exist" Jan 20 19:26:33 crc kubenswrapper[4773]: I0120 19:26:33.459904 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2ea5cf0-e620-4fc2-a3e7-a1efd5af40c2" path="/var/lib/kubelet/pods/b2ea5cf0-e620-4fc2-a3e7-a1efd5af40c2/volumes" Jan 20 19:26:48 crc kubenswrapper[4773]: I0120 19:26:48.545383 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-mzdmf/must-gather-lp22t"] Jan 20 19:26:48 crc kubenswrapper[4773]: E0120 19:26:48.546278 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2ea5cf0-e620-4fc2-a3e7-a1efd5af40c2" containerName="extract-utilities" Jan 20 19:26:48 crc 
kubenswrapper[4773]: I0120 19:26:48.546291 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2ea5cf0-e620-4fc2-a3e7-a1efd5af40c2" containerName="extract-utilities" Jan 20 19:26:48 crc kubenswrapper[4773]: E0120 19:26:48.546301 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2ea5cf0-e620-4fc2-a3e7-a1efd5af40c2" containerName="registry-server" Jan 20 19:26:48 crc kubenswrapper[4773]: I0120 19:26:48.546307 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2ea5cf0-e620-4fc2-a3e7-a1efd5af40c2" containerName="registry-server" Jan 20 19:26:48 crc kubenswrapper[4773]: E0120 19:26:48.546338 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2ea5cf0-e620-4fc2-a3e7-a1efd5af40c2" containerName="extract-content" Jan 20 19:26:48 crc kubenswrapper[4773]: I0120 19:26:48.546345 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2ea5cf0-e620-4fc2-a3e7-a1efd5af40c2" containerName="extract-content" Jan 20 19:26:48 crc kubenswrapper[4773]: I0120 19:26:48.546504 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2ea5cf0-e620-4fc2-a3e7-a1efd5af40c2" containerName="registry-server" Jan 20 19:26:48 crc kubenswrapper[4773]: I0120 19:26:48.547588 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-mzdmf/must-gather-lp22t" Jan 20 19:26:48 crc kubenswrapper[4773]: I0120 19:26:48.550241 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-mzdmf"/"openshift-service-ca.crt" Jan 20 19:26:48 crc kubenswrapper[4773]: I0120 19:26:48.550465 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-mzdmf"/"kube-root-ca.crt" Jan 20 19:26:48 crc kubenswrapper[4773]: I0120 19:26:48.557585 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ae725e5b-de4d-443b-bd8c-985abdcb0f87-must-gather-output\") pod \"must-gather-lp22t\" (UID: \"ae725e5b-de4d-443b-bd8c-985abdcb0f87\") " pod="openshift-must-gather-mzdmf/must-gather-lp22t" Jan 20 19:26:48 crc kubenswrapper[4773]: I0120 19:26:48.557686 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khbtp\" (UniqueName: \"kubernetes.io/projected/ae725e5b-de4d-443b-bd8c-985abdcb0f87-kube-api-access-khbtp\") pod \"must-gather-lp22t\" (UID: \"ae725e5b-de4d-443b-bd8c-985abdcb0f87\") " pod="openshift-must-gather-mzdmf/must-gather-lp22t" Jan 20 19:26:48 crc kubenswrapper[4773]: I0120 19:26:48.567499 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-mzdmf/must-gather-lp22t"] Jan 20 19:26:48 crc kubenswrapper[4773]: I0120 19:26:48.658957 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ae725e5b-de4d-443b-bd8c-985abdcb0f87-must-gather-output\") pod \"must-gather-lp22t\" (UID: \"ae725e5b-de4d-443b-bd8c-985abdcb0f87\") " pod="openshift-must-gather-mzdmf/must-gather-lp22t" Jan 20 19:26:48 crc kubenswrapper[4773]: I0120 19:26:48.659162 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-khbtp\" (UniqueName: \"kubernetes.io/projected/ae725e5b-de4d-443b-bd8c-985abdcb0f87-kube-api-access-khbtp\") pod \"must-gather-lp22t\" (UID: \"ae725e5b-de4d-443b-bd8c-985abdcb0f87\") " pod="openshift-must-gather-mzdmf/must-gather-lp22t" Jan 20 19:26:48 crc kubenswrapper[4773]: I0120 19:26:48.659583 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ae725e5b-de4d-443b-bd8c-985abdcb0f87-must-gather-output\") pod \"must-gather-lp22t\" (UID: \"ae725e5b-de4d-443b-bd8c-985abdcb0f87\") " pod="openshift-must-gather-mzdmf/must-gather-lp22t" Jan 20 19:26:48 crc kubenswrapper[4773]: I0120 19:26:48.688766 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khbtp\" (UniqueName: \"kubernetes.io/projected/ae725e5b-de4d-443b-bd8c-985abdcb0f87-kube-api-access-khbtp\") pod \"must-gather-lp22t\" (UID: \"ae725e5b-de4d-443b-bd8c-985abdcb0f87\") " pod="openshift-must-gather-mzdmf/must-gather-lp22t" Jan 20 19:26:48 crc kubenswrapper[4773]: I0120 19:26:48.867334 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-mzdmf/must-gather-lp22t" Jan 20 19:26:49 crc kubenswrapper[4773]: I0120 19:26:49.312654 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-mzdmf/must-gather-lp22t"] Jan 20 19:26:49 crc kubenswrapper[4773]: W0120 19:26:49.321912 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podae725e5b_de4d_443b_bd8c_985abdcb0f87.slice/crio-f2e1c86fbace66eb94c7e2e19ec6d66c0150495f0612506111d8ac4a49eb63e9 WatchSource:0}: Error finding container f2e1c86fbace66eb94c7e2e19ec6d66c0150495f0612506111d8ac4a49eb63e9: Status 404 returned error can't find the container with id f2e1c86fbace66eb94c7e2e19ec6d66c0150495f0612506111d8ac4a49eb63e9 Jan 20 19:26:49 crc kubenswrapper[4773]: I0120 19:26:49.843875 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mzdmf/must-gather-lp22t" event={"ID":"ae725e5b-de4d-443b-bd8c-985abdcb0f87","Type":"ContainerStarted","Data":"f2e1c86fbace66eb94c7e2e19ec6d66c0150495f0612506111d8ac4a49eb63e9"} Jan 20 19:26:56 crc kubenswrapper[4773]: I0120 19:26:56.926393 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mzdmf/must-gather-lp22t" event={"ID":"ae725e5b-de4d-443b-bd8c-985abdcb0f87","Type":"ContainerStarted","Data":"2838c7cb0decc5fc8472fb9e866280ec0ba3b00d1dcfff4d6fbc29ac9ec5f124"} Jan 20 19:26:57 crc kubenswrapper[4773]: I0120 19:26:57.939834 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mzdmf/must-gather-lp22t" event={"ID":"ae725e5b-de4d-443b-bd8c-985abdcb0f87","Type":"ContainerStarted","Data":"1169edc90b360255fb7f1e948c61eca241b70cd1e7e780dbf8cfe336a4692b95"} Jan 20 19:26:57 crc kubenswrapper[4773]: I0120 19:26:57.957474 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-mzdmf/must-gather-lp22t" podStartSLOduration=2.642644924 
podStartE2EDuration="9.95745883s" podCreationTimestamp="2026-01-20 19:26:48 +0000 UTC" firstStartedPulling="2026-01-20 19:26:49.325011889 +0000 UTC m=+3402.246824913" lastFinishedPulling="2026-01-20 19:26:56.639825795 +0000 UTC m=+3409.561638819" observedRunningTime="2026-01-20 19:26:57.956381733 +0000 UTC m=+3410.878194757" watchObservedRunningTime="2026-01-20 19:26:57.95745883 +0000 UTC m=+3410.879271854" Jan 20 19:26:58 crc kubenswrapper[4773]: I0120 19:26:58.170248 4773 patch_prober.go:28] interesting pod/machine-config-daemon-sq4x7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 19:26:58 crc kubenswrapper[4773]: I0120 19:26:58.170296 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 19:27:00 crc kubenswrapper[4773]: I0120 19:27:00.890463 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-mzdmf/crc-debug-txz6z"] Jan 20 19:27:00 crc kubenswrapper[4773]: I0120 19:27:00.892180 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-mzdmf/crc-debug-txz6z" Jan 20 19:27:00 crc kubenswrapper[4773]: I0120 19:27:00.894447 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-mzdmf"/"default-dockercfg-6z6zw" Jan 20 19:27:00 crc kubenswrapper[4773]: I0120 19:27:00.943119 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/630b0da4-d7f7-4f6e-8489-66087b5b8974-host\") pod \"crc-debug-txz6z\" (UID: \"630b0da4-d7f7-4f6e-8489-66087b5b8974\") " pod="openshift-must-gather-mzdmf/crc-debug-txz6z" Jan 20 19:27:00 crc kubenswrapper[4773]: I0120 19:27:00.943187 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5cm7\" (UniqueName: \"kubernetes.io/projected/630b0da4-d7f7-4f6e-8489-66087b5b8974-kube-api-access-j5cm7\") pod \"crc-debug-txz6z\" (UID: \"630b0da4-d7f7-4f6e-8489-66087b5b8974\") " pod="openshift-must-gather-mzdmf/crc-debug-txz6z" Jan 20 19:27:01 crc kubenswrapper[4773]: I0120 19:27:01.045928 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/630b0da4-d7f7-4f6e-8489-66087b5b8974-host\") pod \"crc-debug-txz6z\" (UID: \"630b0da4-d7f7-4f6e-8489-66087b5b8974\") " pod="openshift-must-gather-mzdmf/crc-debug-txz6z" Jan 20 19:27:01 crc kubenswrapper[4773]: I0120 19:27:01.046016 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5cm7\" (UniqueName: \"kubernetes.io/projected/630b0da4-d7f7-4f6e-8489-66087b5b8974-kube-api-access-j5cm7\") pod \"crc-debug-txz6z\" (UID: \"630b0da4-d7f7-4f6e-8489-66087b5b8974\") " pod="openshift-must-gather-mzdmf/crc-debug-txz6z" Jan 20 19:27:01 crc kubenswrapper[4773]: I0120 19:27:01.046073 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/630b0da4-d7f7-4f6e-8489-66087b5b8974-host\") pod \"crc-debug-txz6z\" (UID: \"630b0da4-d7f7-4f6e-8489-66087b5b8974\") " pod="openshift-must-gather-mzdmf/crc-debug-txz6z" Jan 20 19:27:01 crc kubenswrapper[4773]: I0120 19:27:01.065583 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5cm7\" (UniqueName: \"kubernetes.io/projected/630b0da4-d7f7-4f6e-8489-66087b5b8974-kube-api-access-j5cm7\") pod \"crc-debug-txz6z\" (UID: \"630b0da4-d7f7-4f6e-8489-66087b5b8974\") " pod="openshift-must-gather-mzdmf/crc-debug-txz6z" Jan 20 19:27:01 crc kubenswrapper[4773]: I0120 19:27:01.213666 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mzdmf/crc-debug-txz6z" Jan 20 19:27:01 crc kubenswrapper[4773]: W0120 19:27:01.251753 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod630b0da4_d7f7_4f6e_8489_66087b5b8974.slice/crio-f386fe57ce0e4edf2c60cf0dce4de5d1e38e9de41ab63c8c3982abcc44b3b407 WatchSource:0}: Error finding container f386fe57ce0e4edf2c60cf0dce4de5d1e38e9de41ab63c8c3982abcc44b3b407: Status 404 returned error can't find the container with id f386fe57ce0e4edf2c60cf0dce4de5d1e38e9de41ab63c8c3982abcc44b3b407 Jan 20 19:27:01 crc kubenswrapper[4773]: I0120 19:27:01.975246 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mzdmf/crc-debug-txz6z" event={"ID":"630b0da4-d7f7-4f6e-8489-66087b5b8974","Type":"ContainerStarted","Data":"f386fe57ce0e4edf2c60cf0dce4de5d1e38e9de41ab63c8c3982abcc44b3b407"} Jan 20 19:27:03 crc kubenswrapper[4773]: I0120 19:27:03.316616 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-7574cb8f94-wwkgd_436dcd32-51a0-4a9e-8a0a-fb852a5de1f0/barbican-api-log/0.log" Jan 20 19:27:03 crc kubenswrapper[4773]: I0120 19:27:03.328050 4773 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_barbican-api-7574cb8f94-wwkgd_436dcd32-51a0-4a9e-8a0a-fb852a5de1f0/barbican-api/0.log" Jan 20 19:27:03 crc kubenswrapper[4773]: I0120 19:27:03.352247 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7854d7cd94-r9cm7_8839acb4-5db9-4b47-a075-8798d8a01c6b/barbican-keystone-listener-log/0.log" Jan 20 19:27:03 crc kubenswrapper[4773]: I0120 19:27:03.358305 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-7854d7cd94-r9cm7_8839acb4-5db9-4b47-a075-8798d8a01c6b/barbican-keystone-listener/0.log" Jan 20 19:27:03 crc kubenswrapper[4773]: I0120 19:27:03.378476 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-69f4d99ff7-gmlhl_52aa4dd5-c8c1-4de9-b7e1-9c17fb6ae782/barbican-worker-log/0.log" Jan 20 19:27:03 crc kubenswrapper[4773]: I0120 19:27:03.387783 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-69f4d99ff7-gmlhl_52aa4dd5-c8c1-4de9-b7e1-9c17fb6ae782/barbican-worker/0.log" Jan 20 19:27:03 crc kubenswrapper[4773]: I0120 19:27:03.457401 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-9xjh8_586f1b07-ae25-4acf-8a65-92377c4db234/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Jan 20 19:27:03 crc kubenswrapper[4773]: I0120 19:27:03.486455 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_6165543b-5cc4-4a1c-bbba-ed4621838073/ceilometer-central-agent/0.log" Jan 20 19:27:03 crc kubenswrapper[4773]: I0120 19:27:03.508500 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_6165543b-5cc4-4a1c-bbba-ed4621838073/ceilometer-notification-agent/0.log" Jan 20 19:27:03 crc kubenswrapper[4773]: I0120 19:27:03.516924 4773 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ceilometer-0_6165543b-5cc4-4a1c-bbba-ed4621838073/sg-core/0.log" Jan 20 19:27:03 crc kubenswrapper[4773]: I0120 19:27:03.523934 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_6165543b-5cc4-4a1c-bbba-ed4621838073/proxy-httpd/0.log" Jan 20 19:27:03 crc kubenswrapper[4773]: I0120 19:27:03.541364 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-client-edpm-deployment-openstack-edpm-ipam-v2ffx_2cefaa80-8ba4-4e73-81e3-927c47cc2a5d/ceph-client-edpm-deployment-openstack-edpm-ipam/0.log" Jan 20 19:27:03 crc kubenswrapper[4773]: I0120 19:27:03.565330 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceph-hci-pre-edpm-deployment-openstack-edpm-ipam-jdhj7_e1492d77-23f5-4ed0-9511-e5b4ee1107c7/ceph-hci-pre-edpm-deployment-openstack-edpm-ipam/0.log" Jan 20 19:27:03 crc kubenswrapper[4773]: I0120 19:27:03.590973 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_d4d69bee-fde2-4fb6-95f6-74e35b8d5db5/cinder-api-log/0.log" Jan 20 19:27:03 crc kubenswrapper[4773]: I0120 19:27:03.633894 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_d4d69bee-fde2-4fb6-95f6-74e35b8d5db5/cinder-api/0.log" Jan 20 19:27:03 crc kubenswrapper[4773]: I0120 19:27:03.804339 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_4a053127-e129-429c-9a7b-28e084c34269/cinder-backup/0.log" Jan 20 19:27:03 crc kubenswrapper[4773]: I0120 19:27:03.814584 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-backup-0_4a053127-e129-429c-9a7b-28e084c34269/probe/0.log" Jan 20 19:27:03 crc kubenswrapper[4773]: I0120 19:27:03.854972 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_e3e6b840-22c8-4add-b022-1ba197ca588c/cinder-scheduler/0.log" Jan 20 19:27:03 crc kubenswrapper[4773]: I0120 19:27:03.881881 4773 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_e3e6b840-22c8-4add-b022-1ba197ca588c/probe/0.log" Jan 20 19:27:03 crc kubenswrapper[4773]: I0120 19:27:03.947452 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_466685e0-d49e-4d97-9436-7db7c10062c3/cinder-volume/0.log" Jan 20 19:27:03 crc kubenswrapper[4773]: I0120 19:27:03.962018 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-volume-volume1-0_466685e0-d49e-4d97-9436-7db7c10062c3/probe/0.log" Jan 20 19:27:03 crc kubenswrapper[4773]: I0120 19:27:03.987727 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-5t9qm_a290d892-d26b-4f1c-b4a0-9778e6b58c7b/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Jan 20 19:27:04 crc kubenswrapper[4773]: I0120 19:27:04.055035 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-dr6r5_b3ce8585-331b-44ef-b8f8-aa5cb3b96589/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 20 19:27:04 crc kubenswrapper[4773]: I0120 19:27:04.074911 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-76b5fdb995-d8qrf_3b08b301-686b-45e6-9903-5df8a754a16a/dnsmasq-dns/0.log" Jan 20 19:27:04 crc kubenswrapper[4773]: I0120 19:27:04.080705 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-76b5fdb995-d8qrf_3b08b301-686b-45e6-9903-5df8a754a16a/init/0.log" Jan 20 19:27:04 crc kubenswrapper[4773]: I0120 19:27:04.097350 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_696e3ee3-25fa-4102-b483-1781d00bb18f/glance-log/0.log" Jan 20 19:27:04 crc kubenswrapper[4773]: I0120 19:27:04.112831 4773 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-external-api-0_696e3ee3-25fa-4102-b483-1781d00bb18f/glance-httpd/0.log" Jan 20 19:27:04 crc kubenswrapper[4773]: I0120 19:27:04.133684 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_117c4f3b-d438-4f73-966c-378c28f67460/glance-log/0.log" Jan 20 19:27:04 crc kubenswrapper[4773]: I0120 19:27:04.145898 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_117c4f3b-d438-4f73-966c-378c28f67460/glance-httpd/0.log" Jan 20 19:27:04 crc kubenswrapper[4773]: I0120 19:27:04.450929 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-68fb89f56b-287lx_cd9ba14c-8dca-4170-841c-6f5d5fa2b220/horizon-log/0.log" Jan 20 19:27:04 crc kubenswrapper[4773]: I0120 19:27:04.530955 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-68fb89f56b-287lx_cd9ba14c-8dca-4170-841c-6f5d5fa2b220/horizon/0.log" Jan 20 19:27:04 crc kubenswrapper[4773]: I0120 19:27:04.549876 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-lshfv_a459169f-671f-4dd7-96d3-019d59bd14c6/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Jan 20 19:27:04 crc kubenswrapper[4773]: I0120 19:27:04.573043 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-h7lxl_00dc0471-09f0-4cdf-a237-aba1d232cf04/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 20 19:27:04 crc kubenswrapper[4773]: I0120 19:27:04.695163 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-5bdd8cdbd7-xhf92_03658323-86f4-42ec-b18f-163a1e7dcaed/keystone-api/0.log" Jan 20 19:27:04 crc kubenswrapper[4773]: I0120 19:27:04.703679 4773 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_keystone-cron-29482261-dqww9_5b0951c0-055b-44bd-a686-9a4938af6b4f/keystone-cron/0.log" Jan 20 19:27:04 crc kubenswrapper[4773]: I0120 19:27:04.715851 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_7e2f1ada-ddef-454d-bdb7-fd695ee8f4ea/kube-state-metrics/0.log" Jan 20 19:27:04 crc kubenswrapper[4773]: I0120 19:27:04.756457 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-2vbh9_5e1b8272-3f37-405c-9f7c-acc1dd855d60/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Jan 20 19:27:04 crc kubenswrapper[4773]: I0120 19:27:04.770573 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_6e180830-62c9-4473-9d6b-197fbe92af49/manila-api-log/0.log" Jan 20 19:27:04 crc kubenswrapper[4773]: I0120 19:27:04.847808 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-api-0_6e180830-62c9-4473-9d6b-197fbe92af49/manila-api/0.log" Jan 20 19:27:04 crc kubenswrapper[4773]: I0120 19:27:04.862000 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-c4ba-account-create-update-47gmh_80285eae-2998-47ab-bcd6-e9905e2e71d4/mariadb-account-create-update/0.log" Jan 20 19:27:04 crc kubenswrapper[4773]: I0120 19:27:04.874840 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-db-create-5vk9g_33fb43c4-3e2a-4034-b8aa-27c2a7e4acb6/mariadb-database-create/0.log" Jan 20 19:27:04 crc kubenswrapper[4773]: I0120 19:27:04.900040 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-db-sync-r2zvz_32b245ce-84e1-4fbc-adef-ebfdd1e88d77/manila-db-sync/0.log" Jan 20 19:27:05 crc kubenswrapper[4773]: I0120 19:27:05.006147 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_e42acb5b-abbc-4f06-918d-2e886b50146e/manila-scheduler/0.log" Jan 20 19:27:05 crc kubenswrapper[4773]: I0120 
19:27:05.011389 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-scheduler-0_e42acb5b-abbc-4f06-918d-2e886b50146e/probe/0.log" Jan 20 19:27:05 crc kubenswrapper[4773]: I0120 19:27:05.052365 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_a7af9581-1520-466a-8b8f-1b957274273e/manila-share/0.log" Jan 20 19:27:05 crc kubenswrapper[4773]: I0120 19:27:05.072923 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_manila-share-share1-0_a7af9581-1520-466a-8b8f-1b957274273e/probe/0.log" Jan 20 19:27:16 crc kubenswrapper[4773]: I0120 19:27:16.211619 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mzdmf/crc-debug-txz6z" event={"ID":"630b0da4-d7f7-4f6e-8489-66087b5b8974","Type":"ContainerStarted","Data":"850d3bec4e1a476b5a9d345935605675d1c6c1592f339240f8819f6c39afef82"} Jan 20 19:27:16 crc kubenswrapper[4773]: I0120 19:27:16.233725 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-mzdmf/crc-debug-txz6z" podStartSLOduration=1.775508852 podStartE2EDuration="16.233701505s" podCreationTimestamp="2026-01-20 19:27:00 +0000 UTC" firstStartedPulling="2026-01-20 19:27:01.254016958 +0000 UTC m=+3414.175829982" lastFinishedPulling="2026-01-20 19:27:15.712209611 +0000 UTC m=+3428.634022635" observedRunningTime="2026-01-20 19:27:16.227167155 +0000 UTC m=+3429.148980179" watchObservedRunningTime="2026-01-20 19:27:16.233701505 +0000 UTC m=+3429.155514529" Jan 20 19:27:16 crc kubenswrapper[4773]: I0120 19:27:16.511673 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_cb8cda87-65c5-4be7-9891-b82bcfc8e0d4/memcached/0.log" Jan 20 19:27:16 crc kubenswrapper[4773]: I0120 19:27:16.555639 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-76cffc5d9-m6wn7_f98c94f3-5e79-4d1a-9e1f-bab68689f193/neutron-api/0.log" Jan 20 19:27:16 crc kubenswrapper[4773]: 
I0120 19:27:16.571101 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-76cffc5d9-m6wn7_f98c94f3-5e79-4d1a-9e1f-bab68689f193/neutron-httpd/0.log" Jan 20 19:27:17 crc kubenswrapper[4773]: I0120 19:27:17.072651 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-llwxc_f735fea9-67a7-4dcc-96f9-8e852df016ce/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Jan 20 19:27:17 crc kubenswrapper[4773]: I0120 19:27:17.160705 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_f890481e-0c9f-4194-8af3-d808bb105995/nova-api-log/0.log" Jan 20 19:27:17 crc kubenswrapper[4773]: I0120 19:27:17.375087 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_f890481e-0c9f-4194-8af3-d808bb105995/nova-api-api/0.log" Jan 20 19:27:17 crc kubenswrapper[4773]: I0120 19:27:17.474049 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_b123d99d-6cf6-4516-a5ae-7dcdf8262269/nova-cell0-conductor-conductor/0.log" Jan 20 19:27:17 crc kubenswrapper[4773]: I0120 19:27:17.586810 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_7970e552-0aac-436b-ba20-4810e82dcd20/nova-cell1-conductor-conductor/0.log" Jan 20 19:27:17 crc kubenswrapper[4773]: I0120 19:27:17.662979 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_7c5b56a3-1c91-4347-ae44-63f05c35e134/nova-cell1-novncproxy-novncproxy/0.log" Jan 20 19:27:17 crc kubenswrapper[4773]: I0120 19:27:17.714617 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-custom-ceph-edpm-deployment-openstack-edpm-ipam-8l96d_e7bfe1d6-9e6c-4964-9cdf-2204156f14c6/nova-custom-ceph-edpm-deployment-openstack-edpm-ipam/0.log" Jan 20 19:27:17 crc kubenswrapper[4773]: I0120 19:27:17.782188 4773 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-metadata-0_ceeec9e1-d0f5-497c-b262-2ef81be261ee/nova-metadata-log/0.log" Jan 20 19:27:18 crc kubenswrapper[4773]: I0120 19:27:18.511887 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_ceeec9e1-d0f5-497c-b262-2ef81be261ee/nova-metadata-metadata/0.log" Jan 20 19:27:18 crc kubenswrapper[4773]: I0120 19:27:18.602897 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_8413ef33-749f-4413-9965-fd19ad70ebfc/nova-scheduler-scheduler/0.log" Jan 20 19:27:18 crc kubenswrapper[4773]: I0120 19:27:18.629901 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_bfe9133c-0d58-4877-97ee-5b0abeee1a95/galera/0.log" Jan 20 19:27:18 crc kubenswrapper[4773]: I0120 19:27:18.642450 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_bfe9133c-0d58-4877-97ee-5b0abeee1a95/mysql-bootstrap/0.log" Jan 20 19:27:18 crc kubenswrapper[4773]: I0120 19:27:18.685663 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_11b243ca-6da3-4247-a1fe-2ea3e5be80cc/galera/0.log" Jan 20 19:27:18 crc kubenswrapper[4773]: I0120 19:27:18.705552 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_11b243ca-6da3-4247-a1fe-2ea3e5be80cc/mysql-bootstrap/0.log" Jan 20 19:27:18 crc kubenswrapper[4773]: I0120 19:27:18.716546 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_f040c75f-a2cb-4bfe-9fd1-0105887fa6b4/openstackclient/0.log" Jan 20 19:27:18 crc kubenswrapper[4773]: I0120 19:27:18.735363 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-xs9zd_a5ceb1c5-1dbc-4810-95c9-c1ac0b915542/openstack-network-exporter/0.log" Jan 20 19:27:18 crc kubenswrapper[4773]: I0120 19:27:18.756642 4773 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-ovs-5gcvm_bada64ed-c7da-4bd9-9195-75bbdcdd0406/ovsdb-server/0.log" Jan 20 19:27:18 crc kubenswrapper[4773]: I0120 19:27:18.768303 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-5gcvm_bada64ed-c7da-4bd9-9195-75bbdcdd0406/ovs-vswitchd/0.log" Jan 20 19:27:18 crc kubenswrapper[4773]: I0120 19:27:18.775228 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-5gcvm_bada64ed-c7da-4bd9-9195-75bbdcdd0406/ovsdb-server-init/0.log" Jan 20 19:27:18 crc kubenswrapper[4773]: I0120 19:27:18.786943 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-t5h8j_2fce4eb9-f614-4050-a099-0a743695dcd9/ovn-controller/0.log" Jan 20 19:27:18 crc kubenswrapper[4773]: I0120 19:27:18.820533 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-cjrtn_de805082-3188-4adb-9607-4ec5535de661/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Jan 20 19:27:18 crc kubenswrapper[4773]: I0120 19:27:18.830903 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_152ecb39-d580-4c8d-b572-e3a6bb070c7f/ovn-northd/0.log" Jan 20 19:27:18 crc kubenswrapper[4773]: I0120 19:27:18.838905 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_152ecb39-d580-4c8d-b572-e3a6bb070c7f/openstack-network-exporter/0.log" Jan 20 19:27:18 crc kubenswrapper[4773]: I0120 19:27:18.853383 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_5818e5c4-9a2c-453f-b158-f4be5ec40619/ovsdbserver-nb/0.log" Jan 20 19:27:18 crc kubenswrapper[4773]: I0120 19:27:18.862605 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_5818e5c4-9a2c-453f-b158-f4be5ec40619/openstack-network-exporter/0.log" Jan 20 19:27:18 crc kubenswrapper[4773]: I0120 19:27:18.888968 4773 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_4c900f03-61d3-470c-9803-3f6b617ddf0a/ovsdbserver-sb/0.log" Jan 20 19:27:18 crc kubenswrapper[4773]: I0120 19:27:18.896044 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_4c900f03-61d3-470c-9803-3f6b617ddf0a/openstack-network-exporter/0.log" Jan 20 19:27:18 crc kubenswrapper[4773]: I0120 19:27:18.945968 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-668885694d-2br7g_a7bad355-1a37-4372-9751-25a39f6a3410/placement-log/0.log" Jan 20 19:27:18 crc kubenswrapper[4773]: I0120 19:27:18.977078 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-668885694d-2br7g_a7bad355-1a37-4372-9751-25a39f6a3410/placement-api/0.log" Jan 20 19:27:18 crc kubenswrapper[4773]: I0120 19:27:18.997965 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_35926f65-848d-4db5-b50a-deef510ce4be/rabbitmq/0.log" Jan 20 19:27:19 crc kubenswrapper[4773]: I0120 19:27:19.003916 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_35926f65-848d-4db5-b50a-deef510ce4be/setup-container/0.log" Jan 20 19:27:19 crc kubenswrapper[4773]: I0120 19:27:19.033688 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_375735e1-5d2a-4cc8-892b-4bdcdf9f1e42/rabbitmq/0.log" Jan 20 19:27:19 crc kubenswrapper[4773]: I0120 19:27:19.038665 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_375735e1-5d2a-4cc8-892b-4bdcdf9f1e42/setup-container/0.log" Jan 20 19:27:19 crc kubenswrapper[4773]: I0120 19:27:19.057967 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-v4pst_f6ebb133-0720-46a2-9da3-ec9dc396266b/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 20 19:27:19 crc kubenswrapper[4773]: I0120 19:27:19.074412 4773 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-sgr8f_2ce9c199-1f55-4aea-82f7-5df21339c927/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Jan 20 19:27:19 crc kubenswrapper[4773]: I0120 19:27:19.090169 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-ppqxn_617e6a58-e676-42e3-a897-939d9072d030/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Jan 20 19:27:19 crc kubenswrapper[4773]: I0120 19:27:19.104511 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-ll277_08ee5bdf-bc91-4f34-8459-bc65419f93d7/ssh-known-hosts-edpm-deployment/0.log" Jan 20 19:27:19 crc kubenswrapper[4773]: I0120 19:27:19.118200 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-k7bxl_9114481d-74c0-4af1-9bed-3f592f2c102f/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Jan 20 19:27:24 crc kubenswrapper[4773]: I0120 19:27:24.870629 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-2nprh_e284563a-e5f1-4c86-8100-863f86a7f7dc/controller/0.log" Jan 20 19:27:24 crc kubenswrapper[4773]: I0120 19:27:24.878016 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-2nprh_e284563a-e5f1-4c86-8100-863f86a7f7dc/kube-rbac-proxy/0.log" Jan 20 19:27:24 crc kubenswrapper[4773]: I0120 19:27:24.900358 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l2dcx_859ada1b-1a7b-4032-a974-2ec3571aa069/controller/0.log" Jan 20 19:27:26 crc kubenswrapper[4773]: I0120 19:27:26.596568 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l2dcx_859ada1b-1a7b-4032-a974-2ec3571aa069/frr/0.log" Jan 20 19:27:26 crc kubenswrapper[4773]: I0120 19:27:26.604375 4773 log.go:25] "Finished parsing 
log file" path="/var/log/pods/metallb-system_frr-k8s-l2dcx_859ada1b-1a7b-4032-a974-2ec3571aa069/reloader/0.log" Jan 20 19:27:26 crc kubenswrapper[4773]: I0120 19:27:26.611020 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l2dcx_859ada1b-1a7b-4032-a974-2ec3571aa069/frr-metrics/0.log" Jan 20 19:27:26 crc kubenswrapper[4773]: I0120 19:27:26.620555 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l2dcx_859ada1b-1a7b-4032-a974-2ec3571aa069/kube-rbac-proxy/0.log" Jan 20 19:27:26 crc kubenswrapper[4773]: I0120 19:27:26.626703 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l2dcx_859ada1b-1a7b-4032-a974-2ec3571aa069/kube-rbac-proxy-frr/0.log" Jan 20 19:27:26 crc kubenswrapper[4773]: I0120 19:27:26.635993 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l2dcx_859ada1b-1a7b-4032-a974-2ec3571aa069/cp-frr-files/0.log" Jan 20 19:27:26 crc kubenswrapper[4773]: I0120 19:27:26.643385 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l2dcx_859ada1b-1a7b-4032-a974-2ec3571aa069/cp-reloader/0.log" Jan 20 19:27:26 crc kubenswrapper[4773]: I0120 19:27:26.651390 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l2dcx_859ada1b-1a7b-4032-a974-2ec3571aa069/cp-metrics/0.log" Jan 20 19:27:26 crc kubenswrapper[4773]: I0120 19:27:26.666434 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7df86c4f6c-vv9kz_05a83b70-ac51-4951-92c6-0f90265f2958/frr-k8s-webhook-server/0.log" Jan 20 19:27:26 crc kubenswrapper[4773]: I0120 19:27:26.692134 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-58b89bff97-7jrvm_a93bbf26-2683-4cf0-a45a-1639d6da4e01/manager/0.log" Jan 20 19:27:26 crc kubenswrapper[4773]: I0120 19:27:26.701193 4773 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_metallb-operator-webhook-server-6cbdbfd488-hxn9x_cbba9cb2-22ec-4f8c-8550-f3a69901785c/webhook-server/0.log" Jan 20 19:27:27 crc kubenswrapper[4773]: I0120 19:27:27.042876 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-7xdr9_ce74c2a6-61b7-4fb4-883f-e86bf4b5c604/speaker/0.log" Jan 20 19:27:27 crc kubenswrapper[4773]: I0120 19:27:27.048813 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-7xdr9_ce74c2a6-61b7-4fb4-883f-e86bf4b5c604/kube-rbac-proxy/0.log" Jan 20 19:27:28 crc kubenswrapper[4773]: I0120 19:27:28.170666 4773 patch_prober.go:28] interesting pod/machine-config-daemon-sq4x7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 19:27:28 crc kubenswrapper[4773]: I0120 19:27:28.171115 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 19:27:28 crc kubenswrapper[4773]: I0120 19:27:28.171187 4773 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" Jan 20 19:27:28 crc kubenswrapper[4773]: I0120 19:27:28.172277 4773 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f6b3b5c728dd60bac57f1335c6421f37527b148c607e8336779f2aa1fd55ae02"} pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 20 19:27:28 crc kubenswrapper[4773]: 
I0120 19:27:28.172387 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" containerID="cri-o://f6b3b5c728dd60bac57f1335c6421f37527b148c607e8336779f2aa1fd55ae02" gracePeriod=600 Jan 20 19:27:28 crc kubenswrapper[4773]: I0120 19:27:28.305735 4773 generic.go:334] "Generic (PLEG): container finished" podID="1ddd934f-f012-4083-b5e6-b99711071621" containerID="f6b3b5c728dd60bac57f1335c6421f37527b148c607e8336779f2aa1fd55ae02" exitCode=0 Jan 20 19:27:28 crc kubenswrapper[4773]: I0120 19:27:28.305780 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" event={"ID":"1ddd934f-f012-4083-b5e6-b99711071621","Type":"ContainerDied","Data":"f6b3b5c728dd60bac57f1335c6421f37527b148c607e8336779f2aa1fd55ae02"} Jan 20 19:27:28 crc kubenswrapper[4773]: I0120 19:27:28.305819 4773 scope.go:117] "RemoveContainer" containerID="11935bc20d3c37ba36a8cfe2c5234ee70ed45d973936da23fb9892ece635d019" Jan 20 19:27:29 crc kubenswrapper[4773]: I0120 19:27:29.316393 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" event={"ID":"1ddd934f-f012-4083-b5e6-b99711071621","Type":"ContainerStarted","Data":"66d18e7e7c4332943cab7b2f123da4a39ef6b1eb85671f155f90330f5704072d"} Jan 20 19:27:29 crc kubenswrapper[4773]: I0120 19:27:29.655845 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_1fa7137562a22aefab0f17b7f11474d7c5f30ed4d2dba8dd42e74a795drlmtp_7af94832-1f61-43d7-9c56-bee4b2893499/extract/0.log" Jan 20 19:27:29 crc kubenswrapper[4773]: I0120 19:27:29.673116 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_1fa7137562a22aefab0f17b7f11474d7c5f30ed4d2dba8dd42e74a795drlmtp_7af94832-1f61-43d7-9c56-bee4b2893499/util/0.log" Jan 20 19:27:29 
crc kubenswrapper[4773]: I0120 19:27:29.682597 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_1fa7137562a22aefab0f17b7f11474d7c5f30ed4d2dba8dd42e74a795drlmtp_7af94832-1f61-43d7-9c56-bee4b2893499/pull/0.log" Jan 20 19:27:29 crc kubenswrapper[4773]: I0120 19:27:29.856814 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7ddb5c749-hhxlp_df2d6d5b-b964-4672-903f-563b7792ee43/manager/0.log" Jan 20 19:27:29 crc kubenswrapper[4773]: I0120 19:27:29.909836 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-9b68f5989-xmljc_48aacb32-c120-4f36-898b-60f5d01c5510/manager/0.log" Jan 20 19:27:29 crc kubenswrapper[4773]: I0120 19:27:29.921070 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-9f958b845-v4q7f_ac02d392-7ff9-42e1-ad6f-47ab9f04a9a7/manager/0.log" Jan 20 19:27:30 crc kubenswrapper[4773]: I0120 19:27:30.171367 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-c6994669c-4kk2r_4604c39e-62d8-4420-b2bc-54d44f4ebcd0/manager/0.log" Jan 20 19:27:30 crc kubenswrapper[4773]: I0120 19:27:30.196025 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-594c8c9d5d-vjfdq_a570d5a5-53f4-444f-a14d-92ea24f27e2e/manager/0.log" Jan 20 19:27:30 crc kubenswrapper[4773]: I0120 19:27:30.321986 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-77d5c5b54f-blxqv_d1051db2-8914-422b-a126-5cd8ee078767/manager/0.log" Jan 20 19:27:30 crc kubenswrapper[4773]: I0120 19:27:30.809560 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-77c48c7859-hqsb9_437cadd4-5809-4b9e-afa2-05832cd6c303/manager/0.log" 
Jan 20 19:27:30 crc kubenswrapper[4773]: I0120 19:27:30.825255 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-78757b4889-2nhdr_951d4f5c-5d89-41c6-be8a-9828b05ce182/manager/0.log" Jan 20 19:27:30 crc kubenswrapper[4773]: I0120 19:27:30.898464 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-767fdc4f47-8tsjs_b773ecb8-3505-44ad-a28f-bd4054263888/manager/0.log" Jan 20 19:27:30 crc kubenswrapper[4773]: I0120 19:27:30.950042 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-864f6b75bf-mqjmm_ed6d3389-b374-42a6-8101-1d34df737170/manager/0.log" Jan 20 19:27:31 crc kubenswrapper[4773]: I0120 19:27:31.021654 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-c87fff755-s7scg_8f795216-0196-4a5a-bfdf-20dee1543b43/manager/0.log" Jan 20 19:27:31 crc kubenswrapper[4773]: I0120 19:27:31.099888 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-cb4666565-hfwzv_b196e443-f058-49c2-b54b-a18656415f5a/manager/0.log" Jan 20 19:27:31 crc kubenswrapper[4773]: I0120 19:27:31.192277 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-65849867d6-prhbl_fb5406b5-d194-441a-a098-7ecdc7831ec1/manager/0.log" Jan 20 19:27:31 crc kubenswrapper[4773]: I0120 19:27:31.210008 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7fc9b76cf6-sslnl_ff53e5c0-255a-43c5-a27c-ce9dc3145999/manager/0.log" Jan 20 19:27:31 crc kubenswrapper[4773]: I0120 19:27:31.232397 4773 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6b68b8b854m7knp_e9f6d4b3-c2cc-4cc6-b279-362e7439974b/manager/0.log" Jan 20 19:27:31 crc kubenswrapper[4773]: I0120 19:27:31.336206 4773 generic.go:334] "Generic (PLEG): container finished" podID="630b0da4-d7f7-4f6e-8489-66087b5b8974" containerID="850d3bec4e1a476b5a9d345935605675d1c6c1592f339240f8819f6c39afef82" exitCode=0 Jan 20 19:27:31 crc kubenswrapper[4773]: I0120 19:27:31.336260 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mzdmf/crc-debug-txz6z" event={"ID":"630b0da4-d7f7-4f6e-8489-66087b5b8974","Type":"ContainerDied","Data":"850d3bec4e1a476b5a9d345935605675d1c6c1592f339240f8819f6c39afef82"} Jan 20 19:27:31 crc kubenswrapper[4773]: I0120 19:27:31.397756 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-5df999bcf5-pztzb_e2d598ad-b9fa-4874-8669-688e18171e82/operator/0.log" Jan 20 19:27:32 crc kubenswrapper[4773]: I0120 19:27:32.451381 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-mzdmf/crc-debug-txz6z" Jan 20 19:27:32 crc kubenswrapper[4773]: I0120 19:27:32.500065 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-mzdmf/crc-debug-txz6z"] Jan 20 19:27:32 crc kubenswrapper[4773]: I0120 19:27:32.519234 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-mzdmf/crc-debug-txz6z"] Jan 20 19:27:32 crc kubenswrapper[4773]: I0120 19:27:32.577233 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/630b0da4-d7f7-4f6e-8489-66087b5b8974-host\") pod \"630b0da4-d7f7-4f6e-8489-66087b5b8974\" (UID: \"630b0da4-d7f7-4f6e-8489-66087b5b8974\") " Jan 20 19:27:32 crc kubenswrapper[4773]: I0120 19:27:32.577360 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j5cm7\" (UniqueName: \"kubernetes.io/projected/630b0da4-d7f7-4f6e-8489-66087b5b8974-kube-api-access-j5cm7\") pod \"630b0da4-d7f7-4f6e-8489-66087b5b8974\" (UID: \"630b0da4-d7f7-4f6e-8489-66087b5b8974\") " Jan 20 19:27:32 crc kubenswrapper[4773]: I0120 19:27:32.577574 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/630b0da4-d7f7-4f6e-8489-66087b5b8974-host" (OuterVolumeSpecName: "host") pod "630b0da4-d7f7-4f6e-8489-66087b5b8974" (UID: "630b0da4-d7f7-4f6e-8489-66087b5b8974"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 19:27:32 crc kubenswrapper[4773]: I0120 19:27:32.578250 4773 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/630b0da4-d7f7-4f6e-8489-66087b5b8974-host\") on node \"crc\" DevicePath \"\"" Jan 20 19:27:32 crc kubenswrapper[4773]: I0120 19:27:32.588216 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/630b0da4-d7f7-4f6e-8489-66087b5b8974-kube-api-access-j5cm7" (OuterVolumeSpecName: "kube-api-access-j5cm7") pod "630b0da4-d7f7-4f6e-8489-66087b5b8974" (UID: "630b0da4-d7f7-4f6e-8489-66087b5b8974"). InnerVolumeSpecName "kube-api-access-j5cm7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 19:27:32 crc kubenswrapper[4773]: I0120 19:27:32.680036 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j5cm7\" (UniqueName: \"kubernetes.io/projected/630b0da4-d7f7-4f6e-8489-66087b5b8974-kube-api-access-j5cm7\") on node \"crc\" DevicePath \"\"" Jan 20 19:27:32 crc kubenswrapper[4773]: I0120 19:27:32.818387 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-674cd49df-nnf4r_86d68359-5910-4d1d-8a01-2964f8d26464/manager/0.log" Jan 20 19:27:32 crc kubenswrapper[4773]: I0120 19:27:32.827228 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-zjdsq_ba4d7dc5-ceca-4e4b-81af-9368937b7462/registry-server/0.log" Jan 20 19:27:32 crc kubenswrapper[4773]: I0120 19:27:32.887419 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-55db956ddc-6ngwx_a1b3e0e3-f4c7-4b3d-9ba0-a198be108cb3/manager/0.log" Jan 20 19:27:32 crc kubenswrapper[4773]: I0120 19:27:32.922346 4773 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_placement-operator-controller-manager-686df47fcb-26j8t_2601732b-921a-4c55-821b-0fc994c50236/manager/0.log" Jan 20 19:27:32 crc kubenswrapper[4773]: I0120 19:27:32.942991 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-r5qgh_99558a40-3dbc-4c2b-9aab-a085c7ef5c7c/operator/0.log" Jan 20 19:27:32 crc kubenswrapper[4773]: I0120 19:27:32.958984 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-85dd56d4cc-t8tmg_9e235ee6-33ad-40e3-9b7a-914820315627/manager/0.log" Jan 20 19:27:33 crc kubenswrapper[4773]: I0120 19:27:33.049321 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5f8f495fcf-2thqw_7ed73202-faba-46ba-ae91-8cd9ffbe70a4/manager/0.log" Jan 20 19:27:33 crc kubenswrapper[4773]: I0120 19:27:33.058022 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-7f4549b895-p2vwt_cfba823f-e85e-42ae-aa8a-7926cc906b92/manager/0.log" Jan 20 19:27:33 crc kubenswrapper[4773]: I0120 19:27:33.068858 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-64cd966744-nhqxg_7f740208-043d-4d7f-b533-5526833d10c2/manager/0.log" Jan 20 19:27:33 crc kubenswrapper[4773]: I0120 19:27:33.353460 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f386fe57ce0e4edf2c60cf0dce4de5d1e38e9de41ab63c8c3982abcc44b3b407" Jan 20 19:27:33 crc kubenswrapper[4773]: I0120 19:27:33.353541 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-mzdmf/crc-debug-txz6z" Jan 20 19:27:33 crc kubenswrapper[4773]: I0120 19:27:33.460742 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="630b0da4-d7f7-4f6e-8489-66087b5b8974" path="/var/lib/kubelet/pods/630b0da4-d7f7-4f6e-8489-66087b5b8974/volumes" Jan 20 19:27:33 crc kubenswrapper[4773]: I0120 19:27:33.675577 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-mzdmf/crc-debug-gfrv9"] Jan 20 19:27:33 crc kubenswrapper[4773]: E0120 19:27:33.676240 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="630b0da4-d7f7-4f6e-8489-66087b5b8974" containerName="container-00" Jan 20 19:27:33 crc kubenswrapper[4773]: I0120 19:27:33.676343 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="630b0da4-d7f7-4f6e-8489-66087b5b8974" containerName="container-00" Jan 20 19:27:33 crc kubenswrapper[4773]: I0120 19:27:33.676593 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="630b0da4-d7f7-4f6e-8489-66087b5b8974" containerName="container-00" Jan 20 19:27:33 crc kubenswrapper[4773]: I0120 19:27:33.677431 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-mzdmf/crc-debug-gfrv9" Jan 20 19:27:33 crc kubenswrapper[4773]: I0120 19:27:33.679696 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-mzdmf"/"default-dockercfg-6z6zw" Jan 20 19:27:33 crc kubenswrapper[4773]: I0120 19:27:33.799254 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4khzb\" (UniqueName: \"kubernetes.io/projected/5ec34924-f3b4-407a-ae92-3e002b13c954-kube-api-access-4khzb\") pod \"crc-debug-gfrv9\" (UID: \"5ec34924-f3b4-407a-ae92-3e002b13c954\") " pod="openshift-must-gather-mzdmf/crc-debug-gfrv9" Jan 20 19:27:33 crc kubenswrapper[4773]: I0120 19:27:33.799624 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5ec34924-f3b4-407a-ae92-3e002b13c954-host\") pod \"crc-debug-gfrv9\" (UID: \"5ec34924-f3b4-407a-ae92-3e002b13c954\") " pod="openshift-must-gather-mzdmf/crc-debug-gfrv9" Jan 20 19:27:33 crc kubenswrapper[4773]: I0120 19:27:33.901974 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4khzb\" (UniqueName: \"kubernetes.io/projected/5ec34924-f3b4-407a-ae92-3e002b13c954-kube-api-access-4khzb\") pod \"crc-debug-gfrv9\" (UID: \"5ec34924-f3b4-407a-ae92-3e002b13c954\") " pod="openshift-must-gather-mzdmf/crc-debug-gfrv9" Jan 20 19:27:33 crc kubenswrapper[4773]: I0120 19:27:33.902101 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5ec34924-f3b4-407a-ae92-3e002b13c954-host\") pod \"crc-debug-gfrv9\" (UID: \"5ec34924-f3b4-407a-ae92-3e002b13c954\") " pod="openshift-must-gather-mzdmf/crc-debug-gfrv9" Jan 20 19:27:33 crc kubenswrapper[4773]: I0120 19:27:33.902291 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/5ec34924-f3b4-407a-ae92-3e002b13c954-host\") pod \"crc-debug-gfrv9\" (UID: \"5ec34924-f3b4-407a-ae92-3e002b13c954\") " pod="openshift-must-gather-mzdmf/crc-debug-gfrv9" Jan 20 19:27:33 crc kubenswrapper[4773]: I0120 19:27:33.926515 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4khzb\" (UniqueName: \"kubernetes.io/projected/5ec34924-f3b4-407a-ae92-3e002b13c954-kube-api-access-4khzb\") pod \"crc-debug-gfrv9\" (UID: \"5ec34924-f3b4-407a-ae92-3e002b13c954\") " pod="openshift-must-gather-mzdmf/crc-debug-gfrv9" Jan 20 19:27:33 crc kubenswrapper[4773]: I0120 19:27:33.992945 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mzdmf/crc-debug-gfrv9" Jan 20 19:27:34 crc kubenswrapper[4773]: W0120 19:27:34.039352 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5ec34924_f3b4_407a_ae92_3e002b13c954.slice/crio-182ef222cc4a819e22f9ddc9715ff48b89f3cccaa0ef73c735edecbb71b65663 WatchSource:0}: Error finding container 182ef222cc4a819e22f9ddc9715ff48b89f3cccaa0ef73c735edecbb71b65663: Status 404 returned error can't find the container with id 182ef222cc4a819e22f9ddc9715ff48b89f3cccaa0ef73c735edecbb71b65663 Jan 20 19:27:34 crc kubenswrapper[4773]: I0120 19:27:34.371646 4773 generic.go:334] "Generic (PLEG): container finished" podID="5ec34924-f3b4-407a-ae92-3e002b13c954" containerID="9bfb4ee2f9a7a03638cbc3d613dcbada38edd065d9b016e140a975b051ddfb05" exitCode=1 Jan 20 19:27:34 crc kubenswrapper[4773]: I0120 19:27:34.371737 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mzdmf/crc-debug-gfrv9" event={"ID":"5ec34924-f3b4-407a-ae92-3e002b13c954","Type":"ContainerDied","Data":"9bfb4ee2f9a7a03638cbc3d613dcbada38edd065d9b016e140a975b051ddfb05"} Jan 20 19:27:34 crc kubenswrapper[4773]: I0120 19:27:34.372099 4773 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-must-gather-mzdmf/crc-debug-gfrv9" event={"ID":"5ec34924-f3b4-407a-ae92-3e002b13c954","Type":"ContainerStarted","Data":"182ef222cc4a819e22f9ddc9715ff48b89f3cccaa0ef73c735edecbb71b65663"} Jan 20 19:27:34 crc kubenswrapper[4773]: I0120 19:27:34.412737 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-mzdmf/crc-debug-gfrv9"] Jan 20 19:27:34 crc kubenswrapper[4773]: I0120 19:27:34.427617 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-mzdmf/crc-debug-gfrv9"] Jan 20 19:27:35 crc kubenswrapper[4773]: I0120 19:27:35.485983 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mzdmf/crc-debug-gfrv9" Jan 20 19:27:35 crc kubenswrapper[4773]: I0120 19:27:35.631162 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5ec34924-f3b4-407a-ae92-3e002b13c954-host\") pod \"5ec34924-f3b4-407a-ae92-3e002b13c954\" (UID: \"5ec34924-f3b4-407a-ae92-3e002b13c954\") " Jan 20 19:27:35 crc kubenswrapper[4773]: I0120 19:27:35.631210 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5ec34924-f3b4-407a-ae92-3e002b13c954-host" (OuterVolumeSpecName: "host") pod "5ec34924-f3b4-407a-ae92-3e002b13c954" (UID: "5ec34924-f3b4-407a-ae92-3e002b13c954"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 20 19:27:35 crc kubenswrapper[4773]: I0120 19:27:35.631305 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4khzb\" (UniqueName: \"kubernetes.io/projected/5ec34924-f3b4-407a-ae92-3e002b13c954-kube-api-access-4khzb\") pod \"5ec34924-f3b4-407a-ae92-3e002b13c954\" (UID: \"5ec34924-f3b4-407a-ae92-3e002b13c954\") " Jan 20 19:27:35 crc kubenswrapper[4773]: I0120 19:27:35.631851 4773 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5ec34924-f3b4-407a-ae92-3e002b13c954-host\") on node \"crc\" DevicePath \"\"" Jan 20 19:27:35 crc kubenswrapper[4773]: I0120 19:27:35.643271 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5ec34924-f3b4-407a-ae92-3e002b13c954-kube-api-access-4khzb" (OuterVolumeSpecName: "kube-api-access-4khzb") pod "5ec34924-f3b4-407a-ae92-3e002b13c954" (UID: "5ec34924-f3b4-407a-ae92-3e002b13c954"). InnerVolumeSpecName "kube-api-access-4khzb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 19:27:35 crc kubenswrapper[4773]: I0120 19:27:35.754767 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4khzb\" (UniqueName: \"kubernetes.io/projected/5ec34924-f3b4-407a-ae92-3e002b13c954-kube-api-access-4khzb\") on node \"crc\" DevicePath \"\"" Jan 20 19:27:36 crc kubenswrapper[4773]: I0120 19:27:36.397052 4773 scope.go:117] "RemoveContainer" containerID="9bfb4ee2f9a7a03638cbc3d613dcbada38edd065d9b016e140a975b051ddfb05" Jan 20 19:27:36 crc kubenswrapper[4773]: I0120 19:27:36.397501 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-mzdmf/crc-debug-gfrv9" Jan 20 19:27:36 crc kubenswrapper[4773]: I0120 19:27:36.708447 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-857hw_1e5ac136-d46c-45e3-9a5f-548ac22fac5c/control-plane-machine-set-operator/0.log" Jan 20 19:27:36 crc kubenswrapper[4773]: I0120 19:27:36.726379 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-bhrll_a99225b3-64c7-4b39-807c-c97faa919977/kube-rbac-proxy/0.log" Jan 20 19:27:36 crc kubenswrapper[4773]: I0120 19:27:36.736508 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-bhrll_a99225b3-64c7-4b39-807c-c97faa919977/machine-api-operator/0.log" Jan 20 19:27:37 crc kubenswrapper[4773]: I0120 19:27:37.458308 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ec34924-f3b4-407a-ae92-3e002b13c954" path="/var/lib/kubelet/pods/5ec34924-f3b4-407a-ae92-3e002b13c954/volumes" Jan 20 19:28:18 crc kubenswrapper[4773]: I0120 19:28:18.856197 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-tgrsg_5a2416cd-d7d8-4aa5-b7ef-1b61446a4072/cert-manager-controller/0.log" Jan 20 19:28:18 crc kubenswrapper[4773]: I0120 19:28:18.872698 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-mtkdb_c249258b-878c-45b8-9886-6fee2afec18c/cert-manager-cainjector/0.log" Jan 20 19:28:18 crc kubenswrapper[4773]: I0120 19:28:18.887836 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-cf2ql_4380dd47-7110-43ea-af85-02675b558a8d/cert-manager-webhook/0.log" Jan 20 19:28:23 crc kubenswrapper[4773]: I0120 19:28:23.758286 4773 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7754f76f8b-z2s7z_431c5397-9244-4083-9659-59210fd6d5c0/nmstate-console-plugin/0.log" Jan 20 19:28:23 crc kubenswrapper[4773]: I0120 19:28:23.771283 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-q9h42_ef435627-8918-4451-8d3a-23e494e29f56/nmstate-handler/0.log" Jan 20 19:28:23 crc kubenswrapper[4773]: I0120 19:28:23.795647 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-54757c584b-kxbnw_42822a21-0834-4fc7-aab5-4dcdf46f2786/nmstate-metrics/0.log" Jan 20 19:28:23 crc kubenswrapper[4773]: I0120 19:28:23.804766 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-54757c584b-kxbnw_42822a21-0834-4fc7-aab5-4dcdf46f2786/kube-rbac-proxy/0.log" Jan 20 19:28:23 crc kubenswrapper[4773]: I0120 19:28:23.817320 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-646758c888-bwjbw_a0e928f6-ac84-4903-ab0e-08557dea077f/nmstate-operator/0.log" Jan 20 19:28:23 crc kubenswrapper[4773]: I0120 19:28:23.827430 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-8474b5b9d8-8q4jc_9380b21a-b971-4bb9-9572-d795f171b941/nmstate-webhook/0.log" Jan 20 19:28:34 crc kubenswrapper[4773]: I0120 19:28:34.164641 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-2nprh_e284563a-e5f1-4c86-8100-863f86a7f7dc/controller/0.log" Jan 20 19:28:34 crc kubenswrapper[4773]: I0120 19:28:34.172151 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-2nprh_e284563a-e5f1-4c86-8100-863f86a7f7dc/kube-rbac-proxy/0.log" Jan 20 19:28:34 crc kubenswrapper[4773]: I0120 19:28:34.190796 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l2dcx_859ada1b-1a7b-4032-a974-2ec3571aa069/controller/0.log" Jan 20 
19:28:35 crc kubenswrapper[4773]: I0120 19:28:35.558342 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l2dcx_859ada1b-1a7b-4032-a974-2ec3571aa069/frr/0.log" Jan 20 19:28:35 crc kubenswrapper[4773]: I0120 19:28:35.570281 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l2dcx_859ada1b-1a7b-4032-a974-2ec3571aa069/reloader/0.log" Jan 20 19:28:35 crc kubenswrapper[4773]: I0120 19:28:35.574323 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l2dcx_859ada1b-1a7b-4032-a974-2ec3571aa069/frr-metrics/0.log" Jan 20 19:28:35 crc kubenswrapper[4773]: I0120 19:28:35.581166 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l2dcx_859ada1b-1a7b-4032-a974-2ec3571aa069/kube-rbac-proxy/0.log" Jan 20 19:28:35 crc kubenswrapper[4773]: I0120 19:28:35.587505 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l2dcx_859ada1b-1a7b-4032-a974-2ec3571aa069/kube-rbac-proxy-frr/0.log" Jan 20 19:28:35 crc kubenswrapper[4773]: I0120 19:28:35.592860 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l2dcx_859ada1b-1a7b-4032-a974-2ec3571aa069/cp-frr-files/0.log" Jan 20 19:28:35 crc kubenswrapper[4773]: I0120 19:28:35.599655 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l2dcx_859ada1b-1a7b-4032-a974-2ec3571aa069/cp-reloader/0.log" Jan 20 19:28:35 crc kubenswrapper[4773]: I0120 19:28:35.607162 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l2dcx_859ada1b-1a7b-4032-a974-2ec3571aa069/cp-metrics/0.log" Jan 20 19:28:35 crc kubenswrapper[4773]: I0120 19:28:35.618232 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7df86c4f6c-vv9kz_05a83b70-ac51-4951-92c6-0f90265f2958/frr-k8s-webhook-server/0.log" Jan 20 19:28:35 crc kubenswrapper[4773]: I0120 
19:28:35.642227 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-58b89bff97-7jrvm_a93bbf26-2683-4cf0-a45a-1639d6da4e01/manager/0.log" Jan 20 19:28:35 crc kubenswrapper[4773]: I0120 19:28:35.653002 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-6cbdbfd488-hxn9x_cbba9cb2-22ec-4f8c-8550-f3a69901785c/webhook-server/0.log" Jan 20 19:28:35 crc kubenswrapper[4773]: I0120 19:28:35.953827 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-7xdr9_ce74c2a6-61b7-4fb4-883f-e86bf4b5c604/speaker/0.log" Jan 20 19:28:35 crc kubenswrapper[4773]: I0120 19:28:35.961283 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-7xdr9_ce74c2a6-61b7-4fb4-883f-e86bf4b5c604/kube-rbac-proxy/0.log" Jan 20 19:28:39 crc kubenswrapper[4773]: I0120 19:28:39.326884 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcjvg7g_8e0b8536-2fc2-4203-a22f-a2dc29d0b737/extract/0.log" Jan 20 19:28:39 crc kubenswrapper[4773]: I0120 19:28:39.333497 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcjvg7g_8e0b8536-2fc2-4203-a22f-a2dc29d0b737/util/0.log" Jan 20 19:28:39 crc kubenswrapper[4773]: I0120 19:28:39.342509 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dcjvg7g_8e0b8536-2fc2-4203-a22f-a2dc29d0b737/pull/0.log" Jan 20 19:28:39 crc kubenswrapper[4773]: I0120 19:28:39.354815 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713cdtcd_7c79fd0a-1d41-44db-8ee4-d5781d77e848/extract/0.log" Jan 20 19:28:39 crc kubenswrapper[4773]: I0120 19:28:39.363168 4773 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713cdtcd_7c79fd0a-1d41-44db-8ee4-d5781d77e848/util/0.log" Jan 20 19:28:39 crc kubenswrapper[4773]: I0120 19:28:39.375152 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_53efe8611d43ac2275911d954e05efbbba7920a530aff9253ed1cec713cdtcd_7c79fd0a-1d41-44db-8ee4-d5781d77e848/pull/0.log" Jan 20 19:28:40 crc kubenswrapper[4773]: I0120 19:28:40.155481 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-lnll4_5da64480-a8e7-4ab9-b438-dfe067f94091/registry-server/0.log" Jan 20 19:28:40 crc kubenswrapper[4773]: I0120 19:28:40.162279 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-lnll4_5da64480-a8e7-4ab9-b438-dfe067f94091/extract-utilities/0.log" Jan 20 19:28:40 crc kubenswrapper[4773]: I0120 19:28:40.170112 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-lnll4_5da64480-a8e7-4ab9-b438-dfe067f94091/extract-content/0.log" Jan 20 19:28:40 crc kubenswrapper[4773]: I0120 19:28:40.629106 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-cfwbf_a8ccd26b-7e5c-4655-a9cb-764a2d7d35d3/registry-server/0.log" Jan 20 19:28:40 crc kubenswrapper[4773]: I0120 19:28:40.634415 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-cfwbf_a8ccd26b-7e5c-4655-a9cb-764a2d7d35d3/extract-utilities/0.log" Jan 20 19:28:40 crc kubenswrapper[4773]: I0120 19:28:40.641201 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-cfwbf_a8ccd26b-7e5c-4655-a9cb-764a2d7d35d3/extract-content/0.log" Jan 20 19:28:40 crc kubenswrapper[4773]: I0120 19:28:40.661744 4773 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-kcc74_785e6f78-9a81-429e-8cad-f60275661e58/marketplace-operator/0.log" Jan 20 19:28:40 crc kubenswrapper[4773]: I0120 19:28:40.835679 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-wdwbg_379f8421-1b6c-45c5-ae56-051b42ff6410/registry-server/0.log" Jan 20 19:28:40 crc kubenswrapper[4773]: I0120 19:28:40.840559 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-wdwbg_379f8421-1b6c-45c5-ae56-051b42ff6410/extract-utilities/0.log" Jan 20 19:28:40 crc kubenswrapper[4773]: I0120 19:28:40.847548 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-wdwbg_379f8421-1b6c-45c5-ae56-051b42ff6410/extract-content/0.log" Jan 20 19:28:41 crc kubenswrapper[4773]: I0120 19:28:41.400431 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-r24nn_7962399c-d4d0-44f1-a788-bd4cb5a758d7/registry-server/0.log" Jan 20 19:28:41 crc kubenswrapper[4773]: I0120 19:28:41.405511 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-r24nn_7962399c-d4d0-44f1-a788-bd4cb5a758d7/extract-utilities/0.log" Jan 20 19:28:41 crc kubenswrapper[4773]: I0120 19:28:41.413572 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-r24nn_7962399c-d4d0-44f1-a788-bd4cb5a758d7/extract-content/0.log" Jan 20 19:29:24 crc kubenswrapper[4773]: I0120 19:29:24.913873 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-2c6jv"] Jan 20 19:29:24 crc kubenswrapper[4773]: E0120 19:29:24.914732 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5ec34924-f3b4-407a-ae92-3e002b13c954" containerName="container-00" Jan 20 19:29:24 crc kubenswrapper[4773]: I0120 19:29:24.914745 4773 
state_mem.go:107] "Deleted CPUSet assignment" podUID="5ec34924-f3b4-407a-ae92-3e002b13c954" containerName="container-00" Jan 20 19:29:24 crc kubenswrapper[4773]: I0120 19:29:24.914969 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="5ec34924-f3b4-407a-ae92-3e002b13c954" containerName="container-00" Jan 20 19:29:24 crc kubenswrapper[4773]: I0120 19:29:24.916220 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2c6jv" Jan 20 19:29:24 crc kubenswrapper[4773]: I0120 19:29:24.943577 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2c6jv"] Jan 20 19:29:24 crc kubenswrapper[4773]: I0120 19:29:24.973576 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/520f286f-ac9f-40aa-939b-2a4cd53ebbd0-catalog-content\") pod \"community-operators-2c6jv\" (UID: \"520f286f-ac9f-40aa-939b-2a4cd53ebbd0\") " pod="openshift-marketplace/community-operators-2c6jv" Jan 20 19:29:24 crc kubenswrapper[4773]: I0120 19:29:24.973703 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tn5dk\" (UniqueName: \"kubernetes.io/projected/520f286f-ac9f-40aa-939b-2a4cd53ebbd0-kube-api-access-tn5dk\") pod \"community-operators-2c6jv\" (UID: \"520f286f-ac9f-40aa-939b-2a4cd53ebbd0\") " pod="openshift-marketplace/community-operators-2c6jv" Jan 20 19:29:24 crc kubenswrapper[4773]: I0120 19:29:24.973992 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/520f286f-ac9f-40aa-939b-2a4cd53ebbd0-utilities\") pod \"community-operators-2c6jv\" (UID: \"520f286f-ac9f-40aa-939b-2a4cd53ebbd0\") " pod="openshift-marketplace/community-operators-2c6jv" Jan 20 19:29:25 crc kubenswrapper[4773]: I0120 19:29:25.076287 
4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tn5dk\" (UniqueName: \"kubernetes.io/projected/520f286f-ac9f-40aa-939b-2a4cd53ebbd0-kube-api-access-tn5dk\") pod \"community-operators-2c6jv\" (UID: \"520f286f-ac9f-40aa-939b-2a4cd53ebbd0\") " pod="openshift-marketplace/community-operators-2c6jv" Jan 20 19:29:25 crc kubenswrapper[4773]: I0120 19:29:25.076462 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/520f286f-ac9f-40aa-939b-2a4cd53ebbd0-utilities\") pod \"community-operators-2c6jv\" (UID: \"520f286f-ac9f-40aa-939b-2a4cd53ebbd0\") " pod="openshift-marketplace/community-operators-2c6jv" Jan 20 19:29:25 crc kubenswrapper[4773]: I0120 19:29:25.076529 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/520f286f-ac9f-40aa-939b-2a4cd53ebbd0-catalog-content\") pod \"community-operators-2c6jv\" (UID: \"520f286f-ac9f-40aa-939b-2a4cd53ebbd0\") " pod="openshift-marketplace/community-operators-2c6jv" Jan 20 19:29:25 crc kubenswrapper[4773]: I0120 19:29:25.077016 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/520f286f-ac9f-40aa-939b-2a4cd53ebbd0-utilities\") pod \"community-operators-2c6jv\" (UID: \"520f286f-ac9f-40aa-939b-2a4cd53ebbd0\") " pod="openshift-marketplace/community-operators-2c6jv" Jan 20 19:29:25 crc kubenswrapper[4773]: I0120 19:29:25.077055 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/520f286f-ac9f-40aa-939b-2a4cd53ebbd0-catalog-content\") pod \"community-operators-2c6jv\" (UID: \"520f286f-ac9f-40aa-939b-2a4cd53ebbd0\") " pod="openshift-marketplace/community-operators-2c6jv" Jan 20 19:29:25 crc kubenswrapper[4773]: I0120 19:29:25.105658 4773 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-tn5dk\" (UniqueName: \"kubernetes.io/projected/520f286f-ac9f-40aa-939b-2a4cd53ebbd0-kube-api-access-tn5dk\") pod \"community-operators-2c6jv\" (UID: \"520f286f-ac9f-40aa-939b-2a4cd53ebbd0\") " pod="openshift-marketplace/community-operators-2c6jv" Jan 20 19:29:25 crc kubenswrapper[4773]: I0120 19:29:25.238626 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2c6jv" Jan 20 19:29:25 crc kubenswrapper[4773]: I0120 19:29:25.857141 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2c6jv"] Jan 20 19:29:26 crc kubenswrapper[4773]: I0120 19:29:26.280423 4773 generic.go:334] "Generic (PLEG): container finished" podID="520f286f-ac9f-40aa-939b-2a4cd53ebbd0" containerID="c127bcabf57e68bcc6face14a11e9b4543740fac86463e8ee50b8c84bc35123e" exitCode=0 Jan 20 19:29:26 crc kubenswrapper[4773]: I0120 19:29:26.280523 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2c6jv" event={"ID":"520f286f-ac9f-40aa-939b-2a4cd53ebbd0","Type":"ContainerDied","Data":"c127bcabf57e68bcc6face14a11e9b4543740fac86463e8ee50b8c84bc35123e"} Jan 20 19:29:26 crc kubenswrapper[4773]: I0120 19:29:26.282571 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2c6jv" event={"ID":"520f286f-ac9f-40aa-939b-2a4cd53ebbd0","Type":"ContainerStarted","Data":"6678b7da17cb66288f2199f90a24c29f56c670b8a88a035f6107e08d3e5e1e1e"} Jan 20 19:29:28 crc kubenswrapper[4773]: I0120 19:29:28.170456 4773 patch_prober.go:28] interesting pod/machine-config-daemon-sq4x7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 19:29:28 crc kubenswrapper[4773]: I0120 19:29:28.171033 4773 
prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 19:29:28 crc kubenswrapper[4773]: I0120 19:29:28.297902 4773 generic.go:334] "Generic (PLEG): container finished" podID="520f286f-ac9f-40aa-939b-2a4cd53ebbd0" containerID="2e340087c6d9044232703d8faa28bdfae58aa264afe92f6a31f2c5bfc2eec056" exitCode=0 Jan 20 19:29:28 crc kubenswrapper[4773]: I0120 19:29:28.297972 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2c6jv" event={"ID":"520f286f-ac9f-40aa-939b-2a4cd53ebbd0","Type":"ContainerDied","Data":"2e340087c6d9044232703d8faa28bdfae58aa264afe92f6a31f2c5bfc2eec056"} Jan 20 19:29:29 crc kubenswrapper[4773]: I0120 19:29:29.308339 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2c6jv" event={"ID":"520f286f-ac9f-40aa-939b-2a4cd53ebbd0","Type":"ContainerStarted","Data":"5e7d46e8f3a0da2eee3dd83d47ede32d81eb719ca678fa1fb0b7a23651b35720"} Jan 20 19:29:29 crc kubenswrapper[4773]: I0120 19:29:29.327380 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-2c6jv" podStartSLOduration=2.581570052 podStartE2EDuration="5.327355824s" podCreationTimestamp="2026-01-20 19:29:24 +0000 UTC" firstStartedPulling="2026-01-20 19:29:26.282368346 +0000 UTC m=+3559.204181370" lastFinishedPulling="2026-01-20 19:29:29.028154118 +0000 UTC m=+3561.949967142" observedRunningTime="2026-01-20 19:29:29.325202482 +0000 UTC m=+3562.247015526" watchObservedRunningTime="2026-01-20 19:29:29.327355824 +0000 UTC m=+3562.249168848" Jan 20 19:29:35 crc kubenswrapper[4773]: I0120 19:29:35.239605 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-marketplace/community-operators-2c6jv" Jan 20 19:29:35 crc kubenswrapper[4773]: I0120 19:29:35.240206 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-2c6jv" Jan 20 19:29:35 crc kubenswrapper[4773]: I0120 19:29:35.294955 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-2c6jv" Jan 20 19:29:35 crc kubenswrapper[4773]: I0120 19:29:35.409102 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-2c6jv" Jan 20 19:29:35 crc kubenswrapper[4773]: I0120 19:29:35.542077 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2c6jv"] Jan 20 19:29:37 crc kubenswrapper[4773]: I0120 19:29:37.379015 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-2c6jv" podUID="520f286f-ac9f-40aa-939b-2a4cd53ebbd0" containerName="registry-server" containerID="cri-o://5e7d46e8f3a0da2eee3dd83d47ede32d81eb719ca678fa1fb0b7a23651b35720" gracePeriod=2 Jan 20 19:29:37 crc kubenswrapper[4773]: I0120 19:29:37.893770 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2c6jv" Jan 20 19:29:38 crc kubenswrapper[4773]: I0120 19:29:38.040151 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/520f286f-ac9f-40aa-939b-2a4cd53ebbd0-utilities\") pod \"520f286f-ac9f-40aa-939b-2a4cd53ebbd0\" (UID: \"520f286f-ac9f-40aa-939b-2a4cd53ebbd0\") " Jan 20 19:29:38 crc kubenswrapper[4773]: I0120 19:29:38.040227 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tn5dk\" (UniqueName: \"kubernetes.io/projected/520f286f-ac9f-40aa-939b-2a4cd53ebbd0-kube-api-access-tn5dk\") pod \"520f286f-ac9f-40aa-939b-2a4cd53ebbd0\" (UID: \"520f286f-ac9f-40aa-939b-2a4cd53ebbd0\") " Jan 20 19:29:38 crc kubenswrapper[4773]: I0120 19:29:38.040491 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/520f286f-ac9f-40aa-939b-2a4cd53ebbd0-catalog-content\") pod \"520f286f-ac9f-40aa-939b-2a4cd53ebbd0\" (UID: \"520f286f-ac9f-40aa-939b-2a4cd53ebbd0\") " Jan 20 19:29:38 crc kubenswrapper[4773]: I0120 19:29:38.044565 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/520f286f-ac9f-40aa-939b-2a4cd53ebbd0-utilities" (OuterVolumeSpecName: "utilities") pod "520f286f-ac9f-40aa-939b-2a4cd53ebbd0" (UID: "520f286f-ac9f-40aa-939b-2a4cd53ebbd0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 19:29:38 crc kubenswrapper[4773]: I0120 19:29:38.050217 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/520f286f-ac9f-40aa-939b-2a4cd53ebbd0-kube-api-access-tn5dk" (OuterVolumeSpecName: "kube-api-access-tn5dk") pod "520f286f-ac9f-40aa-939b-2a4cd53ebbd0" (UID: "520f286f-ac9f-40aa-939b-2a4cd53ebbd0"). InnerVolumeSpecName "kube-api-access-tn5dk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 19:29:38 crc kubenswrapper[4773]: I0120 19:29:38.111361 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/520f286f-ac9f-40aa-939b-2a4cd53ebbd0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "520f286f-ac9f-40aa-939b-2a4cd53ebbd0" (UID: "520f286f-ac9f-40aa-939b-2a4cd53ebbd0"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 19:29:38 crc kubenswrapper[4773]: I0120 19:29:38.142635 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/520f286f-ac9f-40aa-939b-2a4cd53ebbd0-utilities\") on node \"crc\" DevicePath \"\"" Jan 20 19:29:38 crc kubenswrapper[4773]: I0120 19:29:38.142673 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tn5dk\" (UniqueName: \"kubernetes.io/projected/520f286f-ac9f-40aa-939b-2a4cd53ebbd0-kube-api-access-tn5dk\") on node \"crc\" DevicePath \"\"" Jan 20 19:29:38 crc kubenswrapper[4773]: I0120 19:29:38.142687 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/520f286f-ac9f-40aa-939b-2a4cd53ebbd0-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 20 19:29:38 crc kubenswrapper[4773]: I0120 19:29:38.388984 4773 generic.go:334] "Generic (PLEG): container finished" podID="520f286f-ac9f-40aa-939b-2a4cd53ebbd0" containerID="5e7d46e8f3a0da2eee3dd83d47ede32d81eb719ca678fa1fb0b7a23651b35720" exitCode=0 Jan 20 19:29:38 crc kubenswrapper[4773]: I0120 19:29:38.389030 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2c6jv" event={"ID":"520f286f-ac9f-40aa-939b-2a4cd53ebbd0","Type":"ContainerDied","Data":"5e7d46e8f3a0da2eee3dd83d47ede32d81eb719ca678fa1fb0b7a23651b35720"} Jan 20 19:29:38 crc kubenswrapper[4773]: I0120 19:29:38.389056 4773 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/community-operators-2c6jv" event={"ID":"520f286f-ac9f-40aa-939b-2a4cd53ebbd0","Type":"ContainerDied","Data":"6678b7da17cb66288f2199f90a24c29f56c670b8a88a035f6107e08d3e5e1e1e"} Jan 20 19:29:38 crc kubenswrapper[4773]: I0120 19:29:38.389063 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2c6jv" Jan 20 19:29:38 crc kubenswrapper[4773]: I0120 19:29:38.389076 4773 scope.go:117] "RemoveContainer" containerID="5e7d46e8f3a0da2eee3dd83d47ede32d81eb719ca678fa1fb0b7a23651b35720" Jan 20 19:29:38 crc kubenswrapper[4773]: I0120 19:29:38.407797 4773 scope.go:117] "RemoveContainer" containerID="2e340087c6d9044232703d8faa28bdfae58aa264afe92f6a31f2c5bfc2eec056" Jan 20 19:29:38 crc kubenswrapper[4773]: I0120 19:29:38.446821 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2c6jv"] Jan 20 19:29:38 crc kubenswrapper[4773]: I0120 19:29:38.448887 4773 scope.go:117] "RemoveContainer" containerID="c127bcabf57e68bcc6face14a11e9b4543740fac86463e8ee50b8c84bc35123e" Jan 20 19:29:38 crc kubenswrapper[4773]: I0120 19:29:38.463503 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-2c6jv"] Jan 20 19:29:38 crc kubenswrapper[4773]: I0120 19:29:38.502351 4773 scope.go:117] "RemoveContainer" containerID="5e7d46e8f3a0da2eee3dd83d47ede32d81eb719ca678fa1fb0b7a23651b35720" Jan 20 19:29:38 crc kubenswrapper[4773]: E0120 19:29:38.502816 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e7d46e8f3a0da2eee3dd83d47ede32d81eb719ca678fa1fb0b7a23651b35720\": container with ID starting with 5e7d46e8f3a0da2eee3dd83d47ede32d81eb719ca678fa1fb0b7a23651b35720 not found: ID does not exist" containerID="5e7d46e8f3a0da2eee3dd83d47ede32d81eb719ca678fa1fb0b7a23651b35720" Jan 20 19:29:38 crc kubenswrapper[4773]: I0120 
19:29:38.502858 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e7d46e8f3a0da2eee3dd83d47ede32d81eb719ca678fa1fb0b7a23651b35720"} err="failed to get container status \"5e7d46e8f3a0da2eee3dd83d47ede32d81eb719ca678fa1fb0b7a23651b35720\": rpc error: code = NotFound desc = could not find container \"5e7d46e8f3a0da2eee3dd83d47ede32d81eb719ca678fa1fb0b7a23651b35720\": container with ID starting with 5e7d46e8f3a0da2eee3dd83d47ede32d81eb719ca678fa1fb0b7a23651b35720 not found: ID does not exist" Jan 20 19:29:38 crc kubenswrapper[4773]: I0120 19:29:38.502885 4773 scope.go:117] "RemoveContainer" containerID="2e340087c6d9044232703d8faa28bdfae58aa264afe92f6a31f2c5bfc2eec056" Jan 20 19:29:38 crc kubenswrapper[4773]: E0120 19:29:38.503418 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e340087c6d9044232703d8faa28bdfae58aa264afe92f6a31f2c5bfc2eec056\": container with ID starting with 2e340087c6d9044232703d8faa28bdfae58aa264afe92f6a31f2c5bfc2eec056 not found: ID does not exist" containerID="2e340087c6d9044232703d8faa28bdfae58aa264afe92f6a31f2c5bfc2eec056" Jan 20 19:29:38 crc kubenswrapper[4773]: I0120 19:29:38.503463 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e340087c6d9044232703d8faa28bdfae58aa264afe92f6a31f2c5bfc2eec056"} err="failed to get container status \"2e340087c6d9044232703d8faa28bdfae58aa264afe92f6a31f2c5bfc2eec056\": rpc error: code = NotFound desc = could not find container \"2e340087c6d9044232703d8faa28bdfae58aa264afe92f6a31f2c5bfc2eec056\": container with ID starting with 2e340087c6d9044232703d8faa28bdfae58aa264afe92f6a31f2c5bfc2eec056 not found: ID does not exist" Jan 20 19:29:38 crc kubenswrapper[4773]: I0120 19:29:38.503504 4773 scope.go:117] "RemoveContainer" containerID="c127bcabf57e68bcc6face14a11e9b4543740fac86463e8ee50b8c84bc35123e" Jan 20 19:29:38 crc 
kubenswrapper[4773]: E0120 19:29:38.503902 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c127bcabf57e68bcc6face14a11e9b4543740fac86463e8ee50b8c84bc35123e\": container with ID starting with c127bcabf57e68bcc6face14a11e9b4543740fac86463e8ee50b8c84bc35123e not found: ID does not exist" containerID="c127bcabf57e68bcc6face14a11e9b4543740fac86463e8ee50b8c84bc35123e" Jan 20 19:29:38 crc kubenswrapper[4773]: I0120 19:29:38.503969 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c127bcabf57e68bcc6face14a11e9b4543740fac86463e8ee50b8c84bc35123e"} err="failed to get container status \"c127bcabf57e68bcc6face14a11e9b4543740fac86463e8ee50b8c84bc35123e\": rpc error: code = NotFound desc = could not find container \"c127bcabf57e68bcc6face14a11e9b4543740fac86463e8ee50b8c84bc35123e\": container with ID starting with c127bcabf57e68bcc6face14a11e9b4543740fac86463e8ee50b8c84bc35123e not found: ID does not exist" Jan 20 19:29:39 crc kubenswrapper[4773]: I0120 19:29:39.458245 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="520f286f-ac9f-40aa-939b-2a4cd53ebbd0" path="/var/lib/kubelet/pods/520f286f-ac9f-40aa-939b-2a4cd53ebbd0/volumes" Jan 20 19:29:50 crc kubenswrapper[4773]: I0120 19:29:50.898967 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-2nprh_e284563a-e5f1-4c86-8100-863f86a7f7dc/controller/0.log" Jan 20 19:29:50 crc kubenswrapper[4773]: I0120 19:29:50.931841 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-2nprh_e284563a-e5f1-4c86-8100-863f86a7f7dc/kube-rbac-proxy/0.log" Jan 20 19:29:51 crc kubenswrapper[4773]: I0120 19:29:51.022526 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l2dcx_859ada1b-1a7b-4032-a974-2ec3571aa069/controller/0.log" Jan 20 19:29:51 crc kubenswrapper[4773]: 
I0120 19:29:51.233388 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-tgrsg_5a2416cd-d7d8-4aa5-b7ef-1b61446a4072/cert-manager-controller/0.log" Jan 20 19:29:51 crc kubenswrapper[4773]: I0120 19:29:51.252729 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-mtkdb_c249258b-878c-45b8-9886-6fee2afec18c/cert-manager-cainjector/0.log" Jan 20 19:29:51 crc kubenswrapper[4773]: I0120 19:29:51.271994 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-cf2ql_4380dd47-7110-43ea-af85-02675b558a8d/cert-manager-webhook/0.log" Jan 20 19:29:52 crc kubenswrapper[4773]: I0120 19:29:52.332366 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l2dcx_859ada1b-1a7b-4032-a974-2ec3571aa069/frr/0.log" Jan 20 19:29:52 crc kubenswrapper[4773]: I0120 19:29:52.341582 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l2dcx_859ada1b-1a7b-4032-a974-2ec3571aa069/reloader/0.log" Jan 20 19:29:52 crc kubenswrapper[4773]: I0120 19:29:52.346675 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l2dcx_859ada1b-1a7b-4032-a974-2ec3571aa069/frr-metrics/0.log" Jan 20 19:29:52 crc kubenswrapper[4773]: I0120 19:29:52.354826 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l2dcx_859ada1b-1a7b-4032-a974-2ec3571aa069/kube-rbac-proxy/0.log" Jan 20 19:29:52 crc kubenswrapper[4773]: I0120 19:29:52.361285 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l2dcx_859ada1b-1a7b-4032-a974-2ec3571aa069/kube-rbac-proxy-frr/0.log" Jan 20 19:29:52 crc kubenswrapper[4773]: I0120 19:29:52.373889 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l2dcx_859ada1b-1a7b-4032-a974-2ec3571aa069/cp-frr-files/0.log" Jan 20 19:29:52 crc 
kubenswrapper[4773]: I0120 19:29:52.382406 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l2dcx_859ada1b-1a7b-4032-a974-2ec3571aa069/cp-reloader/0.log" Jan 20 19:29:52 crc kubenswrapper[4773]: I0120 19:29:52.390718 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-l2dcx_859ada1b-1a7b-4032-a974-2ec3571aa069/cp-metrics/0.log" Jan 20 19:29:52 crc kubenswrapper[4773]: I0120 19:29:52.392144 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_1fa7137562a22aefab0f17b7f11474d7c5f30ed4d2dba8dd42e74a795drlmtp_7af94832-1f61-43d7-9c56-bee4b2893499/extract/0.log" Jan 20 19:29:52 crc kubenswrapper[4773]: I0120 19:29:52.398672 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_1fa7137562a22aefab0f17b7f11474d7c5f30ed4d2dba8dd42e74a795drlmtp_7af94832-1f61-43d7-9c56-bee4b2893499/util/0.log" Jan 20 19:29:52 crc kubenswrapper[4773]: I0120 19:29:52.404765 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7df86c4f6c-vv9kz_05a83b70-ac51-4951-92c6-0f90265f2958/frr-k8s-webhook-server/0.log" Jan 20 19:29:52 crc kubenswrapper[4773]: I0120 19:29:52.405268 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_1fa7137562a22aefab0f17b7f11474d7c5f30ed4d2dba8dd42e74a795drlmtp_7af94832-1f61-43d7-9c56-bee4b2893499/pull/0.log" Jan 20 19:29:52 crc kubenswrapper[4773]: I0120 19:29:52.460968 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-58b89bff97-7jrvm_a93bbf26-2683-4cf0-a45a-1639d6da4e01/manager/0.log" Jan 20 19:29:52 crc kubenswrapper[4773]: I0120 19:29:52.472278 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-6cbdbfd488-hxn9x_cbba9cb2-22ec-4f8c-8550-f3a69901785c/webhook-server/0.log" Jan 20 19:29:52 crc kubenswrapper[4773]: I0120 19:29:52.513426 4773 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7ddb5c749-hhxlp_df2d6d5b-b964-4672-903f-563b7792ee43/manager/0.log" Jan 20 19:29:52 crc kubenswrapper[4773]: I0120 19:29:52.609685 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-9b68f5989-xmljc_48aacb32-c120-4f36-898b-60f5d01c5510/manager/0.log" Jan 20 19:29:52 crc kubenswrapper[4773]: I0120 19:29:52.626206 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-9f958b845-v4q7f_ac02d392-7ff9-42e1-ad6f-47ab9f04a9a7/manager/0.log" Jan 20 19:29:52 crc kubenswrapper[4773]: I0120 19:29:52.722833 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-c6994669c-4kk2r_4604c39e-62d8-4420-b2bc-54d44f4ebcd0/manager/0.log" Jan 20 19:29:52 crc kubenswrapper[4773]: I0120 19:29:52.753123 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-594c8c9d5d-vjfdq_a570d5a5-53f4-444f-a14d-92ea24f27e2e/manager/0.log" Jan 20 19:29:52 crc kubenswrapper[4773]: I0120 19:29:52.790883 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-77d5c5b54f-blxqv_d1051db2-8914-422b-a126-5cd8ee078767/manager/0.log" Jan 20 19:29:52 crc kubenswrapper[4773]: I0120 19:29:52.938901 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-7xdr9_ce74c2a6-61b7-4fb4-883f-e86bf4b5c604/speaker/0.log" Jan 20 19:29:52 crc kubenswrapper[4773]: I0120 19:29:52.948419 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-7xdr9_ce74c2a6-61b7-4fb4-883f-e86bf4b5c604/kube-rbac-proxy/0.log" Jan 20 19:29:53 crc kubenswrapper[4773]: I0120 19:29:53.155794 4773 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_infra-operator-controller-manager-77c48c7859-hqsb9_437cadd4-5809-4b9e-afa2-05832cd6c303/manager/0.log" Jan 20 19:29:53 crc kubenswrapper[4773]: I0120 19:29:53.165033 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-78757b4889-2nhdr_951d4f5c-5d89-41c6-be8a-9828b05ce182/manager/0.log" Jan 20 19:29:53 crc kubenswrapper[4773]: I0120 19:29:53.247049 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-767fdc4f47-8tsjs_b773ecb8-3505-44ad-a28f-bd4054263888/manager/0.log" Jan 20 19:29:53 crc kubenswrapper[4773]: I0120 19:29:53.303748 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-864f6b75bf-mqjmm_ed6d3389-b374-42a6-8101-1d34df737170/manager/0.log" Jan 20 19:29:53 crc kubenswrapper[4773]: I0120 19:29:53.343813 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-c87fff755-s7scg_8f795216-0196-4a5a-bfdf-20dee1543b43/manager/0.log" Jan 20 19:29:53 crc kubenswrapper[4773]: I0120 19:29:53.402815 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-cb4666565-hfwzv_b196e443-f058-49c2-b54b-a18656415f5a/manager/0.log" Jan 20 19:29:53 crc kubenswrapper[4773]: I0120 19:29:53.532972 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-65849867d6-prhbl_fb5406b5-d194-441a-a098-7ecdc7831ec1/manager/0.log" Jan 20 19:29:53 crc kubenswrapper[4773]: I0120 19:29:53.546614 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7fc9b76cf6-sslnl_ff53e5c0-255a-43c5-a27c-ce9dc3145999/manager/0.log" Jan 20 19:29:53 crc kubenswrapper[4773]: I0120 19:29:53.562871 4773 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6b68b8b854m7knp_e9f6d4b3-c2cc-4cc6-b279-362e7439974b/manager/0.log" Jan 20 19:29:53 crc kubenswrapper[4773]: I0120 19:29:53.756568 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-5df999bcf5-pztzb_e2d598ad-b9fa-4874-8669-688e18171e82/operator/0.log" Jan 20 19:29:54 crc kubenswrapper[4773]: I0120 19:29:54.107224 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-tgrsg_5a2416cd-d7d8-4aa5-b7ef-1b61446a4072/cert-manager-controller/0.log" Jan 20 19:29:54 crc kubenswrapper[4773]: I0120 19:29:54.123981 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-mtkdb_c249258b-878c-45b8-9886-6fee2afec18c/cert-manager-cainjector/0.log" Jan 20 19:29:54 crc kubenswrapper[4773]: I0120 19:29:54.137374 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-cf2ql_4380dd47-7110-43ea-af85-02675b558a8d/cert-manager-webhook/0.log" Jan 20 19:29:54 crc kubenswrapper[4773]: I0120 19:29:54.852282 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-857hw_1e5ac136-d46c-45e3-9a5f-548ac22fac5c/control-plane-machine-set-operator/0.log" Jan 20 19:29:54 crc kubenswrapper[4773]: I0120 19:29:54.868060 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-bhrll_a99225b3-64c7-4b39-807c-c97faa919977/kube-rbac-proxy/0.log" Jan 20 19:29:54 crc kubenswrapper[4773]: I0120 19:29:54.876585 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-bhrll_a99225b3-64c7-4b39-807c-c97faa919977/machine-api-operator/0.log" Jan 20 19:29:55 crc kubenswrapper[4773]: I0120 19:29:55.067508 4773 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-674cd49df-nnf4r_86d68359-5910-4d1d-8a01-2964f8d26464/manager/0.log" Jan 20 19:29:55 crc kubenswrapper[4773]: I0120 19:29:55.078808 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-zjdsq_ba4d7dc5-ceca-4e4b-81af-9368937b7462/registry-server/0.log" Jan 20 19:29:55 crc kubenswrapper[4773]: I0120 19:29:55.132653 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-55db956ddc-6ngwx_a1b3e0e3-f4c7-4b3d-9ba0-a198be108cb3/manager/0.log" Jan 20 19:29:55 crc kubenswrapper[4773]: I0120 19:29:55.155542 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-686df47fcb-26j8t_2601732b-921a-4c55-821b-0fc994c50236/manager/0.log" Jan 20 19:29:55 crc kubenswrapper[4773]: I0120 19:29:55.175234 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-r5qgh_99558a40-3dbc-4c2b-9aab-a085c7ef5c7c/operator/0.log" Jan 20 19:29:55 crc kubenswrapper[4773]: I0120 19:29:55.183908 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-85dd56d4cc-t8tmg_9e235ee6-33ad-40e3-9b7a-914820315627/manager/0.log" Jan 20 19:29:55 crc kubenswrapper[4773]: I0120 19:29:55.266897 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5f8f495fcf-2thqw_7ed73202-faba-46ba-ae91-8cd9ffbe70a4/manager/0.log" Jan 20 19:29:55 crc kubenswrapper[4773]: I0120 19:29:55.275962 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-7f4549b895-p2vwt_cfba823f-e85e-42ae-aa8a-7926cc906b92/manager/0.log" Jan 20 19:29:55 crc kubenswrapper[4773]: I0120 19:29:55.286137 4773 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-64cd966744-nhqxg_7f740208-043d-4d7f-b533-5526833d10c2/manager/0.log" Jan 20 19:29:55 crc kubenswrapper[4773]: I0120 19:29:55.616341 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_1fa7137562a22aefab0f17b7f11474d7c5f30ed4d2dba8dd42e74a795drlmtp_7af94832-1f61-43d7-9c56-bee4b2893499/extract/0.log" Jan 20 19:29:55 crc kubenswrapper[4773]: I0120 19:29:55.621436 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_1fa7137562a22aefab0f17b7f11474d7c5f30ed4d2dba8dd42e74a795drlmtp_7af94832-1f61-43d7-9c56-bee4b2893499/util/0.log" Jan 20 19:29:55 crc kubenswrapper[4773]: I0120 19:29:55.630073 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_1fa7137562a22aefab0f17b7f11474d7c5f30ed4d2dba8dd42e74a795drlmtp_7af94832-1f61-43d7-9c56-bee4b2893499/pull/0.log" Jan 20 19:29:55 crc kubenswrapper[4773]: I0120 19:29:55.711506 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-7ddb5c749-hhxlp_df2d6d5b-b964-4672-903f-563b7792ee43/manager/0.log" Jan 20 19:29:55 crc kubenswrapper[4773]: I0120 19:29:55.765631 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-9b68f5989-xmljc_48aacb32-c120-4f36-898b-60f5d01c5510/manager/0.log" Jan 20 19:29:55 crc kubenswrapper[4773]: I0120 19:29:55.778443 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-9f958b845-v4q7f_ac02d392-7ff9-42e1-ad6f-47ab9f04a9a7/manager/0.log" Jan 20 19:29:55 crc kubenswrapper[4773]: I0120 19:29:55.827703 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-c6994669c-4kk2r_4604c39e-62d8-4420-b2bc-54d44f4ebcd0/manager/0.log" Jan 20 19:29:55 crc 
kubenswrapper[4773]: I0120 19:29:55.837443 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-594c8c9d5d-vjfdq_a570d5a5-53f4-444f-a14d-92ea24f27e2e/manager/0.log" Jan 20 19:29:55 crc kubenswrapper[4773]: I0120 19:29:55.864476 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-77d5c5b54f-blxqv_d1051db2-8914-422b-a126-5cd8ee078767/manager/0.log" Jan 20 19:29:56 crc kubenswrapper[4773]: I0120 19:29:56.237695 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-77c48c7859-hqsb9_437cadd4-5809-4b9e-afa2-05832cd6c303/manager/0.log" Jan 20 19:29:56 crc kubenswrapper[4773]: I0120 19:29:56.251647 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-78757b4889-2nhdr_951d4f5c-5d89-41c6-be8a-9828b05ce182/manager/0.log" Jan 20 19:29:56 crc kubenswrapper[4773]: I0120 19:29:56.332728 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-767fdc4f47-8tsjs_b773ecb8-3505-44ad-a28f-bd4054263888/manager/0.log" Jan 20 19:29:56 crc kubenswrapper[4773]: I0120 19:29:56.374989 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-864f6b75bf-mqjmm_ed6d3389-b374-42a6-8101-1d34df737170/manager/0.log" Jan 20 19:29:56 crc kubenswrapper[4773]: I0120 19:29:56.406685 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-c87fff755-s7scg_8f795216-0196-4a5a-bfdf-20dee1543b43/manager/0.log" Jan 20 19:29:56 crc kubenswrapper[4773]: I0120 19:29:56.451708 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-cb4666565-hfwzv_b196e443-f058-49c2-b54b-a18656415f5a/manager/0.log" Jan 20 
19:29:56 crc kubenswrapper[4773]: I0120 19:29:56.526265 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-65849867d6-prhbl_fb5406b5-d194-441a-a098-7ecdc7831ec1/manager/0.log" Jan 20 19:29:56 crc kubenswrapper[4773]: I0120 19:29:56.538033 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-7fc9b76cf6-sslnl_ff53e5c0-255a-43c5-a27c-ce9dc3145999/manager/0.log" Jan 20 19:29:56 crc kubenswrapper[4773]: I0120 19:29:56.554751 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-6b68b8b854m7knp_e9f6d4b3-c2cc-4cc6-b279-362e7439974b/manager/0.log" Jan 20 19:29:56 crc kubenswrapper[4773]: I0120 19:29:56.678440 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-7754f76f8b-z2s7z_431c5397-9244-4083-9659-59210fd6d5c0/nmstate-console-plugin/0.log" Jan 20 19:29:56 crc kubenswrapper[4773]: I0120 19:29:56.696421 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-q9h42_ef435627-8918-4451-8d3a-23e494e29f56/nmstate-handler/0.log" Jan 20 19:29:56 crc kubenswrapper[4773]: I0120 19:29:56.714708 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-54757c584b-kxbnw_42822a21-0834-4fc7-aab5-4dcdf46f2786/nmstate-metrics/0.log" Jan 20 19:29:56 crc kubenswrapper[4773]: I0120 19:29:56.723275 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-5df999bcf5-pztzb_e2d598ad-b9fa-4874-8669-688e18171e82/operator/0.log" Jan 20 19:29:56 crc kubenswrapper[4773]: I0120 19:29:56.723663 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-54757c584b-kxbnw_42822a21-0834-4fc7-aab5-4dcdf46f2786/kube-rbac-proxy/0.log" Jan 20 19:29:56 crc 
kubenswrapper[4773]: I0120 19:29:56.737258 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-646758c888-bwjbw_a0e928f6-ac84-4903-ab0e-08557dea077f/nmstate-operator/0.log" Jan 20 19:29:56 crc kubenswrapper[4773]: I0120 19:29:56.747363 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-8474b5b9d8-8q4jc_9380b21a-b971-4bb9-9572-d795f171b941/nmstate-webhook/0.log" Jan 20 19:29:58 crc kubenswrapper[4773]: I0120 19:29:58.148323 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-674cd49df-nnf4r_86d68359-5910-4d1d-8a01-2964f8d26464/manager/0.log" Jan 20 19:29:58 crc kubenswrapper[4773]: I0120 19:29:58.162829 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-zjdsq_ba4d7dc5-ceca-4e4b-81af-9368937b7462/registry-server/0.log" Jan 20 19:29:58 crc kubenswrapper[4773]: I0120 19:29:58.174682 4773 patch_prober.go:28] interesting pod/machine-config-daemon-sq4x7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 19:29:58 crc kubenswrapper[4773]: I0120 19:29:58.174979 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 19:29:58 crc kubenswrapper[4773]: I0120 19:29:58.251687 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-55db956ddc-6ngwx_a1b3e0e3-f4c7-4b3d-9ba0-a198be108cb3/manager/0.log" Jan 20 19:29:58 crc kubenswrapper[4773]: I0120 
19:29:58.288509 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-686df47fcb-26j8t_2601732b-921a-4c55-821b-0fc994c50236/manager/0.log" Jan 20 19:29:58 crc kubenswrapper[4773]: I0120 19:29:58.317571 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-r5qgh_99558a40-3dbc-4c2b-9aab-a085c7ef5c7c/operator/0.log" Jan 20 19:29:58 crc kubenswrapper[4773]: I0120 19:29:58.333312 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-85dd56d4cc-t8tmg_9e235ee6-33ad-40e3-9b7a-914820315627/manager/0.log" Jan 20 19:29:58 crc kubenswrapper[4773]: I0120 19:29:58.434550 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-5f8f495fcf-2thqw_7ed73202-faba-46ba-ae91-8cd9ffbe70a4/manager/0.log" Jan 20 19:29:58 crc kubenswrapper[4773]: I0120 19:29:58.450167 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-7f4549b895-p2vwt_cfba823f-e85e-42ae-aa8a-7926cc906b92/manager/0.log" Jan 20 19:29:58 crc kubenswrapper[4773]: I0120 19:29:58.462310 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-64cd966744-nhqxg_7f740208-043d-4d7f-b533-5526833d10c2/manager/0.log" Jan 20 19:30:00 crc kubenswrapper[4773]: I0120 19:30:00.027823 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kjbfj_7ddd5104-3112-413e-b908-2b7f336b41f1/kube-multus-additional-cni-plugins/0.log" Jan 20 19:30:00 crc kubenswrapper[4773]: I0120 19:30:00.035910 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kjbfj_7ddd5104-3112-413e-b908-2b7f336b41f1/egress-router-binary-copy/0.log" Jan 20 19:30:00 crc 
kubenswrapper[4773]: I0120 19:30:00.043259 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kjbfj_7ddd5104-3112-413e-b908-2b7f336b41f1/cni-plugins/0.log" Jan 20 19:30:00 crc kubenswrapper[4773]: I0120 19:30:00.049483 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kjbfj_7ddd5104-3112-413e-b908-2b7f336b41f1/bond-cni-plugin/0.log" Jan 20 19:30:00 crc kubenswrapper[4773]: I0120 19:30:00.056640 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kjbfj_7ddd5104-3112-413e-b908-2b7f336b41f1/routeoverride-cni/0.log" Jan 20 19:30:00 crc kubenswrapper[4773]: I0120 19:30:00.061761 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kjbfj_7ddd5104-3112-413e-b908-2b7f336b41f1/whereabouts-cni-bincopy/0.log" Jan 20 19:30:00 crc kubenswrapper[4773]: I0120 19:30:00.068561 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-kjbfj_7ddd5104-3112-413e-b908-2b7f336b41f1/whereabouts-cni/0.log" Jan 20 19:30:00 crc kubenswrapper[4773]: I0120 19:30:00.101081 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-admission-controller-857f4d67dd-x6fwb_deccf4fe-9230-4e96-b16c-a2ed0d2235a7/multus-admission-controller/0.log" Jan 20 19:30:00 crc kubenswrapper[4773]: I0120 19:30:00.113869 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-admission-controller-857f4d67dd-x6fwb_deccf4fe-9230-4e96-b16c-a2ed0d2235a7/kube-rbac-proxy/0.log" Jan 20 19:30:00 crc kubenswrapper[4773]: I0120 19:30:00.145219 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29482290-wn7c9"] Jan 20 19:30:00 crc kubenswrapper[4773]: E0120 19:30:00.145697 4773 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="520f286f-ac9f-40aa-939b-2a4cd53ebbd0" containerName="registry-server" Jan 20 19:30:00 crc kubenswrapper[4773]: I0120 19:30:00.145721 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="520f286f-ac9f-40aa-939b-2a4cd53ebbd0" containerName="registry-server" Jan 20 19:30:00 crc kubenswrapper[4773]: E0120 19:30:00.145739 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="520f286f-ac9f-40aa-939b-2a4cd53ebbd0" containerName="extract-utilities" Jan 20 19:30:00 crc kubenswrapper[4773]: I0120 19:30:00.145746 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="520f286f-ac9f-40aa-939b-2a4cd53ebbd0" containerName="extract-utilities" Jan 20 19:30:00 crc kubenswrapper[4773]: E0120 19:30:00.145759 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="520f286f-ac9f-40aa-939b-2a4cd53ebbd0" containerName="extract-content" Jan 20 19:30:00 crc kubenswrapper[4773]: I0120 19:30:00.145765 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="520f286f-ac9f-40aa-939b-2a4cd53ebbd0" containerName="extract-content" Jan 20 19:30:00 crc kubenswrapper[4773]: I0120 19:30:00.145998 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="520f286f-ac9f-40aa-939b-2a4cd53ebbd0" containerName="registry-server" Jan 20 19:30:00 crc kubenswrapper[4773]: I0120 19:30:00.146758 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29482290-wn7c9" Jan 20 19:30:00 crc kubenswrapper[4773]: I0120 19:30:00.148922 4773 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 20 19:30:00 crc kubenswrapper[4773]: I0120 19:30:00.149734 4773 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 20 19:30:00 crc kubenswrapper[4773]: I0120 19:30:00.156564 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29482290-wn7c9"] Jan 20 19:30:00 crc kubenswrapper[4773]: I0120 19:30:00.178230 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a79f9bd1-bbc7-4506-b585-7f152b5f73f6-secret-volume\") pod \"collect-profiles-29482290-wn7c9\" (UID: \"a79f9bd1-bbc7-4506-b585-7f152b5f73f6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482290-wn7c9" Jan 20 19:30:00 crc kubenswrapper[4773]: I0120 19:30:00.178288 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a79f9bd1-bbc7-4506-b585-7f152b5f73f6-config-volume\") pod \"collect-profiles-29482290-wn7c9\" (UID: \"a79f9bd1-bbc7-4506-b585-7f152b5f73f6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482290-wn7c9" Jan 20 19:30:00 crc kubenswrapper[4773]: I0120 19:30:00.178515 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nctvg\" (UniqueName: \"kubernetes.io/projected/a79f9bd1-bbc7-4506-b585-7f152b5f73f6-kube-api-access-nctvg\") pod \"collect-profiles-29482290-wn7c9\" (UID: \"a79f9bd1-bbc7-4506-b585-7f152b5f73f6\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29482290-wn7c9" Jan 20 19:30:00 crc kubenswrapper[4773]: I0120 19:30:00.184715 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bccxn_061a607e-1868-4fcf-b3ea-d51157511d41/kube-multus/2.log" Jan 20 19:30:00 crc kubenswrapper[4773]: I0120 19:30:00.269810 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bccxn_061a607e-1868-4fcf-b3ea-d51157511d41/kube-multus/3.log" Jan 20 19:30:00 crc kubenswrapper[4773]: I0120 19:30:00.280383 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a79f9bd1-bbc7-4506-b585-7f152b5f73f6-secret-volume\") pod \"collect-profiles-29482290-wn7c9\" (UID: \"a79f9bd1-bbc7-4506-b585-7f152b5f73f6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482290-wn7c9" Jan 20 19:30:00 crc kubenswrapper[4773]: I0120 19:30:00.280437 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a79f9bd1-bbc7-4506-b585-7f152b5f73f6-config-volume\") pod \"collect-profiles-29482290-wn7c9\" (UID: \"a79f9bd1-bbc7-4506-b585-7f152b5f73f6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482290-wn7c9" Jan 20 19:30:00 crc kubenswrapper[4773]: I0120 19:30:00.280527 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nctvg\" (UniqueName: \"kubernetes.io/projected/a79f9bd1-bbc7-4506-b585-7f152b5f73f6-kube-api-access-nctvg\") pod \"collect-profiles-29482290-wn7c9\" (UID: \"a79f9bd1-bbc7-4506-b585-7f152b5f73f6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482290-wn7c9" Jan 20 19:30:00 crc kubenswrapper[4773]: I0120 19:30:00.281299 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/a79f9bd1-bbc7-4506-b585-7f152b5f73f6-config-volume\") pod \"collect-profiles-29482290-wn7c9\" (UID: \"a79f9bd1-bbc7-4506-b585-7f152b5f73f6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482290-wn7c9" Jan 20 19:30:00 crc kubenswrapper[4773]: I0120 19:30:00.286271 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a79f9bd1-bbc7-4506-b585-7f152b5f73f6-secret-volume\") pod \"collect-profiles-29482290-wn7c9\" (UID: \"a79f9bd1-bbc7-4506-b585-7f152b5f73f6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482290-wn7c9" Jan 20 19:30:00 crc kubenswrapper[4773]: I0120 19:30:00.294900 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-4jpbd_3791c4b7-dcef-470d-a67e-a2c0bb004436/network-metrics-daemon/0.log" Jan 20 19:30:00 crc kubenswrapper[4773]: I0120 19:30:00.300673 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-4jpbd_3791c4b7-dcef-470d-a67e-a2c0bb004436/kube-rbac-proxy/0.log" Jan 20 19:30:00 crc kubenswrapper[4773]: I0120 19:30:00.302604 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nctvg\" (UniqueName: \"kubernetes.io/projected/a79f9bd1-bbc7-4506-b585-7f152b5f73f6-kube-api-access-nctvg\") pod \"collect-profiles-29482290-wn7c9\" (UID: \"a79f9bd1-bbc7-4506-b585-7f152b5f73f6\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29482290-wn7c9" Jan 20 19:30:00 crc kubenswrapper[4773]: I0120 19:30:00.470226 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29482290-wn7c9" Jan 20 19:30:00 crc kubenswrapper[4773]: I0120 19:30:00.891204 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29482290-wn7c9"] Jan 20 19:30:01 crc kubenswrapper[4773]: I0120 19:30:01.611730 4773 generic.go:334] "Generic (PLEG): container finished" podID="a79f9bd1-bbc7-4506-b585-7f152b5f73f6" containerID="42fb582633ba9007174eeae5e881990f39239a506ecbe079eebae42539c1b6a9" exitCode=0 Jan 20 19:30:01 crc kubenswrapper[4773]: I0120 19:30:01.612003 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29482290-wn7c9" event={"ID":"a79f9bd1-bbc7-4506-b585-7f152b5f73f6","Type":"ContainerDied","Data":"42fb582633ba9007174eeae5e881990f39239a506ecbe079eebae42539c1b6a9"} Jan 20 19:30:01 crc kubenswrapper[4773]: I0120 19:30:01.612063 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29482290-wn7c9" event={"ID":"a79f9bd1-bbc7-4506-b585-7f152b5f73f6","Type":"ContainerStarted","Data":"3135d289fe27a5514b5fed26d3f258e738419a2758c2552f386fe8f9e5775839"} Jan 20 19:30:02 crc kubenswrapper[4773]: I0120 19:30:02.916229 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29482290-wn7c9" Jan 20 19:30:03 crc kubenswrapper[4773]: I0120 19:30:03.037176 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nctvg\" (UniqueName: \"kubernetes.io/projected/a79f9bd1-bbc7-4506-b585-7f152b5f73f6-kube-api-access-nctvg\") pod \"a79f9bd1-bbc7-4506-b585-7f152b5f73f6\" (UID: \"a79f9bd1-bbc7-4506-b585-7f152b5f73f6\") " Jan 20 19:30:03 crc kubenswrapper[4773]: I0120 19:30:03.037406 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a79f9bd1-bbc7-4506-b585-7f152b5f73f6-config-volume\") pod \"a79f9bd1-bbc7-4506-b585-7f152b5f73f6\" (UID: \"a79f9bd1-bbc7-4506-b585-7f152b5f73f6\") " Jan 20 19:30:03 crc kubenswrapper[4773]: I0120 19:30:03.037482 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a79f9bd1-bbc7-4506-b585-7f152b5f73f6-secret-volume\") pod \"a79f9bd1-bbc7-4506-b585-7f152b5f73f6\" (UID: \"a79f9bd1-bbc7-4506-b585-7f152b5f73f6\") " Jan 20 19:30:03 crc kubenswrapper[4773]: I0120 19:30:03.038168 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a79f9bd1-bbc7-4506-b585-7f152b5f73f6-config-volume" (OuterVolumeSpecName: "config-volume") pod "a79f9bd1-bbc7-4506-b585-7f152b5f73f6" (UID: "a79f9bd1-bbc7-4506-b585-7f152b5f73f6"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 20 19:30:03 crc kubenswrapper[4773]: I0120 19:30:03.043668 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a79f9bd1-bbc7-4506-b585-7f152b5f73f6-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "a79f9bd1-bbc7-4506-b585-7f152b5f73f6" (UID: "a79f9bd1-bbc7-4506-b585-7f152b5f73f6"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 20 19:30:03 crc kubenswrapper[4773]: I0120 19:30:03.044384 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a79f9bd1-bbc7-4506-b585-7f152b5f73f6-kube-api-access-nctvg" (OuterVolumeSpecName: "kube-api-access-nctvg") pod "a79f9bd1-bbc7-4506-b585-7f152b5f73f6" (UID: "a79f9bd1-bbc7-4506-b585-7f152b5f73f6"). InnerVolumeSpecName "kube-api-access-nctvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 19:30:03 crc kubenswrapper[4773]: I0120 19:30:03.139814 4773 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/a79f9bd1-bbc7-4506-b585-7f152b5f73f6-config-volume\") on node \"crc\" DevicePath \"\"" Jan 20 19:30:03 crc kubenswrapper[4773]: I0120 19:30:03.139844 4773 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/a79f9bd1-bbc7-4506-b585-7f152b5f73f6-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 20 19:30:03 crc kubenswrapper[4773]: I0120 19:30:03.139854 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nctvg\" (UniqueName: \"kubernetes.io/projected/a79f9bd1-bbc7-4506-b585-7f152b5f73f6-kube-api-access-nctvg\") on node \"crc\" DevicePath \"\"" Jan 20 19:30:03 crc kubenswrapper[4773]: I0120 19:30:03.630865 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29482290-wn7c9" event={"ID":"a79f9bd1-bbc7-4506-b585-7f152b5f73f6","Type":"ContainerDied","Data":"3135d289fe27a5514b5fed26d3f258e738419a2758c2552f386fe8f9e5775839"} Jan 20 19:30:03 crc kubenswrapper[4773]: I0120 19:30:03.631275 4773 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3135d289fe27a5514b5fed26d3f258e738419a2758c2552f386fe8f9e5775839" Jan 20 19:30:03 crc kubenswrapper[4773]: I0120 19:30:03.630914 4773 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29482290-wn7c9" Jan 20 19:30:03 crc kubenswrapper[4773]: I0120 19:30:03.995720 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29482245-g5jp8"] Jan 20 19:30:04 crc kubenswrapper[4773]: I0120 19:30:04.004303 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29482245-g5jp8"] Jan 20 19:30:05 crc kubenswrapper[4773]: I0120 19:30:05.462776 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a10b40f1-a7af-4ef6-ac5d-104e09a494d9" path="/var/lib/kubelet/pods/a10b40f1-a7af-4ef6-ac5d-104e09a494d9/volumes" Jan 20 19:30:16 crc kubenswrapper[4773]: I0120 19:30:16.034178 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-create-5vk9g"] Jan 20 19:30:16 crc kubenswrapper[4773]: I0120 19:30:16.052270 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-c4ba-account-create-update-47gmh"] Jan 20 19:30:16 crc kubenswrapper[4773]: I0120 19:30:16.064557 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-c4ba-account-create-update-47gmh"] Jan 20 19:30:16 crc kubenswrapper[4773]: I0120 19:30:16.074643 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-create-5vk9g"] Jan 20 19:30:17 crc kubenswrapper[4773]: I0120 19:30:17.466528 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33fb43c4-3e2a-4034-b8aa-27c2a7e4acb6" path="/var/lib/kubelet/pods/33fb43c4-3e2a-4034-b8aa-27c2a7e4acb6/volumes" Jan 20 19:30:17 crc kubenswrapper[4773]: I0120 19:30:17.467773 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80285eae-2998-47ab-bcd6-e9905e2e71d4" path="/var/lib/kubelet/pods/80285eae-2998-47ab-bcd6-e9905e2e71d4/volumes" Jan 20 19:30:21 crc kubenswrapper[4773]: I0120 19:30:21.009426 4773 scope.go:117] 
"RemoveContainer" containerID="572888ce06611feefec987174ae3eee5c299fc9038199817efe0a94604e5aae9" Jan 20 19:30:21 crc kubenswrapper[4773]: I0120 19:30:21.039338 4773 scope.go:117] "RemoveContainer" containerID="0827af644398a715247b27083b551541f42dae2b0a3150620bd9104ee37e5138" Jan 20 19:30:21 crc kubenswrapper[4773]: I0120 19:30:21.094112 4773 scope.go:117] "RemoveContainer" containerID="5831be469a4fe2b76e1bccd6344f54cbf800b2124dcc48460a5c9ae662bb240a" Jan 20 19:30:28 crc kubenswrapper[4773]: I0120 19:30:28.171226 4773 patch_prober.go:28] interesting pod/machine-config-daemon-sq4x7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 19:30:28 crc kubenswrapper[4773]: I0120 19:30:28.171881 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 19:30:28 crc kubenswrapper[4773]: I0120 19:30:28.171986 4773 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" Jan 20 19:30:28 crc kubenswrapper[4773]: I0120 19:30:28.172880 4773 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"66d18e7e7c4332943cab7b2f123da4a39ef6b1eb85671f155f90330f5704072d"} pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 20 19:30:28 crc kubenswrapper[4773]: I0120 19:30:28.173044 4773 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" containerID="cri-o://66d18e7e7c4332943cab7b2f123da4a39ef6b1eb85671f155f90330f5704072d" gracePeriod=600 Jan 20 19:30:28 crc kubenswrapper[4773]: E0120 19:30:28.296613 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:30:28 crc kubenswrapper[4773]: I0120 19:30:28.844889 4773 generic.go:334] "Generic (PLEG): container finished" podID="1ddd934f-f012-4083-b5e6-b99711071621" containerID="66d18e7e7c4332943cab7b2f123da4a39ef6b1eb85671f155f90330f5704072d" exitCode=0 Jan 20 19:30:28 crc kubenswrapper[4773]: I0120 19:30:28.844980 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" event={"ID":"1ddd934f-f012-4083-b5e6-b99711071621","Type":"ContainerDied","Data":"66d18e7e7c4332943cab7b2f123da4a39ef6b1eb85671f155f90330f5704072d"} Jan 20 19:30:28 crc kubenswrapper[4773]: I0120 19:30:28.845052 4773 scope.go:117] "RemoveContainer" containerID="f6b3b5c728dd60bac57f1335c6421f37527b148c607e8336779f2aa1fd55ae02" Jan 20 19:30:28 crc kubenswrapper[4773]: I0120 19:30:28.845757 4773 scope.go:117] "RemoveContainer" containerID="66d18e7e7c4332943cab7b2f123da4a39ef6b1eb85671f155f90330f5704072d" Jan 20 19:30:28 crc kubenswrapper[4773]: E0120 19:30:28.846096 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:30:36 crc kubenswrapper[4773]: I0120 19:30:36.060148 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/manila-db-sync-r2zvz"] Jan 20 19:30:36 crc kubenswrapper[4773]: I0120 19:30:36.069997 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/manila-db-sync-r2zvz"] Jan 20 19:30:37 crc kubenswrapper[4773]: I0120 19:30:37.479682 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32b245ce-84e1-4fbc-adef-ebfdd1e88d77" path="/var/lib/kubelet/pods/32b245ce-84e1-4fbc-adef-ebfdd1e88d77/volumes" Jan 20 19:30:42 crc kubenswrapper[4773]: I0120 19:30:42.448125 4773 scope.go:117] "RemoveContainer" containerID="66d18e7e7c4332943cab7b2f123da4a39ef6b1eb85671f155f90330f5704072d" Jan 20 19:30:42 crc kubenswrapper[4773]: E0120 19:30:42.448972 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:30:54 crc kubenswrapper[4773]: I0120 19:30:54.447877 4773 scope.go:117] "RemoveContainer" containerID="66d18e7e7c4332943cab7b2f123da4a39ef6b1eb85671f155f90330f5704072d" Jan 20 19:30:54 crc kubenswrapper[4773]: E0120 19:30:54.449214 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:31:07 crc kubenswrapper[4773]: I0120 19:31:07.452460 4773 scope.go:117] "RemoveContainer" containerID="66d18e7e7c4332943cab7b2f123da4a39ef6b1eb85671f155f90330f5704072d" Jan 20 19:31:07 crc kubenswrapper[4773]: E0120 19:31:07.453241 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:31:21 crc kubenswrapper[4773]: I0120 19:31:21.223731 4773 scope.go:117] "RemoveContainer" containerID="4557e82493f1d248bc3cabcdf01505516b90cb0e92c1b3e5eff438a70402a241" Jan 20 19:31:21 crc kubenswrapper[4773]: I0120 19:31:21.447165 4773 scope.go:117] "RemoveContainer" containerID="66d18e7e7c4332943cab7b2f123da4a39ef6b1eb85671f155f90330f5704072d" Jan 20 19:31:21 crc kubenswrapper[4773]: E0120 19:31:21.447600 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:31:33 crc kubenswrapper[4773]: I0120 19:31:33.447560 4773 scope.go:117] "RemoveContainer" containerID="66d18e7e7c4332943cab7b2f123da4a39ef6b1eb85671f155f90330f5704072d" Jan 20 19:31:33 crc kubenswrapper[4773]: E0120 19:31:33.448337 4773 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:31:45 crc kubenswrapper[4773]: I0120 19:31:45.450962 4773 scope.go:117] "RemoveContainer" containerID="66d18e7e7c4332943cab7b2f123da4a39ef6b1eb85671f155f90330f5704072d" Jan 20 19:31:45 crc kubenswrapper[4773]: E0120 19:31:45.452744 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:31:57 crc kubenswrapper[4773]: I0120 19:31:57.459187 4773 scope.go:117] "RemoveContainer" containerID="66d18e7e7c4332943cab7b2f123da4a39ef6b1eb85671f155f90330f5704072d" Jan 20 19:31:57 crc kubenswrapper[4773]: E0120 19:31:57.460056 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:32:11 crc kubenswrapper[4773]: I0120 19:32:11.451634 4773 scope.go:117] "RemoveContainer" containerID="66d18e7e7c4332943cab7b2f123da4a39ef6b1eb85671f155f90330f5704072d" Jan 20 19:32:11 crc kubenswrapper[4773]: E0120 
19:32:11.453672 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:32:24 crc kubenswrapper[4773]: I0120 19:32:24.447494 4773 scope.go:117] "RemoveContainer" containerID="66d18e7e7c4332943cab7b2f123da4a39ef6b1eb85671f155f90330f5704072d" Jan 20 19:32:24 crc kubenswrapper[4773]: E0120 19:32:24.448289 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:32:38 crc kubenswrapper[4773]: I0120 19:32:38.446804 4773 scope.go:117] "RemoveContainer" containerID="66d18e7e7c4332943cab7b2f123da4a39ef6b1eb85671f155f90330f5704072d" Jan 20 19:32:38 crc kubenswrapper[4773]: E0120 19:32:38.447645 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:32:52 crc kubenswrapper[4773]: I0120 19:32:52.447428 4773 scope.go:117] "RemoveContainer" containerID="66d18e7e7c4332943cab7b2f123da4a39ef6b1eb85671f155f90330f5704072d" Jan 20 19:32:52 crc 
kubenswrapper[4773]: E0120 19:32:52.448550 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:33:04 crc kubenswrapper[4773]: I0120 19:33:04.448600 4773 scope.go:117] "RemoveContainer" containerID="66d18e7e7c4332943cab7b2f123da4a39ef6b1eb85671f155f90330f5704072d" Jan 20 19:33:04 crc kubenswrapper[4773]: E0120 19:33:04.450315 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:33:15 crc kubenswrapper[4773]: I0120 19:33:15.452481 4773 scope.go:117] "RemoveContainer" containerID="66d18e7e7c4332943cab7b2f123da4a39ef6b1eb85671f155f90330f5704072d" Jan 20 19:33:15 crc kubenswrapper[4773]: E0120 19:33:15.453170 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:33:21 crc kubenswrapper[4773]: I0120 19:33:21.299407 4773 scope.go:117] "RemoveContainer" containerID="850d3bec4e1a476b5a9d345935605675d1c6c1592f339240f8819f6c39afef82" Jan 
20 19:33:30 crc kubenswrapper[4773]: I0120 19:33:30.449082 4773 scope.go:117] "RemoveContainer" containerID="66d18e7e7c4332943cab7b2f123da4a39ef6b1eb85671f155f90330f5704072d" Jan 20 19:33:30 crc kubenswrapper[4773]: E0120 19:33:30.450543 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:33:44 crc kubenswrapper[4773]: I0120 19:33:44.448110 4773 scope.go:117] "RemoveContainer" containerID="66d18e7e7c4332943cab7b2f123da4a39ef6b1eb85671f155f90330f5704072d" Jan 20 19:33:44 crc kubenswrapper[4773]: E0120 19:33:44.448853 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:33:59 crc kubenswrapper[4773]: I0120 19:33:59.448290 4773 scope.go:117] "RemoveContainer" containerID="66d18e7e7c4332943cab7b2f123da4a39ef6b1eb85671f155f90330f5704072d" Jan 20 19:33:59 crc kubenswrapper[4773]: E0120 19:33:59.449240 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" 
podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:34:12 crc kubenswrapper[4773]: I0120 19:34:12.447426 4773 scope.go:117] "RemoveContainer" containerID="66d18e7e7c4332943cab7b2f123da4a39ef6b1eb85671f155f90330f5704072d" Jan 20 19:34:12 crc kubenswrapper[4773]: E0120 19:34:12.448263 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:34:23 crc kubenswrapper[4773]: I0120 19:34:23.451725 4773 scope.go:117] "RemoveContainer" containerID="66d18e7e7c4332943cab7b2f123da4a39ef6b1eb85671f155f90330f5704072d" Jan 20 19:34:23 crc kubenswrapper[4773]: E0120 19:34:23.452704 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:34:38 crc kubenswrapper[4773]: I0120 19:34:38.448614 4773 scope.go:117] "RemoveContainer" containerID="66d18e7e7c4332943cab7b2f123da4a39ef6b1eb85671f155f90330f5704072d" Jan 20 19:34:38 crc kubenswrapper[4773]: E0120 19:34:38.449563 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:34:52 crc kubenswrapper[4773]: I0120 19:34:52.447331 4773 scope.go:117] "RemoveContainer" containerID="66d18e7e7c4332943cab7b2f123da4a39ef6b1eb85671f155f90330f5704072d" Jan 20 19:34:52 crc kubenswrapper[4773]: E0120 19:34:52.448110 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:35:03 crc kubenswrapper[4773]: I0120 19:35:03.447128 4773 scope.go:117] "RemoveContainer" containerID="66d18e7e7c4332943cab7b2f123da4a39ef6b1eb85671f155f90330f5704072d" Jan 20 19:35:03 crc kubenswrapper[4773]: E0120 19:35:03.447912 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:35:13 crc kubenswrapper[4773]: I0120 19:35:13.576320 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-qp7pd"] Jan 20 19:35:13 crc kubenswrapper[4773]: E0120 19:35:13.577223 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a79f9bd1-bbc7-4506-b585-7f152b5f73f6" containerName="collect-profiles" Jan 20 19:35:13 crc kubenswrapper[4773]: I0120 19:35:13.577235 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="a79f9bd1-bbc7-4506-b585-7f152b5f73f6" 
containerName="collect-profiles" Jan 20 19:35:13 crc kubenswrapper[4773]: I0120 19:35:13.578273 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="a79f9bd1-bbc7-4506-b585-7f152b5f73f6" containerName="collect-profiles" Jan 20 19:35:13 crc kubenswrapper[4773]: I0120 19:35:13.585531 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qp7pd" Jan 20 19:35:13 crc kubenswrapper[4773]: I0120 19:35:13.622057 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qp7pd"] Jan 20 19:35:13 crc kubenswrapper[4773]: I0120 19:35:13.722793 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a43a4c2-7d24-4a4b-909b-d76ce51ffb5d-catalog-content\") pod \"certified-operators-qp7pd\" (UID: \"4a43a4c2-7d24-4a4b-909b-d76ce51ffb5d\") " pod="openshift-marketplace/certified-operators-qp7pd" Jan 20 19:35:13 crc kubenswrapper[4773]: I0120 19:35:13.723181 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a43a4c2-7d24-4a4b-909b-d76ce51ffb5d-utilities\") pod \"certified-operators-qp7pd\" (UID: \"4a43a4c2-7d24-4a4b-909b-d76ce51ffb5d\") " pod="openshift-marketplace/certified-operators-qp7pd" Jan 20 19:35:13 crc kubenswrapper[4773]: I0120 19:35:13.723376 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkhxv\" (UniqueName: \"kubernetes.io/projected/4a43a4c2-7d24-4a4b-909b-d76ce51ffb5d-kube-api-access-fkhxv\") pod \"certified-operators-qp7pd\" (UID: \"4a43a4c2-7d24-4a4b-909b-d76ce51ffb5d\") " pod="openshift-marketplace/certified-operators-qp7pd" Jan 20 19:35:13 crc kubenswrapper[4773]: I0120 19:35:13.825215 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a43a4c2-7d24-4a4b-909b-d76ce51ffb5d-catalog-content\") pod \"certified-operators-qp7pd\" (UID: \"4a43a4c2-7d24-4a4b-909b-d76ce51ffb5d\") " pod="openshift-marketplace/certified-operators-qp7pd" Jan 20 19:35:13 crc kubenswrapper[4773]: I0120 19:35:13.825281 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a43a4c2-7d24-4a4b-909b-d76ce51ffb5d-utilities\") pod \"certified-operators-qp7pd\" (UID: \"4a43a4c2-7d24-4a4b-909b-d76ce51ffb5d\") " pod="openshift-marketplace/certified-operators-qp7pd" Jan 20 19:35:13 crc kubenswrapper[4773]: I0120 19:35:13.825360 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fkhxv\" (UniqueName: \"kubernetes.io/projected/4a43a4c2-7d24-4a4b-909b-d76ce51ffb5d-kube-api-access-fkhxv\") pod \"certified-operators-qp7pd\" (UID: \"4a43a4c2-7d24-4a4b-909b-d76ce51ffb5d\") " pod="openshift-marketplace/certified-operators-qp7pd" Jan 20 19:35:13 crc kubenswrapper[4773]: I0120 19:35:13.825898 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a43a4c2-7d24-4a4b-909b-d76ce51ffb5d-utilities\") pod \"certified-operators-qp7pd\" (UID: \"4a43a4c2-7d24-4a4b-909b-d76ce51ffb5d\") " pod="openshift-marketplace/certified-operators-qp7pd" Jan 20 19:35:13 crc kubenswrapper[4773]: I0120 19:35:13.825978 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a43a4c2-7d24-4a4b-909b-d76ce51ffb5d-catalog-content\") pod \"certified-operators-qp7pd\" (UID: \"4a43a4c2-7d24-4a4b-909b-d76ce51ffb5d\") " pod="openshift-marketplace/certified-operators-qp7pd" Jan 20 19:35:13 crc kubenswrapper[4773]: I0120 19:35:13.843880 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkhxv\" (UniqueName: 
\"kubernetes.io/projected/4a43a4c2-7d24-4a4b-909b-d76ce51ffb5d-kube-api-access-fkhxv\") pod \"certified-operators-qp7pd\" (UID: \"4a43a4c2-7d24-4a4b-909b-d76ce51ffb5d\") " pod="openshift-marketplace/certified-operators-qp7pd" Jan 20 19:35:13 crc kubenswrapper[4773]: I0120 19:35:13.927552 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qp7pd" Jan 20 19:35:14 crc kubenswrapper[4773]: I0120 19:35:14.427823 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-qp7pd"] Jan 20 19:35:15 crc kubenswrapper[4773]: I0120 19:35:15.395109 4773 generic.go:334] "Generic (PLEG): container finished" podID="4a43a4c2-7d24-4a4b-909b-d76ce51ffb5d" containerID="dfedefb1f2a6d344ae9a92e11b99db9b59ace9f07351e6abd3f7360019095be7" exitCode=0 Jan 20 19:35:15 crc kubenswrapper[4773]: I0120 19:35:15.395469 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qp7pd" event={"ID":"4a43a4c2-7d24-4a4b-909b-d76ce51ffb5d","Type":"ContainerDied","Data":"dfedefb1f2a6d344ae9a92e11b99db9b59ace9f07351e6abd3f7360019095be7"} Jan 20 19:35:15 crc kubenswrapper[4773]: I0120 19:35:15.395497 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qp7pd" event={"ID":"4a43a4c2-7d24-4a4b-909b-d76ce51ffb5d","Type":"ContainerStarted","Data":"7411dc2283de4bdc0a3861974c303bfbfd255e4c35e2114ad37c24e4e22985e5"} Jan 20 19:35:15 crc kubenswrapper[4773]: I0120 19:35:15.397398 4773 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 20 19:35:16 crc kubenswrapper[4773]: I0120 19:35:16.943198 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-dp4ts"] Jan 20 19:35:16 crc kubenswrapper[4773]: I0120 19:35:16.945395 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dp4ts" Jan 20 19:35:16 crc kubenswrapper[4773]: I0120 19:35:16.972206 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dp4ts"] Jan 20 19:35:17 crc kubenswrapper[4773]: I0120 19:35:17.117146 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48vvw\" (UniqueName: \"kubernetes.io/projected/df0ac612-5a13-44eb-942e-52b7fa9a9c2f-kube-api-access-48vvw\") pod \"redhat-operators-dp4ts\" (UID: \"df0ac612-5a13-44eb-942e-52b7fa9a9c2f\") " pod="openshift-marketplace/redhat-operators-dp4ts" Jan 20 19:35:17 crc kubenswrapper[4773]: I0120 19:35:17.117552 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df0ac612-5a13-44eb-942e-52b7fa9a9c2f-catalog-content\") pod \"redhat-operators-dp4ts\" (UID: \"df0ac612-5a13-44eb-942e-52b7fa9a9c2f\") " pod="openshift-marketplace/redhat-operators-dp4ts" Jan 20 19:35:17 crc kubenswrapper[4773]: I0120 19:35:17.117627 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df0ac612-5a13-44eb-942e-52b7fa9a9c2f-utilities\") pod \"redhat-operators-dp4ts\" (UID: \"df0ac612-5a13-44eb-942e-52b7fa9a9c2f\") " pod="openshift-marketplace/redhat-operators-dp4ts" Jan 20 19:35:17 crc kubenswrapper[4773]: I0120 19:35:17.219463 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df0ac612-5a13-44eb-942e-52b7fa9a9c2f-utilities\") pod \"redhat-operators-dp4ts\" (UID: \"df0ac612-5a13-44eb-942e-52b7fa9a9c2f\") " pod="openshift-marketplace/redhat-operators-dp4ts" Jan 20 19:35:17 crc kubenswrapper[4773]: I0120 19:35:17.219652 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-48vvw\" (UniqueName: \"kubernetes.io/projected/df0ac612-5a13-44eb-942e-52b7fa9a9c2f-kube-api-access-48vvw\") pod \"redhat-operators-dp4ts\" (UID: \"df0ac612-5a13-44eb-942e-52b7fa9a9c2f\") " pod="openshift-marketplace/redhat-operators-dp4ts" Jan 20 19:35:17 crc kubenswrapper[4773]: I0120 19:35:17.219704 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df0ac612-5a13-44eb-942e-52b7fa9a9c2f-catalog-content\") pod \"redhat-operators-dp4ts\" (UID: \"df0ac612-5a13-44eb-942e-52b7fa9a9c2f\") " pod="openshift-marketplace/redhat-operators-dp4ts" Jan 20 19:35:17 crc kubenswrapper[4773]: I0120 19:35:17.219958 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df0ac612-5a13-44eb-942e-52b7fa9a9c2f-utilities\") pod \"redhat-operators-dp4ts\" (UID: \"df0ac612-5a13-44eb-942e-52b7fa9a9c2f\") " pod="openshift-marketplace/redhat-operators-dp4ts" Jan 20 19:35:17 crc kubenswrapper[4773]: I0120 19:35:17.220194 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df0ac612-5a13-44eb-942e-52b7fa9a9c2f-catalog-content\") pod \"redhat-operators-dp4ts\" (UID: \"df0ac612-5a13-44eb-942e-52b7fa9a9c2f\") " pod="openshift-marketplace/redhat-operators-dp4ts" Jan 20 19:35:17 crc kubenswrapper[4773]: I0120 19:35:17.249831 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48vvw\" (UniqueName: \"kubernetes.io/projected/df0ac612-5a13-44eb-942e-52b7fa9a9c2f-kube-api-access-48vvw\") pod \"redhat-operators-dp4ts\" (UID: \"df0ac612-5a13-44eb-942e-52b7fa9a9c2f\") " pod="openshift-marketplace/redhat-operators-dp4ts" Jan 20 19:35:17 crc kubenswrapper[4773]: I0120 19:35:17.284375 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dp4ts" Jan 20 19:35:17 crc kubenswrapper[4773]: I0120 19:35:17.464156 4773 scope.go:117] "RemoveContainer" containerID="66d18e7e7c4332943cab7b2f123da4a39ef6b1eb85671f155f90330f5704072d" Jan 20 19:35:17 crc kubenswrapper[4773]: E0120 19:35:17.465161 4773 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-sq4x7_openshift-machine-config-operator(1ddd934f-f012-4083-b5e6-b99711071621)\"" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" Jan 20 19:35:17 crc kubenswrapper[4773]: I0120 19:35:17.764719 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-dp4ts"] Jan 20 19:35:17 crc kubenswrapper[4773]: W0120 19:35:17.776248 4773 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf0ac612_5a13_44eb_942e_52b7fa9a9c2f.slice/crio-4452faa1f02afd5b8103f739c37459ef156f0df638830012fb688d0aa82c94ac WatchSource:0}: Error finding container 4452faa1f02afd5b8103f739c37459ef156f0df638830012fb688d0aa82c94ac: Status 404 returned error can't find the container with id 4452faa1f02afd5b8103f739c37459ef156f0df638830012fb688d0aa82c94ac Jan 20 19:35:17 crc kubenswrapper[4773]: I0120 19:35:17.951048 4773 generic.go:334] "Generic (PLEG): container finished" podID="4a43a4c2-7d24-4a4b-909b-d76ce51ffb5d" containerID="ee47d143c37695c34a3387ac7d9321a8b5ac45b9879181de6be66de940f67326" exitCode=0 Jan 20 19:35:17 crc kubenswrapper[4773]: I0120 19:35:17.951222 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qp7pd" 
event={"ID":"4a43a4c2-7d24-4a4b-909b-d76ce51ffb5d","Type":"ContainerDied","Data":"ee47d143c37695c34a3387ac7d9321a8b5ac45b9879181de6be66de940f67326"} Jan 20 19:35:17 crc kubenswrapper[4773]: I0120 19:35:17.959641 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dp4ts" event={"ID":"df0ac612-5a13-44eb-942e-52b7fa9a9c2f","Type":"ContainerStarted","Data":"4452faa1f02afd5b8103f739c37459ef156f0df638830012fb688d0aa82c94ac"} Jan 20 19:35:18 crc kubenswrapper[4773]: I0120 19:35:18.967514 4773 generic.go:334] "Generic (PLEG): container finished" podID="df0ac612-5a13-44eb-942e-52b7fa9a9c2f" containerID="0c1ea80a6f8b107d807343262bda371fa48f5aea4dfa0633e57c9d2efce9051b" exitCode=0 Jan 20 19:35:18 crc kubenswrapper[4773]: I0120 19:35:18.967576 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dp4ts" event={"ID":"df0ac612-5a13-44eb-942e-52b7fa9a9c2f","Type":"ContainerDied","Data":"0c1ea80a6f8b107d807343262bda371fa48f5aea4dfa0633e57c9d2efce9051b"} Jan 20 19:35:18 crc kubenswrapper[4773]: I0120 19:35:18.971156 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qp7pd" event={"ID":"4a43a4c2-7d24-4a4b-909b-d76ce51ffb5d","Type":"ContainerStarted","Data":"9bf1ee2a5bfc139ae5d04c01fe978c6b195b9c7edf12f7f52de1bd69c23d8229"} Jan 20 19:35:19 crc kubenswrapper[4773]: I0120 19:35:19.017740 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-qp7pd" podStartSLOduration=2.983356832 podStartE2EDuration="6.017674644s" podCreationTimestamp="2026-01-20 19:35:13 +0000 UTC" firstStartedPulling="2026-01-20 19:35:15.396951585 +0000 UTC m=+3908.318764609" lastFinishedPulling="2026-01-20 19:35:18.431269397 +0000 UTC m=+3911.353082421" observedRunningTime="2026-01-20 19:35:19.012569622 +0000 UTC m=+3911.934382646" watchObservedRunningTime="2026-01-20 19:35:19.017674644 +0000 UTC 
m=+3911.939487688" Jan 20 19:35:20 crc kubenswrapper[4773]: I0120 19:35:20.989996 4773 generic.go:334] "Generic (PLEG): container finished" podID="df0ac612-5a13-44eb-942e-52b7fa9a9c2f" containerID="7123892ffd2df6da370c6bb3ca2780bba5a6f91ab5039669f36b9a1563a6267c" exitCode=0 Jan 20 19:35:20 crc kubenswrapper[4773]: I0120 19:35:20.990126 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dp4ts" event={"ID":"df0ac612-5a13-44eb-942e-52b7fa9a9c2f","Type":"ContainerDied","Data":"7123892ffd2df6da370c6bb3ca2780bba5a6f91ab5039669f36b9a1563a6267c"} Jan 20 19:35:22 crc kubenswrapper[4773]: I0120 19:35:22.000879 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dp4ts" event={"ID":"df0ac612-5a13-44eb-942e-52b7fa9a9c2f","Type":"ContainerStarted","Data":"e6b267982ba2d17da833cbb3d02ae946b942a2776395ed2f158c758672c69578"} Jan 20 19:35:22 crc kubenswrapper[4773]: I0120 19:35:22.033411 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-dp4ts" podStartSLOduration=3.5885728070000003 podStartE2EDuration="6.033387478s" podCreationTimestamp="2026-01-20 19:35:16 +0000 UTC" firstStartedPulling="2026-01-20 19:35:18.96939571 +0000 UTC m=+3911.891208734" lastFinishedPulling="2026-01-20 19:35:21.414210381 +0000 UTC m=+3914.336023405" observedRunningTime="2026-01-20 19:35:22.027795363 +0000 UTC m=+3914.949608387" watchObservedRunningTime="2026-01-20 19:35:22.033387478 +0000 UTC m=+3914.955200502" Jan 20 19:35:23 crc kubenswrapper[4773]: I0120 19:35:23.927689 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-qp7pd" Jan 20 19:35:23 crc kubenswrapper[4773]: I0120 19:35:23.929068 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-qp7pd" Jan 20 19:35:23 crc kubenswrapper[4773]: I0120 19:35:23.978516 4773 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-qp7pd" Jan 20 19:35:24 crc kubenswrapper[4773]: I0120 19:35:24.059317 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-qp7pd" Jan 20 19:35:27 crc kubenswrapper[4773]: I0120 19:35:27.284500 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-dp4ts" Jan 20 19:35:27 crc kubenswrapper[4773]: I0120 19:35:27.286123 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-dp4ts" Jan 20 19:35:27 crc kubenswrapper[4773]: I0120 19:35:27.330426 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-dp4ts" Jan 20 19:35:28 crc kubenswrapper[4773]: I0120 19:35:28.089955 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-dp4ts" Jan 20 19:35:28 crc kubenswrapper[4773]: I0120 19:35:28.337969 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qp7pd"] Jan 20 19:35:28 crc kubenswrapper[4773]: I0120 19:35:28.338197 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-qp7pd" podUID="4a43a4c2-7d24-4a4b-909b-d76ce51ffb5d" containerName="registry-server" containerID="cri-o://9bf1ee2a5bfc139ae5d04c01fe978c6b195b9c7edf12f7f52de1bd69c23d8229" gracePeriod=2 Jan 20 19:35:30 crc kubenswrapper[4773]: I0120 19:35:30.649867 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-qp7pd" Jan 20 19:35:30 crc kubenswrapper[4773]: I0120 19:35:30.699063 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a43a4c2-7d24-4a4b-909b-d76ce51ffb5d-catalog-content\") pod \"4a43a4c2-7d24-4a4b-909b-d76ce51ffb5d\" (UID: \"4a43a4c2-7d24-4a4b-909b-d76ce51ffb5d\") " Jan 20 19:35:30 crc kubenswrapper[4773]: I0120 19:35:30.699240 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a43a4c2-7d24-4a4b-909b-d76ce51ffb5d-utilities\") pod \"4a43a4c2-7d24-4a4b-909b-d76ce51ffb5d\" (UID: \"4a43a4c2-7d24-4a4b-909b-d76ce51ffb5d\") " Jan 20 19:35:30 crc kubenswrapper[4773]: I0120 19:35:30.699272 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fkhxv\" (UniqueName: \"kubernetes.io/projected/4a43a4c2-7d24-4a4b-909b-d76ce51ffb5d-kube-api-access-fkhxv\") pod \"4a43a4c2-7d24-4a4b-909b-d76ce51ffb5d\" (UID: \"4a43a4c2-7d24-4a4b-909b-d76ce51ffb5d\") " Jan 20 19:35:30 crc kubenswrapper[4773]: I0120 19:35:30.700049 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a43a4c2-7d24-4a4b-909b-d76ce51ffb5d-utilities" (OuterVolumeSpecName: "utilities") pod "4a43a4c2-7d24-4a4b-909b-d76ce51ffb5d" (UID: "4a43a4c2-7d24-4a4b-909b-d76ce51ffb5d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 19:35:30 crc kubenswrapper[4773]: I0120 19:35:30.705311 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a43a4c2-7d24-4a4b-909b-d76ce51ffb5d-kube-api-access-fkhxv" (OuterVolumeSpecName: "kube-api-access-fkhxv") pod "4a43a4c2-7d24-4a4b-909b-d76ce51ffb5d" (UID: "4a43a4c2-7d24-4a4b-909b-d76ce51ffb5d"). InnerVolumeSpecName "kube-api-access-fkhxv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 19:35:30 crc kubenswrapper[4773]: I0120 19:35:30.742539 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4a43a4c2-7d24-4a4b-909b-d76ce51ffb5d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4a43a4c2-7d24-4a4b-909b-d76ce51ffb5d" (UID: "4a43a4c2-7d24-4a4b-909b-d76ce51ffb5d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 19:35:30 crc kubenswrapper[4773]: I0120 19:35:30.803006 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4a43a4c2-7d24-4a4b-909b-d76ce51ffb5d-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 20 19:35:30 crc kubenswrapper[4773]: I0120 19:35:30.803485 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4a43a4c2-7d24-4a4b-909b-d76ce51ffb5d-utilities\") on node \"crc\" DevicePath \"\"" Jan 20 19:35:30 crc kubenswrapper[4773]: I0120 19:35:30.803498 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fkhxv\" (UniqueName: \"kubernetes.io/projected/4a43a4c2-7d24-4a4b-909b-d76ce51ffb5d-kube-api-access-fkhxv\") on node \"crc\" DevicePath \"\"" Jan 20 19:35:31 crc kubenswrapper[4773]: I0120 19:35:31.095122 4773 generic.go:334] "Generic (PLEG): container finished" podID="4a43a4c2-7d24-4a4b-909b-d76ce51ffb5d" containerID="9bf1ee2a5bfc139ae5d04c01fe978c6b195b9c7edf12f7f52de1bd69c23d8229" exitCode=0 Jan 20 19:35:31 crc kubenswrapper[4773]: I0120 19:35:31.095184 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-qp7pd" event={"ID":"4a43a4c2-7d24-4a4b-909b-d76ce51ffb5d","Type":"ContainerDied","Data":"9bf1ee2a5bfc139ae5d04c01fe978c6b195b9c7edf12f7f52de1bd69c23d8229"} Jan 20 19:35:31 crc kubenswrapper[4773]: I0120 19:35:31.095221 4773 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-marketplace/certified-operators-qp7pd" event={"ID":"4a43a4c2-7d24-4a4b-909b-d76ce51ffb5d","Type":"ContainerDied","Data":"7411dc2283de4bdc0a3861974c303bfbfd255e4c35e2114ad37c24e4e22985e5"} Jan 20 19:35:31 crc kubenswrapper[4773]: I0120 19:35:31.095244 4773 scope.go:117] "RemoveContainer" containerID="9bf1ee2a5bfc139ae5d04c01fe978c6b195b9c7edf12f7f52de1bd69c23d8229" Jan 20 19:35:31 crc kubenswrapper[4773]: I0120 19:35:31.095280 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-qp7pd" Jan 20 19:35:31 crc kubenswrapper[4773]: I0120 19:35:31.132369 4773 scope.go:117] "RemoveContainer" containerID="ee47d143c37695c34a3387ac7d9321a8b5ac45b9879181de6be66de940f67326" Jan 20 19:35:31 crc kubenswrapper[4773]: I0120 19:35:31.134686 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-qp7pd"] Jan 20 19:35:31 crc kubenswrapper[4773]: I0120 19:35:31.142677 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-qp7pd"] Jan 20 19:35:31 crc kubenswrapper[4773]: I0120 19:35:31.170973 4773 scope.go:117] "RemoveContainer" containerID="dfedefb1f2a6d344ae9a92e11b99db9b59ace9f07351e6abd3f7360019095be7" Jan 20 19:35:31 crc kubenswrapper[4773]: I0120 19:35:31.175256 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dp4ts"] Jan 20 19:35:31 crc kubenswrapper[4773]: I0120 19:35:31.175574 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-dp4ts" podUID="df0ac612-5a13-44eb-942e-52b7fa9a9c2f" containerName="registry-server" containerID="cri-o://e6b267982ba2d17da833cbb3d02ae946b942a2776395ed2f158c758672c69578" gracePeriod=2 Jan 20 19:35:31 crc kubenswrapper[4773]: I0120 19:35:31.215028 4773 scope.go:117] "RemoveContainer" 
containerID="9bf1ee2a5bfc139ae5d04c01fe978c6b195b9c7edf12f7f52de1bd69c23d8229" Jan 20 19:35:31 crc kubenswrapper[4773]: E0120 19:35:31.216866 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9bf1ee2a5bfc139ae5d04c01fe978c6b195b9c7edf12f7f52de1bd69c23d8229\": container with ID starting with 9bf1ee2a5bfc139ae5d04c01fe978c6b195b9c7edf12f7f52de1bd69c23d8229 not found: ID does not exist" containerID="9bf1ee2a5bfc139ae5d04c01fe978c6b195b9c7edf12f7f52de1bd69c23d8229" Jan 20 19:35:31 crc kubenswrapper[4773]: I0120 19:35:31.216917 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9bf1ee2a5bfc139ae5d04c01fe978c6b195b9c7edf12f7f52de1bd69c23d8229"} err="failed to get container status \"9bf1ee2a5bfc139ae5d04c01fe978c6b195b9c7edf12f7f52de1bd69c23d8229\": rpc error: code = NotFound desc = could not find container \"9bf1ee2a5bfc139ae5d04c01fe978c6b195b9c7edf12f7f52de1bd69c23d8229\": container with ID starting with 9bf1ee2a5bfc139ae5d04c01fe978c6b195b9c7edf12f7f52de1bd69c23d8229 not found: ID does not exist" Jan 20 19:35:31 crc kubenswrapper[4773]: I0120 19:35:31.217017 4773 scope.go:117] "RemoveContainer" containerID="ee47d143c37695c34a3387ac7d9321a8b5ac45b9879181de6be66de940f67326" Jan 20 19:35:31 crc kubenswrapper[4773]: E0120 19:35:31.217888 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee47d143c37695c34a3387ac7d9321a8b5ac45b9879181de6be66de940f67326\": container with ID starting with ee47d143c37695c34a3387ac7d9321a8b5ac45b9879181de6be66de940f67326 not found: ID does not exist" containerID="ee47d143c37695c34a3387ac7d9321a8b5ac45b9879181de6be66de940f67326" Jan 20 19:35:31 crc kubenswrapper[4773]: I0120 19:35:31.217917 4773 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ee47d143c37695c34a3387ac7d9321a8b5ac45b9879181de6be66de940f67326"} err="failed to get container status \"ee47d143c37695c34a3387ac7d9321a8b5ac45b9879181de6be66de940f67326\": rpc error: code = NotFound desc = could not find container \"ee47d143c37695c34a3387ac7d9321a8b5ac45b9879181de6be66de940f67326\": container with ID starting with ee47d143c37695c34a3387ac7d9321a8b5ac45b9879181de6be66de940f67326 not found: ID does not exist" Jan 20 19:35:31 crc kubenswrapper[4773]: I0120 19:35:31.217954 4773 scope.go:117] "RemoveContainer" containerID="dfedefb1f2a6d344ae9a92e11b99db9b59ace9f07351e6abd3f7360019095be7" Jan 20 19:35:31 crc kubenswrapper[4773]: E0120 19:35:31.220067 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dfedefb1f2a6d344ae9a92e11b99db9b59ace9f07351e6abd3f7360019095be7\": container with ID starting with dfedefb1f2a6d344ae9a92e11b99db9b59ace9f07351e6abd3f7360019095be7 not found: ID does not exist" containerID="dfedefb1f2a6d344ae9a92e11b99db9b59ace9f07351e6abd3f7360019095be7" Jan 20 19:35:31 crc kubenswrapper[4773]: I0120 19:35:31.220109 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dfedefb1f2a6d344ae9a92e11b99db9b59ace9f07351e6abd3f7360019095be7"} err="failed to get container status \"dfedefb1f2a6d344ae9a92e11b99db9b59ace9f07351e6abd3f7360019095be7\": rpc error: code = NotFound desc = could not find container \"dfedefb1f2a6d344ae9a92e11b99db9b59ace9f07351e6abd3f7360019095be7\": container with ID starting with dfedefb1f2a6d344ae9a92e11b99db9b59ace9f07351e6abd3f7360019095be7 not found: ID does not exist" Jan 20 19:35:31 crc kubenswrapper[4773]: I0120 19:35:31.466348 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4a43a4c2-7d24-4a4b-909b-d76ce51ffb5d" path="/var/lib/kubelet/pods/4a43a4c2-7d24-4a4b-909b-d76ce51ffb5d/volumes" Jan 20 19:35:31 crc kubenswrapper[4773]: I0120 
19:35:31.664361 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-dp4ts" Jan 20 19:35:31 crc kubenswrapper[4773]: I0120 19:35:31.738508 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df0ac612-5a13-44eb-942e-52b7fa9a9c2f-catalog-content\") pod \"df0ac612-5a13-44eb-942e-52b7fa9a9c2f\" (UID: \"df0ac612-5a13-44eb-942e-52b7fa9a9c2f\") " Jan 20 19:35:31 crc kubenswrapper[4773]: I0120 19:35:31.738642 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-48vvw\" (UniqueName: \"kubernetes.io/projected/df0ac612-5a13-44eb-942e-52b7fa9a9c2f-kube-api-access-48vvw\") pod \"df0ac612-5a13-44eb-942e-52b7fa9a9c2f\" (UID: \"df0ac612-5a13-44eb-942e-52b7fa9a9c2f\") " Jan 20 19:35:31 crc kubenswrapper[4773]: I0120 19:35:31.739541 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df0ac612-5a13-44eb-942e-52b7fa9a9c2f-utilities\") pod \"df0ac612-5a13-44eb-942e-52b7fa9a9c2f\" (UID: \"df0ac612-5a13-44eb-942e-52b7fa9a9c2f\") " Jan 20 19:35:31 crc kubenswrapper[4773]: I0120 19:35:31.740940 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df0ac612-5a13-44eb-942e-52b7fa9a9c2f-utilities" (OuterVolumeSpecName: "utilities") pod "df0ac612-5a13-44eb-942e-52b7fa9a9c2f" (UID: "df0ac612-5a13-44eb-942e-52b7fa9a9c2f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 19:35:31 crc kubenswrapper[4773]: I0120 19:35:31.743990 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df0ac612-5a13-44eb-942e-52b7fa9a9c2f-kube-api-access-48vvw" (OuterVolumeSpecName: "kube-api-access-48vvw") pod "df0ac612-5a13-44eb-942e-52b7fa9a9c2f" (UID: "df0ac612-5a13-44eb-942e-52b7fa9a9c2f"). InnerVolumeSpecName "kube-api-access-48vvw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 19:35:31 crc kubenswrapper[4773]: I0120 19:35:31.841581 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-48vvw\" (UniqueName: \"kubernetes.io/projected/df0ac612-5a13-44eb-942e-52b7fa9a9c2f-kube-api-access-48vvw\") on node \"crc\" DevicePath \"\"" Jan 20 19:35:31 crc kubenswrapper[4773]: I0120 19:35:31.841621 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df0ac612-5a13-44eb-942e-52b7fa9a9c2f-utilities\") on node \"crc\" DevicePath \"\"" Jan 20 19:35:31 crc kubenswrapper[4773]: I0120 19:35:31.855008 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df0ac612-5a13-44eb-942e-52b7fa9a9c2f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "df0ac612-5a13-44eb-942e-52b7fa9a9c2f" (UID: "df0ac612-5a13-44eb-942e-52b7fa9a9c2f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 19:35:31 crc kubenswrapper[4773]: I0120 19:35:31.943226 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df0ac612-5a13-44eb-942e-52b7fa9a9c2f-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 20 19:35:32 crc kubenswrapper[4773]: I0120 19:35:32.106605 4773 generic.go:334] "Generic (PLEG): container finished" podID="df0ac612-5a13-44eb-942e-52b7fa9a9c2f" containerID="e6b267982ba2d17da833cbb3d02ae946b942a2776395ed2f158c758672c69578" exitCode=0 Jan 20 19:35:32 crc kubenswrapper[4773]: I0120 19:35:32.106682 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dp4ts" event={"ID":"df0ac612-5a13-44eb-942e-52b7fa9a9c2f","Type":"ContainerDied","Data":"e6b267982ba2d17da833cbb3d02ae946b942a2776395ed2f158c758672c69578"} Jan 20 19:35:32 crc kubenswrapper[4773]: I0120 19:35:32.106986 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-dp4ts" event={"ID":"df0ac612-5a13-44eb-942e-52b7fa9a9c2f","Type":"ContainerDied","Data":"4452faa1f02afd5b8103f739c37459ef156f0df638830012fb688d0aa82c94ac"} Jan 20 19:35:32 crc kubenswrapper[4773]: I0120 19:35:32.106719 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-dp4ts" Jan 20 19:35:32 crc kubenswrapper[4773]: I0120 19:35:32.107008 4773 scope.go:117] "RemoveContainer" containerID="e6b267982ba2d17da833cbb3d02ae946b942a2776395ed2f158c758672c69578" Jan 20 19:35:32 crc kubenswrapper[4773]: I0120 19:35:32.131514 4773 scope.go:117] "RemoveContainer" containerID="7123892ffd2df6da370c6bb3ca2780bba5a6f91ab5039669f36b9a1563a6267c" Jan 20 19:35:32 crc kubenswrapper[4773]: I0120 19:35:32.140023 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-dp4ts"] Jan 20 19:35:32 crc kubenswrapper[4773]: I0120 19:35:32.148671 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-dp4ts"] Jan 20 19:35:32 crc kubenswrapper[4773]: I0120 19:35:32.177006 4773 scope.go:117] "RemoveContainer" containerID="0c1ea80a6f8b107d807343262bda371fa48f5aea4dfa0633e57c9d2efce9051b" Jan 20 19:35:32 crc kubenswrapper[4773]: I0120 19:35:32.216966 4773 scope.go:117] "RemoveContainer" containerID="e6b267982ba2d17da833cbb3d02ae946b942a2776395ed2f158c758672c69578" Jan 20 19:35:32 crc kubenswrapper[4773]: E0120 19:35:32.217487 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6b267982ba2d17da833cbb3d02ae946b942a2776395ed2f158c758672c69578\": container with ID starting with e6b267982ba2d17da833cbb3d02ae946b942a2776395ed2f158c758672c69578 not found: ID does not exist" containerID="e6b267982ba2d17da833cbb3d02ae946b942a2776395ed2f158c758672c69578" Jan 20 19:35:32 crc kubenswrapper[4773]: I0120 19:35:32.217519 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6b267982ba2d17da833cbb3d02ae946b942a2776395ed2f158c758672c69578"} err="failed to get container status \"e6b267982ba2d17da833cbb3d02ae946b942a2776395ed2f158c758672c69578\": rpc error: code = NotFound desc = could not find container 
\"e6b267982ba2d17da833cbb3d02ae946b942a2776395ed2f158c758672c69578\": container with ID starting with e6b267982ba2d17da833cbb3d02ae946b942a2776395ed2f158c758672c69578 not found: ID does not exist" Jan 20 19:35:32 crc kubenswrapper[4773]: I0120 19:35:32.217542 4773 scope.go:117] "RemoveContainer" containerID="7123892ffd2df6da370c6bb3ca2780bba5a6f91ab5039669f36b9a1563a6267c" Jan 20 19:35:32 crc kubenswrapper[4773]: E0120 19:35:32.217947 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7123892ffd2df6da370c6bb3ca2780bba5a6f91ab5039669f36b9a1563a6267c\": container with ID starting with 7123892ffd2df6da370c6bb3ca2780bba5a6f91ab5039669f36b9a1563a6267c not found: ID does not exist" containerID="7123892ffd2df6da370c6bb3ca2780bba5a6f91ab5039669f36b9a1563a6267c" Jan 20 19:35:32 crc kubenswrapper[4773]: I0120 19:35:32.217995 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7123892ffd2df6da370c6bb3ca2780bba5a6f91ab5039669f36b9a1563a6267c"} err="failed to get container status \"7123892ffd2df6da370c6bb3ca2780bba5a6f91ab5039669f36b9a1563a6267c\": rpc error: code = NotFound desc = could not find container \"7123892ffd2df6da370c6bb3ca2780bba5a6f91ab5039669f36b9a1563a6267c\": container with ID starting with 7123892ffd2df6da370c6bb3ca2780bba5a6f91ab5039669f36b9a1563a6267c not found: ID does not exist" Jan 20 19:35:32 crc kubenswrapper[4773]: I0120 19:35:32.218021 4773 scope.go:117] "RemoveContainer" containerID="0c1ea80a6f8b107d807343262bda371fa48f5aea4dfa0633e57c9d2efce9051b" Jan 20 19:35:32 crc kubenswrapper[4773]: E0120 19:35:32.218481 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c1ea80a6f8b107d807343262bda371fa48f5aea4dfa0633e57c9d2efce9051b\": container with ID starting with 0c1ea80a6f8b107d807343262bda371fa48f5aea4dfa0633e57c9d2efce9051b not found: ID does not exist" 
containerID="0c1ea80a6f8b107d807343262bda371fa48f5aea4dfa0633e57c9d2efce9051b" Jan 20 19:35:32 crc kubenswrapper[4773]: I0120 19:35:32.218506 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c1ea80a6f8b107d807343262bda371fa48f5aea4dfa0633e57c9d2efce9051b"} err="failed to get container status \"0c1ea80a6f8b107d807343262bda371fa48f5aea4dfa0633e57c9d2efce9051b\": rpc error: code = NotFound desc = could not find container \"0c1ea80a6f8b107d807343262bda371fa48f5aea4dfa0633e57c9d2efce9051b\": container with ID starting with 0c1ea80a6f8b107d807343262bda371fa48f5aea4dfa0633e57c9d2efce9051b not found: ID does not exist" Jan 20 19:35:32 crc kubenswrapper[4773]: I0120 19:35:32.447437 4773 scope.go:117] "RemoveContainer" containerID="66d18e7e7c4332943cab7b2f123da4a39ef6b1eb85671f155f90330f5704072d" Jan 20 19:35:33 crc kubenswrapper[4773]: I0120 19:35:33.119977 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" event={"ID":"1ddd934f-f012-4083-b5e6-b99711071621","Type":"ContainerStarted","Data":"7dcc60451ddecf419e7915aabde6a72a57b21629b8930630e02b7a298f4e2162"} Jan 20 19:35:33 crc kubenswrapper[4773]: I0120 19:35:33.457806 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df0ac612-5a13-44eb-942e-52b7fa9a9c2f" path="/var/lib/kubelet/pods/df0ac612-5a13-44eb-942e-52b7fa9a9c2f/volumes" Jan 20 19:35:35 crc kubenswrapper[4773]: E0120 19:35:35.470981 4773 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.39:45582->38.102.83.39:34695: write tcp 38.102.83.39:45582->38.102.83.39:34695: write: connection reset by peer Jan 20 19:36:26 crc kubenswrapper[4773]: I0120 19:36:26.565380 4773 generic.go:334] "Generic (PLEG): container finished" podID="ae725e5b-de4d-443b-bd8c-985abdcb0f87" containerID="2838c7cb0decc5fc8472fb9e866280ec0ba3b00d1dcfff4d6fbc29ac9ec5f124" exitCode=0 Jan 20 19:36:26 crc 
kubenswrapper[4773]: I0120 19:36:26.565428 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-mzdmf/must-gather-lp22t" event={"ID":"ae725e5b-de4d-443b-bd8c-985abdcb0f87","Type":"ContainerDied","Data":"2838c7cb0decc5fc8472fb9e866280ec0ba3b00d1dcfff4d6fbc29ac9ec5f124"} Jan 20 19:36:26 crc kubenswrapper[4773]: I0120 19:36:26.568089 4773 scope.go:117] "RemoveContainer" containerID="2838c7cb0decc5fc8472fb9e866280ec0ba3b00d1dcfff4d6fbc29ac9ec5f124" Jan 20 19:36:27 crc kubenswrapper[4773]: I0120 19:36:27.283159 4773 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-9nkd6"] Jan 20 19:36:27 crc kubenswrapper[4773]: E0120 19:36:27.283791 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a43a4c2-7d24-4a4b-909b-d76ce51ffb5d" containerName="registry-server" Jan 20 19:36:27 crc kubenswrapper[4773]: I0120 19:36:27.283860 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a43a4c2-7d24-4a4b-909b-d76ce51ffb5d" containerName="registry-server" Jan 20 19:36:27 crc kubenswrapper[4773]: E0120 19:36:27.283925 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df0ac612-5a13-44eb-942e-52b7fa9a9c2f" containerName="extract-content" Jan 20 19:36:27 crc kubenswrapper[4773]: I0120 19:36:27.284003 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="df0ac612-5a13-44eb-942e-52b7fa9a9c2f" containerName="extract-content" Jan 20 19:36:27 crc kubenswrapper[4773]: E0120 19:36:27.284076 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df0ac612-5a13-44eb-942e-52b7fa9a9c2f" containerName="registry-server" Jan 20 19:36:27 crc kubenswrapper[4773]: I0120 19:36:27.284129 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="df0ac612-5a13-44eb-942e-52b7fa9a9c2f" containerName="registry-server" Jan 20 19:36:27 crc kubenswrapper[4773]: E0120 19:36:27.284183 4773 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="4a43a4c2-7d24-4a4b-909b-d76ce51ffb5d" containerName="extract-utilities" Jan 20 19:36:27 crc kubenswrapper[4773]: I0120 19:36:27.284236 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a43a4c2-7d24-4a4b-909b-d76ce51ffb5d" containerName="extract-utilities" Jan 20 19:36:27 crc kubenswrapper[4773]: E0120 19:36:27.284289 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a43a4c2-7d24-4a4b-909b-d76ce51ffb5d" containerName="extract-content" Jan 20 19:36:27 crc kubenswrapper[4773]: I0120 19:36:27.284339 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a43a4c2-7d24-4a4b-909b-d76ce51ffb5d" containerName="extract-content" Jan 20 19:36:27 crc kubenswrapper[4773]: E0120 19:36:27.284400 4773 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df0ac612-5a13-44eb-942e-52b7fa9a9c2f" containerName="extract-utilities" Jan 20 19:36:27 crc kubenswrapper[4773]: I0120 19:36:27.284475 4773 state_mem.go:107] "Deleted CPUSet assignment" podUID="df0ac612-5a13-44eb-942e-52b7fa9a9c2f" containerName="extract-utilities" Jan 20 19:36:27 crc kubenswrapper[4773]: I0120 19:36:27.284692 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a43a4c2-7d24-4a4b-909b-d76ce51ffb5d" containerName="registry-server" Jan 20 19:36:27 crc kubenswrapper[4773]: I0120 19:36:27.284764 4773 memory_manager.go:354] "RemoveStaleState removing state" podUID="df0ac612-5a13-44eb-942e-52b7fa9a9c2f" containerName="registry-server" Jan 20 19:36:27 crc kubenswrapper[4773]: I0120 19:36:27.286138 4773 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9nkd6" Jan 20 19:36:27 crc kubenswrapper[4773]: I0120 19:36:27.297397 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9nkd6"] Jan 20 19:36:27 crc kubenswrapper[4773]: I0120 19:36:27.427994 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-mzdmf_must-gather-lp22t_ae725e5b-de4d-443b-bd8c-985abdcb0f87/gather/0.log" Jan 20 19:36:27 crc kubenswrapper[4773]: I0120 19:36:27.457332 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0866c773-75be-4641-b2d6-74b9944abe6d-utilities\") pod \"redhat-marketplace-9nkd6\" (UID: \"0866c773-75be-4641-b2d6-74b9944abe6d\") " pod="openshift-marketplace/redhat-marketplace-9nkd6" Jan 20 19:36:27 crc kubenswrapper[4773]: I0120 19:36:27.457867 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0866c773-75be-4641-b2d6-74b9944abe6d-catalog-content\") pod \"redhat-marketplace-9nkd6\" (UID: \"0866c773-75be-4641-b2d6-74b9944abe6d\") " pod="openshift-marketplace/redhat-marketplace-9nkd6" Jan 20 19:36:27 crc kubenswrapper[4773]: I0120 19:36:27.458343 4773 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8dgwq\" (UniqueName: \"kubernetes.io/projected/0866c773-75be-4641-b2d6-74b9944abe6d-kube-api-access-8dgwq\") pod \"redhat-marketplace-9nkd6\" (UID: \"0866c773-75be-4641-b2d6-74b9944abe6d\") " pod="openshift-marketplace/redhat-marketplace-9nkd6" Jan 20 19:36:27 crc kubenswrapper[4773]: I0120 19:36:27.560542 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8dgwq\" (UniqueName: \"kubernetes.io/projected/0866c773-75be-4641-b2d6-74b9944abe6d-kube-api-access-8dgwq\") pod 
\"redhat-marketplace-9nkd6\" (UID: \"0866c773-75be-4641-b2d6-74b9944abe6d\") " pod="openshift-marketplace/redhat-marketplace-9nkd6" Jan 20 19:36:27 crc kubenswrapper[4773]: I0120 19:36:27.561075 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0866c773-75be-4641-b2d6-74b9944abe6d-utilities\") pod \"redhat-marketplace-9nkd6\" (UID: \"0866c773-75be-4641-b2d6-74b9944abe6d\") " pod="openshift-marketplace/redhat-marketplace-9nkd6" Jan 20 19:36:27 crc kubenswrapper[4773]: I0120 19:36:27.561625 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0866c773-75be-4641-b2d6-74b9944abe6d-utilities\") pod \"redhat-marketplace-9nkd6\" (UID: \"0866c773-75be-4641-b2d6-74b9944abe6d\") " pod="openshift-marketplace/redhat-marketplace-9nkd6" Jan 20 19:36:27 crc kubenswrapper[4773]: I0120 19:36:27.561771 4773 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0866c773-75be-4641-b2d6-74b9944abe6d-catalog-content\") pod \"redhat-marketplace-9nkd6\" (UID: \"0866c773-75be-4641-b2d6-74b9944abe6d\") " pod="openshift-marketplace/redhat-marketplace-9nkd6" Jan 20 19:36:27 crc kubenswrapper[4773]: I0120 19:36:27.562181 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0866c773-75be-4641-b2d6-74b9944abe6d-catalog-content\") pod \"redhat-marketplace-9nkd6\" (UID: \"0866c773-75be-4641-b2d6-74b9944abe6d\") " pod="openshift-marketplace/redhat-marketplace-9nkd6" Jan 20 19:36:27 crc kubenswrapper[4773]: I0120 19:36:27.582584 4773 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8dgwq\" (UniqueName: \"kubernetes.io/projected/0866c773-75be-4641-b2d6-74b9944abe6d-kube-api-access-8dgwq\") pod \"redhat-marketplace-9nkd6\" (UID: 
\"0866c773-75be-4641-b2d6-74b9944abe6d\") " pod="openshift-marketplace/redhat-marketplace-9nkd6" Jan 20 19:36:27 crc kubenswrapper[4773]: I0120 19:36:27.661371 4773 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9nkd6" Jan 20 19:36:28 crc kubenswrapper[4773]: I0120 19:36:28.167556 4773 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9nkd6"] Jan 20 19:36:28 crc kubenswrapper[4773]: I0120 19:36:28.584486 4773 generic.go:334] "Generic (PLEG): container finished" podID="0866c773-75be-4641-b2d6-74b9944abe6d" containerID="a3c94f222036723845053dc89a7a4a5061b377fb9543dff77c4876be54325bf5" exitCode=0 Jan 20 19:36:28 crc kubenswrapper[4773]: I0120 19:36:28.584639 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9nkd6" event={"ID":"0866c773-75be-4641-b2d6-74b9944abe6d","Type":"ContainerDied","Data":"a3c94f222036723845053dc89a7a4a5061b377fb9543dff77c4876be54325bf5"} Jan 20 19:36:28 crc kubenswrapper[4773]: I0120 19:36:28.584764 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9nkd6" event={"ID":"0866c773-75be-4641-b2d6-74b9944abe6d","Type":"ContainerStarted","Data":"09c45e26a48800d21ad8108a306a81ecbff994d1af4c00904479b9c9116d1432"} Jan 20 19:36:30 crc kubenswrapper[4773]: I0120 19:36:30.608697 4773 generic.go:334] "Generic (PLEG): container finished" podID="0866c773-75be-4641-b2d6-74b9944abe6d" containerID="3f63a8ad512b3a71cea688446b12b8b263039e3e81e4869c8c560abee28eb2d8" exitCode=0 Jan 20 19:36:30 crc kubenswrapper[4773]: I0120 19:36:30.608759 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9nkd6" event={"ID":"0866c773-75be-4641-b2d6-74b9944abe6d","Type":"ContainerDied","Data":"3f63a8ad512b3a71cea688446b12b8b263039e3e81e4869c8c560abee28eb2d8"} Jan 20 19:36:31 crc kubenswrapper[4773]: I0120 19:36:31.652313 
4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9nkd6" event={"ID":"0866c773-75be-4641-b2d6-74b9944abe6d","Type":"ContainerStarted","Data":"c263b7a26cd69b5c42e688b3c266d9f20fca80577c9850dde8e2c467743d8e8e"} Jan 20 19:36:31 crc kubenswrapper[4773]: I0120 19:36:31.673892 4773 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-9nkd6" podStartSLOduration=2.060796981 podStartE2EDuration="4.673868518s" podCreationTimestamp="2026-01-20 19:36:27 +0000 UTC" firstStartedPulling="2026-01-20 19:36:28.586492778 +0000 UTC m=+3981.508305802" lastFinishedPulling="2026-01-20 19:36:31.199564315 +0000 UTC m=+3984.121377339" observedRunningTime="2026-01-20 19:36:31.668580481 +0000 UTC m=+3984.590393505" watchObservedRunningTime="2026-01-20 19:36:31.673868518 +0000 UTC m=+3984.595681542" Jan 20 19:36:36 crc kubenswrapper[4773]: I0120 19:36:36.046300 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-mzdmf/must-gather-lp22t"] Jan 20 19:36:36 crc kubenswrapper[4773]: I0120 19:36:36.047429 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-mzdmf/must-gather-lp22t" podUID="ae725e5b-de4d-443b-bd8c-985abdcb0f87" containerName="copy" containerID="cri-o://1169edc90b360255fb7f1e948c61eca241b70cd1e7e780dbf8cfe336a4692b95" gracePeriod=2 Jan 20 19:36:36 crc kubenswrapper[4773]: I0120 19:36:36.058039 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-mzdmf/must-gather-lp22t"] Jan 20 19:36:36 crc kubenswrapper[4773]: I0120 19:36:36.497176 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-mzdmf_must-gather-lp22t_ae725e5b-de4d-443b-bd8c-985abdcb0f87/copy/0.log" Jan 20 19:36:36 crc kubenswrapper[4773]: I0120 19:36:36.497944 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-mzdmf/must-gather-lp22t" Jan 20 19:36:36 crc kubenswrapper[4773]: I0120 19:36:36.653615 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ae725e5b-de4d-443b-bd8c-985abdcb0f87-must-gather-output\") pod \"ae725e5b-de4d-443b-bd8c-985abdcb0f87\" (UID: \"ae725e5b-de4d-443b-bd8c-985abdcb0f87\") " Jan 20 19:36:36 crc kubenswrapper[4773]: I0120 19:36:36.653673 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-khbtp\" (UniqueName: \"kubernetes.io/projected/ae725e5b-de4d-443b-bd8c-985abdcb0f87-kube-api-access-khbtp\") pod \"ae725e5b-de4d-443b-bd8c-985abdcb0f87\" (UID: \"ae725e5b-de4d-443b-bd8c-985abdcb0f87\") " Jan 20 19:36:36 crc kubenswrapper[4773]: I0120 19:36:36.663182 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae725e5b-de4d-443b-bd8c-985abdcb0f87-kube-api-access-khbtp" (OuterVolumeSpecName: "kube-api-access-khbtp") pod "ae725e5b-de4d-443b-bd8c-985abdcb0f87" (UID: "ae725e5b-de4d-443b-bd8c-985abdcb0f87"). InnerVolumeSpecName "kube-api-access-khbtp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 19:36:36 crc kubenswrapper[4773]: I0120 19:36:36.690920 4773 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-mzdmf_must-gather-lp22t_ae725e5b-de4d-443b-bd8c-985abdcb0f87/copy/0.log" Jan 20 19:36:36 crc kubenswrapper[4773]: I0120 19:36:36.691670 4773 generic.go:334] "Generic (PLEG): container finished" podID="ae725e5b-de4d-443b-bd8c-985abdcb0f87" containerID="1169edc90b360255fb7f1e948c61eca241b70cd1e7e780dbf8cfe336a4692b95" exitCode=143 Jan 20 19:36:36 crc kubenswrapper[4773]: I0120 19:36:36.691735 4773 scope.go:117] "RemoveContainer" containerID="1169edc90b360255fb7f1e948c61eca241b70cd1e7e780dbf8cfe336a4692b95" Jan 20 19:36:36 crc kubenswrapper[4773]: I0120 19:36:36.691878 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-mzdmf/must-gather-lp22t" Jan 20 19:36:36 crc kubenswrapper[4773]: I0120 19:36:36.712037 4773 scope.go:117] "RemoveContainer" containerID="2838c7cb0decc5fc8472fb9e866280ec0ba3b00d1dcfff4d6fbc29ac9ec5f124" Jan 20 19:36:36 crc kubenswrapper[4773]: I0120 19:36:36.756451 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-khbtp\" (UniqueName: \"kubernetes.io/projected/ae725e5b-de4d-443b-bd8c-985abdcb0f87-kube-api-access-khbtp\") on node \"crc\" DevicePath \"\"" Jan 20 19:36:36 crc kubenswrapper[4773]: I0120 19:36:36.804222 4773 scope.go:117] "RemoveContainer" containerID="1169edc90b360255fb7f1e948c61eca241b70cd1e7e780dbf8cfe336a4692b95" Jan 20 19:36:36 crc kubenswrapper[4773]: E0120 19:36:36.805617 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1169edc90b360255fb7f1e948c61eca241b70cd1e7e780dbf8cfe336a4692b95\": container with ID starting with 1169edc90b360255fb7f1e948c61eca241b70cd1e7e780dbf8cfe336a4692b95 not found: ID does not exist" 
containerID="1169edc90b360255fb7f1e948c61eca241b70cd1e7e780dbf8cfe336a4692b95" Jan 20 19:36:36 crc kubenswrapper[4773]: I0120 19:36:36.805673 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1169edc90b360255fb7f1e948c61eca241b70cd1e7e780dbf8cfe336a4692b95"} err="failed to get container status \"1169edc90b360255fb7f1e948c61eca241b70cd1e7e780dbf8cfe336a4692b95\": rpc error: code = NotFound desc = could not find container \"1169edc90b360255fb7f1e948c61eca241b70cd1e7e780dbf8cfe336a4692b95\": container with ID starting with 1169edc90b360255fb7f1e948c61eca241b70cd1e7e780dbf8cfe336a4692b95 not found: ID does not exist" Jan 20 19:36:36 crc kubenswrapper[4773]: I0120 19:36:36.805709 4773 scope.go:117] "RemoveContainer" containerID="2838c7cb0decc5fc8472fb9e866280ec0ba3b00d1dcfff4d6fbc29ac9ec5f124" Jan 20 19:36:36 crc kubenswrapper[4773]: E0120 19:36:36.806351 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2838c7cb0decc5fc8472fb9e866280ec0ba3b00d1dcfff4d6fbc29ac9ec5f124\": container with ID starting with 2838c7cb0decc5fc8472fb9e866280ec0ba3b00d1dcfff4d6fbc29ac9ec5f124 not found: ID does not exist" containerID="2838c7cb0decc5fc8472fb9e866280ec0ba3b00d1dcfff4d6fbc29ac9ec5f124" Jan 20 19:36:36 crc kubenswrapper[4773]: I0120 19:36:36.806380 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2838c7cb0decc5fc8472fb9e866280ec0ba3b00d1dcfff4d6fbc29ac9ec5f124"} err="failed to get container status \"2838c7cb0decc5fc8472fb9e866280ec0ba3b00d1dcfff4d6fbc29ac9ec5f124\": rpc error: code = NotFound desc = could not find container \"2838c7cb0decc5fc8472fb9e866280ec0ba3b00d1dcfff4d6fbc29ac9ec5f124\": container with ID starting with 2838c7cb0decc5fc8472fb9e866280ec0ba3b00d1dcfff4d6fbc29ac9ec5f124 not found: ID does not exist" Jan 20 19:36:36 crc kubenswrapper[4773]: I0120 19:36:36.858744 4773 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae725e5b-de4d-443b-bd8c-985abdcb0f87-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "ae725e5b-de4d-443b-bd8c-985abdcb0f87" (UID: "ae725e5b-de4d-443b-bd8c-985abdcb0f87"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 19:36:36 crc kubenswrapper[4773]: I0120 19:36:36.960223 4773 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ae725e5b-de4d-443b-bd8c-985abdcb0f87-must-gather-output\") on node \"crc\" DevicePath \"\"" Jan 20 19:36:37 crc kubenswrapper[4773]: I0120 19:36:37.457956 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae725e5b-de4d-443b-bd8c-985abdcb0f87" path="/var/lib/kubelet/pods/ae725e5b-de4d-443b-bd8c-985abdcb0f87/volumes" Jan 20 19:36:37 crc kubenswrapper[4773]: I0120 19:36:37.662381 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-9nkd6" Jan 20 19:36:37 crc kubenswrapper[4773]: I0120 19:36:37.662447 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-9nkd6" Jan 20 19:36:37 crc kubenswrapper[4773]: I0120 19:36:37.709226 4773 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-9nkd6" Jan 20 19:36:37 crc kubenswrapper[4773]: I0120 19:36:37.763417 4773 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-9nkd6" Jan 20 19:36:37 crc kubenswrapper[4773]: I0120 19:36:37.950847 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9nkd6"] Jan 20 19:36:39 crc kubenswrapper[4773]: I0120 19:36:39.716449 4773 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/redhat-marketplace-9nkd6" podUID="0866c773-75be-4641-b2d6-74b9944abe6d" containerName="registry-server" containerID="cri-o://c263b7a26cd69b5c42e688b3c266d9f20fca80577c9850dde8e2c467743d8e8e" gracePeriod=2 Jan 20 19:36:40 crc kubenswrapper[4773]: I0120 19:36:40.251529 4773 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9nkd6" Jan 20 19:36:40 crc kubenswrapper[4773]: I0120 19:36:40.319974 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0866c773-75be-4641-b2d6-74b9944abe6d-utilities\") pod \"0866c773-75be-4641-b2d6-74b9944abe6d\" (UID: \"0866c773-75be-4641-b2d6-74b9944abe6d\") " Jan 20 19:36:40 crc kubenswrapper[4773]: I0120 19:36:40.320301 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0866c773-75be-4641-b2d6-74b9944abe6d-catalog-content\") pod \"0866c773-75be-4641-b2d6-74b9944abe6d\" (UID: \"0866c773-75be-4641-b2d6-74b9944abe6d\") " Jan 20 19:36:40 crc kubenswrapper[4773]: I0120 19:36:40.320337 4773 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8dgwq\" (UniqueName: \"kubernetes.io/projected/0866c773-75be-4641-b2d6-74b9944abe6d-kube-api-access-8dgwq\") pod \"0866c773-75be-4641-b2d6-74b9944abe6d\" (UID: \"0866c773-75be-4641-b2d6-74b9944abe6d\") " Jan 20 19:36:40 crc kubenswrapper[4773]: I0120 19:36:40.328869 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0866c773-75be-4641-b2d6-74b9944abe6d-utilities" (OuterVolumeSpecName: "utilities") pod "0866c773-75be-4641-b2d6-74b9944abe6d" (UID: "0866c773-75be-4641-b2d6-74b9944abe6d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 19:36:40 crc kubenswrapper[4773]: I0120 19:36:40.332246 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0866c773-75be-4641-b2d6-74b9944abe6d-kube-api-access-8dgwq" (OuterVolumeSpecName: "kube-api-access-8dgwq") pod "0866c773-75be-4641-b2d6-74b9944abe6d" (UID: "0866c773-75be-4641-b2d6-74b9944abe6d"). InnerVolumeSpecName "kube-api-access-8dgwq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 20 19:36:40 crc kubenswrapper[4773]: I0120 19:36:40.360532 4773 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0866c773-75be-4641-b2d6-74b9944abe6d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0866c773-75be-4641-b2d6-74b9944abe6d" (UID: "0866c773-75be-4641-b2d6-74b9944abe6d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 20 19:36:40 crc kubenswrapper[4773]: I0120 19:36:40.423148 4773 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0866c773-75be-4641-b2d6-74b9944abe6d-utilities\") on node \"crc\" DevicePath \"\"" Jan 20 19:36:40 crc kubenswrapper[4773]: I0120 19:36:40.423331 4773 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0866c773-75be-4641-b2d6-74b9944abe6d-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 20 19:36:40 crc kubenswrapper[4773]: I0120 19:36:40.423385 4773 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8dgwq\" (UniqueName: \"kubernetes.io/projected/0866c773-75be-4641-b2d6-74b9944abe6d-kube-api-access-8dgwq\") on node \"crc\" DevicePath \"\"" Jan 20 19:36:40 crc kubenswrapper[4773]: I0120 19:36:40.727112 4773 generic.go:334] "Generic (PLEG): container finished" podID="0866c773-75be-4641-b2d6-74b9944abe6d" 
containerID="c263b7a26cd69b5c42e688b3c266d9f20fca80577c9850dde8e2c467743d8e8e" exitCode=0 Jan 20 19:36:40 crc kubenswrapper[4773]: I0120 19:36:40.727158 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9nkd6" event={"ID":"0866c773-75be-4641-b2d6-74b9944abe6d","Type":"ContainerDied","Data":"c263b7a26cd69b5c42e688b3c266d9f20fca80577c9850dde8e2c467743d8e8e"} Jan 20 19:36:40 crc kubenswrapper[4773]: I0120 19:36:40.727186 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9nkd6" event={"ID":"0866c773-75be-4641-b2d6-74b9944abe6d","Type":"ContainerDied","Data":"09c45e26a48800d21ad8108a306a81ecbff994d1af4c00904479b9c9116d1432"} Jan 20 19:36:40 crc kubenswrapper[4773]: I0120 19:36:40.727211 4773 scope.go:117] "RemoveContainer" containerID="c263b7a26cd69b5c42e688b3c266d9f20fca80577c9850dde8e2c467743d8e8e" Jan 20 19:36:40 crc kubenswrapper[4773]: I0120 19:36:40.727220 4773 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9nkd6" Jan 20 19:36:40 crc kubenswrapper[4773]: I0120 19:36:40.754072 4773 scope.go:117] "RemoveContainer" containerID="3f63a8ad512b3a71cea688446b12b8b263039e3e81e4869c8c560abee28eb2d8" Jan 20 19:36:40 crc kubenswrapper[4773]: I0120 19:36:40.763283 4773 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9nkd6"] Jan 20 19:36:40 crc kubenswrapper[4773]: I0120 19:36:40.771920 4773 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-9nkd6"] Jan 20 19:36:40 crc kubenswrapper[4773]: I0120 19:36:40.787195 4773 scope.go:117] "RemoveContainer" containerID="a3c94f222036723845053dc89a7a4a5061b377fb9543dff77c4876be54325bf5" Jan 20 19:36:40 crc kubenswrapper[4773]: I0120 19:36:40.823630 4773 scope.go:117] "RemoveContainer" containerID="c263b7a26cd69b5c42e688b3c266d9f20fca80577c9850dde8e2c467743d8e8e" Jan 20 19:36:40 crc kubenswrapper[4773]: E0120 19:36:40.824328 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c263b7a26cd69b5c42e688b3c266d9f20fca80577c9850dde8e2c467743d8e8e\": container with ID starting with c263b7a26cd69b5c42e688b3c266d9f20fca80577c9850dde8e2c467743d8e8e not found: ID does not exist" containerID="c263b7a26cd69b5c42e688b3c266d9f20fca80577c9850dde8e2c467743d8e8e" Jan 20 19:36:40 crc kubenswrapper[4773]: I0120 19:36:40.824375 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c263b7a26cd69b5c42e688b3c266d9f20fca80577c9850dde8e2c467743d8e8e"} err="failed to get container status \"c263b7a26cd69b5c42e688b3c266d9f20fca80577c9850dde8e2c467743d8e8e\": rpc error: code = NotFound desc = could not find container \"c263b7a26cd69b5c42e688b3c266d9f20fca80577c9850dde8e2c467743d8e8e\": container with ID starting with c263b7a26cd69b5c42e688b3c266d9f20fca80577c9850dde8e2c467743d8e8e not found: 
ID does not exist" Jan 20 19:36:40 crc kubenswrapper[4773]: I0120 19:36:40.824402 4773 scope.go:117] "RemoveContainer" containerID="3f63a8ad512b3a71cea688446b12b8b263039e3e81e4869c8c560abee28eb2d8" Jan 20 19:36:40 crc kubenswrapper[4773]: E0120 19:36:40.824767 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f63a8ad512b3a71cea688446b12b8b263039e3e81e4869c8c560abee28eb2d8\": container with ID starting with 3f63a8ad512b3a71cea688446b12b8b263039e3e81e4869c8c560abee28eb2d8 not found: ID does not exist" containerID="3f63a8ad512b3a71cea688446b12b8b263039e3e81e4869c8c560abee28eb2d8" Jan 20 19:36:40 crc kubenswrapper[4773]: I0120 19:36:40.824800 4773 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f63a8ad512b3a71cea688446b12b8b263039e3e81e4869c8c560abee28eb2d8"} err="failed to get container status \"3f63a8ad512b3a71cea688446b12b8b263039e3e81e4869c8c560abee28eb2d8\": rpc error: code = NotFound desc = could not find container \"3f63a8ad512b3a71cea688446b12b8b263039e3e81e4869c8c560abee28eb2d8\": container with ID starting with 3f63a8ad512b3a71cea688446b12b8b263039e3e81e4869c8c560abee28eb2d8 not found: ID does not exist" Jan 20 19:36:40 crc kubenswrapper[4773]: I0120 19:36:40.824821 4773 scope.go:117] "RemoveContainer" containerID="a3c94f222036723845053dc89a7a4a5061b377fb9543dff77c4876be54325bf5" Jan 20 19:36:40 crc kubenswrapper[4773]: E0120 19:36:40.825339 4773 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3c94f222036723845053dc89a7a4a5061b377fb9543dff77c4876be54325bf5\": container with ID starting with a3c94f222036723845053dc89a7a4a5061b377fb9543dff77c4876be54325bf5 not found: ID does not exist" containerID="a3c94f222036723845053dc89a7a4a5061b377fb9543dff77c4876be54325bf5" Jan 20 19:36:40 crc kubenswrapper[4773]: I0120 19:36:40.825379 4773 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3c94f222036723845053dc89a7a4a5061b377fb9543dff77c4876be54325bf5"} err="failed to get container status \"a3c94f222036723845053dc89a7a4a5061b377fb9543dff77c4876be54325bf5\": rpc error: code = NotFound desc = could not find container \"a3c94f222036723845053dc89a7a4a5061b377fb9543dff77c4876be54325bf5\": container with ID starting with a3c94f222036723845053dc89a7a4a5061b377fb9543dff77c4876be54325bf5 not found: ID does not exist" Jan 20 19:36:41 crc kubenswrapper[4773]: I0120 19:36:41.458254 4773 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0866c773-75be-4641-b2d6-74b9944abe6d" path="/var/lib/kubelet/pods/0866c773-75be-4641-b2d6-74b9944abe6d/volumes" Jan 20 19:37:58 crc kubenswrapper[4773]: I0120 19:37:58.169989 4773 patch_prober.go:28] interesting pod/machine-config-daemon-sq4x7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 19:37:58 crc kubenswrapper[4773]: I0120 19:37:58.170547 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 19:38:28 crc kubenswrapper[4773]: I0120 19:38:28.170387 4773 patch_prober.go:28] interesting pod/machine-config-daemon-sq4x7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 19:38:28 crc kubenswrapper[4773]: I0120 19:38:28.171104 4773 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 19:38:58 crc kubenswrapper[4773]: I0120 19:38:58.170643 4773 patch_prober.go:28] interesting pod/machine-config-daemon-sq4x7 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 20 19:38:58 crc kubenswrapper[4773]: I0120 19:38:58.171156 4773 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 20 19:38:58 crc kubenswrapper[4773]: I0120 19:38:58.171198 4773 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" Jan 20 19:38:58 crc kubenswrapper[4773]: I0120 19:38:58.171847 4773 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7dcc60451ddecf419e7915aabde6a72a57b21629b8930630e02b7a298f4e2162"} pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 20 19:38:58 crc kubenswrapper[4773]: I0120 19:38:58.171891 4773 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" podUID="1ddd934f-f012-4083-b5e6-b99711071621" containerName="machine-config-daemon" 
containerID="cri-o://7dcc60451ddecf419e7915aabde6a72a57b21629b8930630e02b7a298f4e2162" gracePeriod=600 Jan 20 19:38:58 crc kubenswrapper[4773]: I0120 19:38:58.993685 4773 generic.go:334] "Generic (PLEG): container finished" podID="1ddd934f-f012-4083-b5e6-b99711071621" containerID="7dcc60451ddecf419e7915aabde6a72a57b21629b8930630e02b7a298f4e2162" exitCode=0 Jan 20 19:38:58 crc kubenswrapper[4773]: I0120 19:38:58.994004 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" event={"ID":"1ddd934f-f012-4083-b5e6-b99711071621","Type":"ContainerDied","Data":"7dcc60451ddecf419e7915aabde6a72a57b21629b8930630e02b7a298f4e2162"} Jan 20 19:38:58 crc kubenswrapper[4773]: I0120 19:38:58.994035 4773 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-sq4x7" event={"ID":"1ddd934f-f012-4083-b5e6-b99711071621","Type":"ContainerStarted","Data":"73d06ef6f46d1d40d9ec469befd531111e1aaac92931f7dec4b5155df844c18a"} Jan 20 19:38:58 crc kubenswrapper[4773]: I0120 19:38:58.994053 4773 scope.go:117] "RemoveContainer" containerID="66d18e7e7c4332943cab7b2f123da4a39ef6b1eb85671f155f90330f5704072d"